915 results for New methodology
Abstract:
Much research has focused on desertification and land degradation assessment without placing sufficient emphasis on prevention and mitigation, although the concept of sustainable land management (SLM) is increasingly being acknowledged. A variety of SLM measures have already been applied at the local level, but they are rarely adequately recognised, evaluated, shared or used for decision support. WOCAT (World Overview of Conservation Approaches and Technologies) has developed an internationally recognised, standardised methodology to document and evaluate SLM technologies and approaches, including their spatial distribution, allowing SLM knowledge to be shared worldwide. The recent integration of this methodology into a participatory process now allows this knowledge to be analysed and used for decision support at the local and national levels. The use of the WOCAT tools stimulates evaluation (self-evaluation as well as learning by comparing experiences) within SLM initiatives, where all too often there is not only insufficient monitoring but also a lack of critical analysis. The comprehensive questionnaires and database system facilitate the documentation, evaluation and dissemination of local experiences with SLM technologies and the approaches used to implement them. This evaluation process, carried out by a team of experts together with land users, greatly enhances understanding of the reasons behind successful (or failed) local practices. It has now been integrated into a new methodology for appraising and selecting SLM options. The methodology combines a local collective learning and decision approach with the use of the evaluated global best practices from WOCAT in a concise three-step process: i) identifying land degradation and locally applied solutions in a stakeholder learning workshop; ii) assessing local solutions with the standardised WOCAT tool; iii) jointly selecting promising strategies for implementation with the help of a decision support tool.
The methodology has been implemented in various countries and study sites around the world, mainly within the FAO LADA project (Land Degradation Assessment in Drylands) and the EU-funded DESIRE project. Investments in SLM must be carefully assessed and planned on the basis of properly documented experiences and evaluated impacts and benefits: concerted efforts are needed and sufficient resources must be mobilised to tap the wealth of knowledge and learn from SLM successes.
Abstract:
Proteomics describes, analogous to the term genomics, the study of the complete set of proteins present in a cell, organ, or organism at a given time. The genome tells us what could theoretically happen, whereas the proteome tells us what does happen. Therefore, a genome-centered view of biologic processes is incomplete and does not describe what happens at the protein level. Proteomics is a relatively new methodology and is changing rapidly because of extensive advances in the underlying techniques. The core technologies of proteomics are 2-dimensional gel electrophoresis, liquid chromatography, and mass spectrometry. Proteomic approaches might help to close the gap between traditional pathophysiologic and more recent genomic studies, assisting our basic understanding of cardiovascular disease. The application of proteomics in cardiovascular medicine holds great promise. The analysis of tissue and plasma/serum specimens has the potential to provide unique information on the patient. Proteomics might therefore influence daily clinical practice, providing tools for diagnosis, defining the disease state, assessing individual risk profiles, examining and/or screening healthy relatives of patients, monitoring the course of the disease, determining the outcome, and setting up individual therapeutic strategies. Currently available clinical applications of proteomics are limited and focus mainly on cardiovascular biomarkers of chronic heart failure and myocardial ischemia. Larger clinical studies are required to test whether proteomics has promising applications for clinical medicine. Cardiovascular surgeons should be aware of this increasingly pertinent and challenging field of science.
Abstract:
In recent years there has been a tremendous amount of research in the area of nanotechnology. History tells us that the commercialization of technologies is always accompanied by both positive and negative effects for society and the environment. Products containing nanomaterials are already available on the market, yet there is still little information about the potential negative effects these products may cause. The work presented in this dissertation describes a holistic approach to addressing different dimensions of nanotechnology sustainability. Life cycle analysis (LCA) was used to study the potential use of polyethylene filled with nanomaterials to manufacture automobile body panels. Results showed that the nanocomposite does not provide an environmental benefit over traditional steel panels. A new methodology based on design of experiments (DOE) techniques, coupled with LCA, was implemented to investigate the impact of inventory uncertainties. Results showed that data variability does not have a significant effect on the prediction of the environmental impacts, whereas the material profiles of input materials had a highly significant effect on the overall impact. Energy consumption and material characterization were identified as two main areas where additional research is needed to predict the overall impact of nanomaterials more effectively. A study was undertaken to gain insight into the behavior of small particles in contact with a surface exposed to air flow, to determine when particles lift off from the surface. A mapping strategy was implemented that identifies the conditions for particle lift-off based on particle size and separation distance from the wall. The main results showed that particles smaller than 0.1 mm will not become airborne under shear flow unless the separation distance is greater than 15 nm. These results may be used to minimize exposure to airborne materials.
Societal implications that may arise in the workplace were also researched. This research task explored topics including health, ethics, and worker perception, with the aim of identifying the base of knowledge available in the literature. Recommendations are given for different scenarios describing how workers and employers could minimize the unwanted effects of nanotechnology production.
Abstract:
Most desertification research focuses on degradation assessments without putting sufficient emphasis on prevention and mitigation strategies, although the concept of Sustainable Land Management (SLM) is increasingly being acknowledged. A variety of already applied conservation measures exist at the local level, but they are not adequately recognised, evaluated and shared, either by land users, technicians, researchers, or policy makers. Likewise, collaboration between research and implementation is often insufficient. The aim of this paper is to present a new methodology for a participatory process of appraising and selecting desertification mitigation strategies, and to present first experiences from its application in the EU-funded DESIRE project. The methodology combines a collective learning and decision approach with the use of evaluated global best practices. In three parts, it moves through a concise process, starting with identifying land degradation and locally applied solutions in a stakeholder workshop, leading to assessing local solutions with a standardised evaluation tool, and ending with jointly selecting promising strategies for implementation with the help of a decision support tool. The methodology is currently being applied in 16 study sites. Preliminary analysis from the application of the first part of the methodology shows that the initial stakeholder workshop results in a good basis for stakeholder cooperation, and in promising land conservation practices for further assessment. Study site research teams appreciated the valuable results, as burning issues and promising options emerged from joint reflection. The methodology is suitable to initiate mutual learning among different stakeholder groups and to integrate local and scientific knowledge.
Abstract:
Brain tumor is one of the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after diagnosis. Until recently, brain tumor prognosis was based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. In order to evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (U133 Affymetrix gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e. short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Because of the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death to the gene expression profile directly. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the Cox proportional hazards (CPH), Weibull, and accelerated failure time (AFT) survival models. We overcome the lack of conjugacy by using a latent variable formulation to model the covariate effects, which decreases computation time for model fitting.
Our proposed models also provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished data set of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from biological/clinical perspectives and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
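The marker-selection idea of controlling false discovery rates can be illustrated with the standard Benjamini-Hochberg step-up procedure. This is a minimal sketch of the generic FDR mechanism only; the paper's own Bayesian selection criterion is not reproduced here, and the p-values below are hypothetical.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: return the (sorted) indices
    of the hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank  # largest rank satisfying the step-up condition
    return sorted(order[:k])

# Hypothetical per-gene p-values from a survival-association test
pvals = [0.001, 0.008, 0.039, 0.2, 0.5, 0.8, 0.9, 0.95, 0.97, 0.99]
selected = benjamini_hochberg(pvals, alpha=0.05)  # genes passing the FDR cut
```

With these values only the two smallest p-values survive the step-up thresholds, so `selected` contains the first two gene indices.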
Abstract:
The 222Rn tracer method is a powerful tool for estimating local and regional surface emissions of, e.g., greenhouse gases. In this paper we demonstrate that, in practice, the method as it is commonly used produces inaccurate results in the case of non-homogeneously distributed emission sources, and we propose a different approach to account for this. We applied the new methodology to ambient observations of CO2 and 222Rn to estimate CO2 surface emissions for the city of Bern, Switzerland. Furthermore, by utilizing combined measurements of CO2 and δ(O2/N2) we obtain valuable information about the spatial and temporal variability of the main emission sources. Mean net CO2 emissions based on 2 years of observations are estimated at (11.2 ± 2.9) kt km−2 a−1. Oxidative ratios indicate a significant influence from the regional biosphere in summer/spring and from fossil fuel combustion processes in winter/autumn. Our data indicate that the emissions from fossil fuels are, to a large degree, related to the combustion of natural gas used for heating purposes.
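In its basic form, the radon tracer method scales a known 222Rn surface flux by the slope of the CO2-versus-222Rn regression during periods when both species accumulate in a stable boundary layer. A minimal sketch with hypothetical numbers (unit conversions and the paper's correction for non-homogeneous sources are not shown):

```python
def radon_tracer_flux(co2, radon, radon_flux):
    """Estimate the CO2 surface flux as the 222Rn surface flux scaled by
    the least-squares slope of the CO2-vs-radon night-time accumulation."""
    n = len(co2)
    mx = sum(radon) / n
    my = sum(co2) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(radon, co2))
             / sum((x - mx) ** 2 for x in radon))
    return slope * radon_flux

# Hypothetical overnight data: CO2 rises 2 ppm per Bq m-3 of radon
radon = [1.0, 2.0, 3.0, 4.0]          # Bq m-3
co2 = [400.0, 402.0, 404.0, 406.0]    # ppm
flux = radon_tracer_flux(co2, radon, radon_flux=50.0)  # assumed Rn flux
```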
Abstract:
Many rehabilitation robots use electric motors with gears. The backdrivability of geared drives is poor due to friction. While it is common practice to use velocity measurements to compensate for kinetic friction, breakaway friction usually cannot be compensated for without an additional force sensor that directly measures the interaction force between the human and the robot. Therefore, in robots without force sensors, subjects must overcome a large breakaway torque to initiate user-driven movements, which are important for motor learning. In this technical note, a new methodology to compensate for both kinetic and breakaway friction is presented. The basic strategy is to take advantage of the fact that, for rehabilitation exercises, the direction of the desired motion is often known. By applying the new method to three implementation examples, including drives with gear reduction ratios of 100-435, the peak breakaway torque was reduced by 60-80%.
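The core idea, switching from velocity-based kinetic compensation to breakaway compensation driven by the known direction of the desired motion near standstill, can be sketched as follows. All parameter values are hypothetical and the paper's actual control law is not reproduced:

```python
import math

def friction_feedforward(velocity, desired_direction,
                         coulomb_torque, breakaway_torque,
                         v_threshold=0.05):
    """Feedforward friction-compensation torque for a geared drive.
    While moving, compensate kinetic (Coulomb) friction from the measured
    velocity; near standstill, where the velocity sign is unreliable, use
    the KNOWN direction of the desired motion (+1/-1) to pre-cancel the
    breakaway torque so the user need not overcome it."""
    if abs(velocity) > v_threshold:
        return coulomb_torque * math.copysign(1.0, velocity)
    return breakaway_torque * desired_direction

tau_moving = friction_feedforward(0.5, +1, coulomb_torque=0.2,
                                  breakaway_torque=1.0)
tau_start = friction_feedforward(0.0, -1, coulomb_torque=0.2,
                                 breakaway_torque=1.0)
```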
Abstract:
We used real-time laser Doppler imaging (LDI) to study regional variations in microcirculatory perfusion in healthy volunteers, to establish a new methodology for global body perfusion mapping based on intra-individual perfusion index ratios. Our study included 74 healthy volunteers (37 female) aged between 22 and 30 years (mean 24.49). Imaging was performed with a recent microcirculation-imaging camera (EasyLDI) on different body regions of each volunteer. Perfusion values were reported in arbitrary perfusion units (APU). The relative perfusion indexes for each volunteer's body regions were then obtained by normalization with the perfusion value of the forehead. Basic parameters such as weight, height, and blood pressure were also measured and analyzed. The highest mean perfusion value was found in the forehead area (259.21 APU). Mean perfusion in the measured parts of the body correlated positively with the mean forehead value, while there was no significant correlation between forehead blood perfusion values and room temperature, BMI, systolic blood pressure or diastolic blood pressure (p=0.420, 0.623, 0.488, 0.099, respectively). Analysis of the data showed that perfusion indexes were not significantly different between male and female volunteers except for the ventral upper arm area (p=0.001). LDI is a non-invasive, fast technique that opens several avenues for clinical applications. The mean perfusion indexes are useful in clinical practice for monitoring patients before and after surgical interventions. Perfusion values can be predicted for different body parts simply by taking the forehead perfusion value and applying the perfusion index ratios to obtain expected normative values.
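The normalization step described above amounts to a simple ratio against the forehead reading. A minimal sketch with hypothetical APU values (the region names and function name are illustrative):

```python
def perfusion_indexes(region_apu, forehead_apu):
    """Normalize regional LDI perfusion values (in APU) by the forehead
    reading to obtain intra-individual perfusion index ratios."""
    return {region: apu / forehead_apu for region, apu in region_apu.items()}

# Hypothetical single-volunteer readings (forehead is the reference)
readings = {"ventral_forearm": 130.0, "palm": 195.0}
indexes = perfusion_indexes(readings, forehead_apu=260.0)
```

Multiplying a patient's measured forehead value by a normative index ratio then yields the expected perfusion for that body region.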
Abstract:
A new methodology is presented that combines active and passive remote sensing with simultaneous, collocated radiosounding data to study the effects of aerosol hygroscopic growth on particle optical and microphysical properties. The identification of hygroscopic growth situations combines the analysis of the multispectral aerosol particle backscatter coefficient and the particle linear depolarization ratio with thermodynamic profiling of the atmospheric column. We analyzed the hygroscopic growth effects on aerosol properties, namely the aerosol particle backscatter coefficient and volume concentration profiles, using data gathered at the Granada EARLINET station. Two case studies, corresponding to different aerosol loads and aerosol types, illustrate the potential of this methodology. Values of the aerosol particle backscatter coefficient enhancement factor range from 2.1 ± 0.8 to 3.9 ± 1.5 for relative humidity ranges of 60–90% and 40–83%, respectively, similar to those previously reported in the literature. Differences in the enhancement factor are directly linked to the composition of the atmospheric aerosol. The largest enhancement factor corresponds to the presence of sulphate and marine particles, which are more strongly affected by hygroscopic growth. In contrast, the lowest value corresponds to an aerosol mixture containing sulphates and slight traces of mineral dust. The Hänel parameterization is applied to these case studies, yielding results within the range of values reported in previous studies, with γ exponents of 0.56 ± 0.01 (for anthropogenic particles slightly influenced by mineral dust) and 1.07 ± 0.01 (for the situation dominated by anthropogenic particles), showing the suitability of this remote sensing approach for studying the hygroscopic effects of atmospheric aerosol under ambient, unperturbed conditions.
For the first time, the retrieval of the volume concentration profiles for these cases using the Lidar Radiometer Inversion Code (LIRIC) allows us to analyze the aerosol hygroscopic growth effects on aerosol volume concentration, observing a stronger increase of the fine mode volume concentration with increasing relative humidity.
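The Hänel parameterization mentioned above relates the enhancement factor to relative humidity. A minimal sketch, assuming the commonly used form f(RH) = ((1 − RH)/(1 − RH_ref))^(−γ) with RH expressed as a fraction (the paper's exact reference humidity is not given here):

```python
def hanel_enhancement(rh, rh_ref, gamma):
    """Hänel parameterization of the hygroscopic enhancement factor:
    f(RH) = ((1 - RH) / (1 - RH_ref)) ** (-gamma), RH as a fraction."""
    return ((1.0 - rh) / (1.0 - rh_ref)) ** (-gamma)

# gamma = 1.07 as reported for the anthropogenic-dominated case;
# rh_ref = 0.60 is an assumed reference humidity for illustration
f = hanel_enhancement(rh=0.90, rh_ref=0.60, gamma=1.07)
```

Larger γ values produce a steeper growth of the backscatter coefficient as the air approaches saturation.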
Abstract:
BACKGROUND AND PURPOSE: (99m)Tc combined with blue-dye mapping is considered the best sentinel lymph node (SLN) mapping technique in cervical cancer. Indocyanine green (ICG) with near-infrared fluorescence imaging has been introduced as a new methodology for SLN mapping. The aim of this study was to compare these two techniques in the laparoscopic treatment of cervical cancer. METHODS: Medical records of patients undergoing laparoscopic SLN mapping for cervical cancer with either (99m)Tc and patent blue dye (Group 1) or ICG (Group 2) from April 2008 until August 2012 were reviewed. Sensitivity, specificity, and overall and bilateral detection rates were calculated and compared. RESULTS: Fifty-eight patients were included in the study: 36 patients in Group 1 and 22 patients in Group 2. Median tumor diameter was 25 and 29 mm, and mean SLN count was 2.1 and 3.7, for Groups 1 and 2, respectively. Mean non-SLN (NSLN) count was 39 for both groups. SLNs were ninefold more likely to be affected by metastatic disease than NSLNs (p < 0.005). Sensitivity and specificity were both 100%. Overall detection rates were 83% and 95.5% (p = nonsignificant), and bilateral detection rates were 61% and 95.5% (p < 0.005), for Groups 1 and 2, respectively. In 75% of cases, SLNs were located along the external or internal iliac nodal basins. CONCLUSIONS: ICG SLN mapping in cervical cancer provides high overall and bilateral detection rates that compare favorably with the current standard of care.
Abstract:
One of the main problems in turfgrass research on the adaptability of different species and their mixtures, or on the introduction of new ones, is the difficulty of estimating plant density closely enough to actual results to allow statistically valid comparisons of the quality or uses of the resulting sward. A new methodology is proposed to replace the traditional one of calculating seed requirements by weight, with densities determined by trial and error according to the final appearance and with the quantity of the mixture corrected only for cultural value. The proposal uses mathematical calculations to determine the exact quantity of seed needed to obtain a controlled final sowing density, taking into account the specific weight of the seed and its cultural value, and adjusting the resulting figure by a correction factor (CN) that relates the theoretical value to the field response. That is, the quantity of seed to be used should be calculated as a function of the desired plant density. In the present work the CN factor is determined for different species. The starting hypothesis is that species selection and mixture preparation can be improved by calculating seed requirements per cm2, and that some of the species used have a low seed-to-plant efficiency, so their proportion in local commercial mixtures should be adjusted according to this crop factor. The proposed methodology comprises: a. analysis of germination capacity, purity and specific weight of commercial seed; b. with four replicates in randomized blocks, individual sowings of the species under ideal substrate, light and moisture conditions, and sowings under the same field conditions as a traditional crop, both as pure stands and in mixtures; c. counting the number of seedlings obtained in each case using the method proposed by Lush and Franz; d. estimating from the results the average coefficients (CN) and their respective confidence intervals. This new way of calculating the exact quantities of each component of a turfgrass mixture opens an important field of research into the species best suited to each environment, since the traditional method leads to very ambiguous results that are difficult to compare.
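The seed-quantity calculation can be sketched as follows. The abstract does not give the exact formula, so this assumes the CN factor multiplies the cultural value (germination × purity) to give the effective seed-to-plant fraction; all numeric values are hypothetical.

```python
def seed_rate_g_per_m2(target_plants_per_cm2, seeds_per_gram,
                       cultural_value, cn_factor):
    """Grams of seed per m2 needed for a target plant density, assuming
    cultural_value (germination x purity, 0-1) times the empirical CN
    field-emergence factor gives the fraction of seeds yielding plants."""
    target_per_m2 = target_plants_per_cm2 * 10_000  # cm2 -> m2
    effective_fraction = cultural_value * cn_factor
    return target_per_m2 / (seeds_per_gram * effective_fraction)

# Hypothetical species: 2 plants/cm2 target, 500 seeds/g, CV 0.8, CN 0.5
rate = seed_rate_g_per_m2(2.0, seeds_per_gram=500,
                          cultural_value=0.8, cn_factor=0.5)
```

A species with a low CN (poor seed-to-plant efficiency) thus requires proportionally more seed in the mixture to reach the same final density.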
Abstract:
This work was carried out in the context of the curriculum that the Facultad de Ciencias Agrarias (UNCuyo) implemented from 1994 onwards for the Agronomic Engineering and Bromatology degree programmes. Several educational indicators (approval rate, dropout rate, course-repeat rate and permanence rate) were calculated in order to monitor the application of that curriculum through a new methodology. Four methods were used for the data analysis: "determination of distances between matrices and their respective ideal values", which yielded an index of remoteness for the system as a whole; and "Principal Component Analysis (PCA)" followed by "Cluster Analysis (CA)", which made it possible to group the subjects according to their similarities. Finally, a "Discriminant Analysis" was performed, which led to the conclusion that the calculated rates do not distinguish between the subjects of the Basic and Instrumental cycles (only in the case of Agronomic Engineering).
Abstract:
The purpose of this study was the estimation of current and potential water erosion rates in Castellon Province (Spain) using the RUSLE3D (Revised Universal Soil Loss Equation-3D) model with Geographical Information System (GIS) support. RUSLE3D uses a new methodology for estimating the topographic factor (LS factor) based on the impact of flow convergence, allowing better assessment of the distribution of sediment detached by water erosion. In the RUSLE3D equation, the effect of vegetation cover on the soil erosion rate is reflected by the C factor; potential erosion is the soil erosion rate computed without the C factor. The results showed that 57% of the estimated current erosion does not exceed 10 t/ha per year (low erosion). For potential erosion, only 5% of the area of Castellon Province does not exceed 10 t/ha per year, while 55% exceeds 200 t/ha per year. Based on these results, the current vegetation cover of Castellon Province is adequate, but it needs to be conserved to prevent current soil erosion rates from rising toward the potential rates.
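Potential versus current erosion in this framework amounts to evaluating the RUSLE product with and without the C factor. A minimal sketch with hypothetical factor values (RUSLE3D's flow-convergence LS computation itself is not shown):

```python
def rusle_soil_loss(r, k, ls, c=1.0, p=1.0):
    """RUSLE soil loss A = R * K * LS * C * P (t/ha per year).
    Leaving c at 1.0 (no vegetation cover) yields the potential erosion
    rate; supplying the actual C factor yields the current rate."""
    return r * k * ls * c * p

# Hypothetical grid cell: R=800, K=0.03, LS=5.0, C=0.05 under vegetation
current = rusle_soil_loss(r=800, k=0.03, ls=5.0, c=0.05)
potential = rusle_soil_loss(r=800, k=0.03, ls=5.0)  # C factor dropped
```

The large gap between the two values mirrors the study's finding: adequate cover keeps current rates low, but the same terrain would erode heavily if the cover were lost.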
Abstract:
In 2000, the governments of 189 countries committed themselves to achieving eighteen targets to combat inequality and improve human development worldwide through the fulfilment of the United Nations (UN) Millennium Development Goals (MDGs), with a horizon of 2015. The MDGs are to be monitored by the statistical institutes that exist in most countries and that compute economic indicators. In Argentina, the Instituto Nacional de Estadística y Censos has long experience in the construction and calculation of indexes, specifically the Consumer Price Index, which had been regarded as the best price index in all of Latin America. Since 2007, however, a new methodology has been implemented for its calculation. This communication presents results of studies carried out from the different points of view offered by exploratory and confirmatory methods, aimed at understanding the situation of poverty and indigence and the income distribution of the inhabitants of the Aglomerado Gran Rosario. The exploratory and confirmatory analyses applied yield unquestionable results that reflect the coherence of the conclusions reached. The high-quality primary data used made it possible to carry out dynamic analyses with the homogeneous information available. Finally, in the light of these circumstances, will it be possible to know reliably whether the Millennium Development Goals will be achieved by 2015?