933 results for Non Parametric Methodology


Relevance: 80.00%

Abstract:

Surface-initiated cracking of asphalt pavements is one of the most frequent and important distress modes in flexible bituminous pavements, as theoretical and experimental studies over the past decade have demonstrated. However, this failure mechanism has not been considered in traditional flexible pavement design methods.
The concept of long-lasting pavements is based on adequate monitoring of the depth and extent of these deteriorations and on intervention at the most appropriate moment, so as to contain them in the surface layer as easily accessible and repairable partial-depth top-down cracks, thereby prolonging the durability and serviceability of the pavement and reducing the overall cost of its life cycle. Therefore, to select the optimal maintenance strategy for perpetual pavements, it is essential to have methodologies that enable precise on-site identification, monitoring and control of top-down cracks and that also permit a reliable, high-performance determination of their extent and depth. This PhD thesis presents the results of systematic laboratory and in situ research carried out to obtain data on top-down cracking in asphalt pavements and to study procedures for evaluating the depth of this type of crack using ultrasonic techniques. The results demonstrate that the proposed non-destructive methodology (cost-effective, fast and easy to implement, and until now mainly used for concrete and metal structures because of the difficulties introduced by the viscoelastic nature of bituminous materials) can be applied with sufficient reliability and repeatability to asphalt pavements. The measurements are also independent of the total asphalt thickness. Furthermore, the method resolves some common drawbacks of other pavement crack diagnosis methods, such as core extraction (a destructive, expensive procedure requiring prolonged traffic interruptions) and other non-destructive techniques, such as those based on deflection measurements or ground-penetrating radar, which are not precise enough to investigate surface cracks.
To obtain these results, extensive test campaigns were performed on laboratory specimens in which different empirical conditions were studied: various types of hot bituminous mixtures (AC, SMA and PA), different asphalt thicknesses and interlayer adhesions, temperatures, surface textures, filling materials and water inside the cracks, sensor positions, and a wide range of possible crack depths. The methods employed are based on several measurements of ultrasonic pulse velocity or transmission time over a single accessible face or surface of the material, so that a signal transmission coefficient can be obtained (relative or self-compensated readings). Measurements were taken at low excitation frequencies with two ultrasonic devices: one equipped with dry point contact (DPC) transducers and the other with flat contact (CPC) transducers that require a specially selected coupling material. In this way some of the traditional drawbacks of conventional transducers were overcome and no prior surface preparation was required. The self-compensating technique eliminates systematic errors and the need for prior local calibration, demonstrating the potential of this technology.
The experimental results have been compared with simplified theoretical models that simulate ultrasonic wave propagation in cracked bituminous materials, previously deduced through an analytical approach, which permitted the correct interpretation of the empirical data. These models were subsequently calibrated with the laboratory results, and their generalized mathematical expressions and graphs are provided for routine use in practical applications. Through on-site ultrasonic test campaigns, accompanied by asphalt core extraction, the proposed models were evaluated: over all tests performed, the maximum average relative error in estimated crack depth did not exceed 13% at a 95% confidence level. The in situ verification of the models made it possible to establish criteria and recommendations for their use on in-service pavements. The experience gained allows this methodology to be integrated among the evaluation techniques used in pavement management systems.
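The single-surface, self-compensated transit-time idea admits a simple geometric illustration. The sketch below uses a classical straight-ray construction (transducers placed symmetrically about the crack at spacings x and 2x, the pulse diffracting around the crack tip); it is a textbook assumption-laden model, not the thesis's calibrated expressions:

```python
import math

def crack_depth(x, t1, t2):
    """Depth of a surface-breaking crack from two transit times measured
    across it: t1 with transducers at distance x on each side of the
    crack, t2 at distance 2x. Straight-ray diffraction model:
        t1 = 2*sqrt(x**2 + h**2) / v,   t2 = 2*sqrt(4*x**2 + h**2) / v
    Eliminating the (unknown) pulse velocity v gives
        h = x * sqrt((4*t1**2 - t2**2) / (t2**2 - t1**2))."""
    return x * math.sqrt((4 * t1**2 - t2**2) / (t2**2 - t1**2))

# round-trip check: synthesize times for a known depth and recover it
x, h, v = 0.10, 0.05, 2000.0                 # metres, metres, m/s
t1 = 2 * math.sqrt(x**2 + h**2) / v
t2 = 2 * math.sqrt(4 * x**2 + h**2) / v
```

Because the velocity v cancels out, no prior local calibration of the material is needed, which is the essence of the relative (self-compensated) readings described above.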

Relevance: 80.00%

Abstract:

Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering (SE) community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines indicating which method is best in each case. Aim: To compile a set of rules that SE researchers can use to ascertain which aggregation method is best suited to the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects they include, their variance and effect size; the reliability and statistical power were calculated empirically in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and the number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable at other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods; if there is, software engineers should select the method that optimizes both parameters.
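Two of the aggregation methods named above can be sketched in a few lines. This is a minimal fixed-effect illustration with the standard inverse-variance weighting (the study tuples and numbers are hypothetical, not taken from the annexed tables):

```python
import math

def pooled_wmd(studies):
    """Fixed-effect weighted mean difference. Each study is a tuple
    (mean_treat, mean_ctrl, var_treat, var_ctrl, n_treat, n_ctrl)."""
    num = den = 0.0
    for mt, mc, vt, vc, nt, nc in studies:
        var_d = vt / nt + vc / nc     # variance of the mean difference
        w = 1.0 / var_d               # inverse-variance weight
        num += w * (mt - mc)
        den += w
    return num / den

def pooled_response_ratio(studies):
    """Parametric response ratio: pooled log-ratio of means, weighted by
    the approximate variance of ln(mean_treat / mean_ctrl)."""
    num = den = 0.0
    for mt, mc, vt, vc, nt, nc in studies:
        lr = math.log(mt / mc)
        var_lr = vt / (nt * mt**2) + vc / (nc * mc**2)
        w = 1.0 / var_lr
        num += w * lr
        den += w
    return math.exp(num / den)        # back to a ratio of means

# hypothetical two-study example
studies = [
    (12.0, 10.0, 4.0, 4.0, 20, 20),
    (13.0, 10.0, 9.0, 9.0, 30, 30),
]
wmd = pooled_wmd(studies)
rr = pooled_response_ratio(studies)
```

The vote-counting and non-parametric response ratio variants replace these per-study effects with sign counts and median-based ratios, respectively, which is what makes them usable when variances are not reported.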

Relevance: 80.00%

Abstract:

Here, a novel and efficient strategy for moving object detection by non-parametric modeling on smart cameras is presented. Whereas the background is modeled using only color information, the foreground model combines color and spatial information. Applying a particle filter allows the spatial information to be updated and provides a priori information about the areas to analyze in subsequent images, enabling an important reduction in computational requirements and improving the segmentation results.
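The color-only non-parametric background model can be illustrated with a simple sample-based classifier (a stand-in for kernel density estimation; the frame, thresholds and sample count here are hypothetical, not the paper's configuration):

```python
import numpy as np

def segment_frame(frame, bg_samples, thresh=20.0, min_matches=2):
    """Classify a pixel as background if its colour lies close to at
    least `min_matches` of the stored background samples.
    frame:      (H, W, 3) float array
    bg_samples: (N, H, W, 3) float array of past background colours
    Returns a boolean (H, W) foreground mask."""
    dist = np.linalg.norm(bg_samples - frame[None], axis=-1)  # (N, H, W)
    matches = (dist < thresh).sum(axis=0)
    return matches < min_matches      # True where foreground

# toy usage: static grey background, one bright square as the moving object
rng = np.random.default_rng(0)
bg = np.full((48, 64, 3), 120.0)
samples = bg[None] + rng.normal(0, 3, size=(10, 48, 64, 3))
frame = bg + rng.normal(0, 3, size=bg.shape)
frame[10:20, 10:20] = 250.0           # foreground object
mask = segment_frame(frame, samples)
```

The particle filter described above would then concentrate this per-pixel test on the predicted object regions instead of the whole frame, which is where the computational saving comes from.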

Relevance: 80.00%

Abstract:

Systems biology techniques are a topic of recent interest within the neurological field. Computational intelligence (CI) addresses this holistic perspective by means of consensus or ensemble techniques ultimately capable of uncovering new and relevant findings. In this paper, we propose the application of a CI approach based on ensemble Bayesian network classifiers and multivariate feature subset selection to induce probabilistic dependencies that could match or unveil biological relationships. The research focuses on the analysis of high-throughput Alzheimer's disease (AD) transcript profiling. The analysis is conducted from two perspectives. First, we compare the expression profiles of samples from the entorhinal cortex (EC), a hippocampus subregion, of AD patients and controls. Second, we use the ensemble approach to study four types of samples: EC and dentate gyrus (DG) samples from both patients and controls. The results disclose transcript interaction networks with remarkable structures and genes not directly related to AD by previous studies. The ensemble is able to identify a variety of transcripts that play key roles in other neurological pathologies. Classical statistical assessment by means of non-parametric tests confirms the relevance of the majority of the transcripts. The ensemble approach pinpoints key metabolic mechanisms that could lead to new findings in the pathogenesis and development of AD.
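The classical non-parametric assessment mentioned in the last step can be sketched with a Mann-Whitney U test on one transcript's expression values (the numbers below are synthetic, for illustration only; the paper's data and choice of test may differ):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# hypothetical expression values for one transcript: the patient group
# is shifted upward by one standard deviation relative to controls
patients = rng.normal(6.0, 1.0, size=60)
controls = rng.normal(5.0, 1.0, size=60)

stat, p = mannwhitneyu(patients, controls, alternative="two-sided")
```

A low p-value here would confirm, without any normality assumption, that the transcript the ensemble flagged really does separate the two groups.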

Relevance: 80.00%

Abstract:

Vitamin B12 deficiency is very common in elderly people, with a prevalence that can reach 40.5% of the population. It can result from the interaction of several factors. Vitamin B12 deficiencies have been associated with neurological and cognitive deterioration, haematological abnormalities and cardiovascular diseases, which have an important influence on the health of the elderly and their quality of life. It is necessary to address the problems arising from the lack of data on these issues. The main objective of this thesis was to analyse the evolution over one year of vitamin B12 status and related parameters, lipid and haematological profiles, and their relationship to health risk factors and to functional and cognitive status, and to determine the effect of an oral supplement of 500 μg of cyanocobalamin given for a short period of 28 days. An additional objective was to analyse the possible effects of medication intake on vitamin B status. Three studies were performed: a) a one-year longitudinal follow-up with four measurement points; b) an intervention study providing an oral liquid supplement of 500 μg of cyanocobalamin for a 28-day period; and c) an analysis of the possible effect of medication intake on vitamin B status using the ATC classification of medicines. The participants for these studies were recruited from nursing homes for the elderly in the Region of Madrid. Sixty elders (mean age 84 ± 7 y; 19 men and 41 women) were recruited for Study I and 64 elders (mean age 82 ± 7 y; 24 men and 40 women) for Study II. For Study III, baseline data from the initially recruited participants of the first two studies were used. Informed consent was obtained from all participants or their legal guardians. The studies were approved by the Ethical Committee of the University of Granada.
Blood samples were obtained at each examination date and analysed for serum cobalamin, holotranscobalamin (holoTC), serum and red blood cell (RBC) folate and total homocysteine according to standard laboratory procedures. The haematological parameters analysed were haematocrit, haemoglobin and mean corpuscular volume (MCV). For the lipid profile, triglycerides (TG), total cholesterol, and LDL- and HDL-cholesterol were analysed. Anthropometric measures (BMI, triceps and subscapular skinfolds, waist girth and waist-to-hip ratio), functional tests (hand grip, arm and leg strength tests, static balance) and the MMSE were obtained or administered by trained personnel. The vitamin B12 supplement of Study II was administered with breakfast, and medication intake was taken from the residents' anamnesis. Data were analysed by parametric or non-parametric statistics depending on the data obtained. Comparisons were made using the appropriate ANOVAs or non-parametric tests. Pearson's partial correlations with the variable "time" as control were used to define the associations among the analysed parameters. The results showed that: A) Over one year, with regard to vitamin B status, serum cobalamin decreased, serum folate and mean corpuscular volume increased significantly, and total homocysteine concentrations remained stable. Regarding the blood lipid profile, triglycerides increased and HDL-cholesterol decreased significantly. Among the selected anthropometric measurements, waist circumference increased significantly. No significant changes were observed for the remaining parameters. B) The prevalence of hyperhomocysteinemia was high in the elderly studied, ranging from 60% to 90% over the year depending on the cut-off used for classification. LDL-cholesterol values were high, especially among women, and showed a tendency to increase over the year. Results of the balance test showed a deficiency and a tendency to decrease, indicating that the population studied is at high risk of falls.
Lower-extremity muscular function was deficient and showed a tendency to decrease. A highly significant relationship was observed between the triceps skinfold and the blood lipid profile. C) Low cobalamin concentrations correlated significantly with low MMSE scores in the elderly studied. No correlations were observed between vitamin B12 status and functional parameters. D) Regarding vitamin B12 status, holo-transcobalamin seems to be more sensitive for diagnosis: 5-10% of the elderly had a deficiency using serum cobalamin as the criterion, whereas 45-52% had a deficiency using serum holo-transcobalamin as the criterion. E) 500 μg of cyanocobalamin administered orally for 28 days significantly improved vitamin B12 status and significantly decreased total homocysteine concentrations in institutionalized elderly. No effect of the intervention was observed on functional or cognitive parameters. F) The relative improvement (%) in vitamin B12 status was greater when using serum holo-transcobalamin as the criterion than when using serum cobalamin. G) Antianemic drug intake normalized cobalamin levels, urologic drugs and corticosteroids normalized serum folate, and psychoanaleptics normalized holo-transcobalamin levels. Drugs treating pulmonary obstruction increased total homocysteine concentrations significantly. H) The mean daily drug intake was 5.1. Fifty-nine percent of the elderly took medication belonging to 5 or more different ATC groups. The most prevalent were psycholeptic (53%), antacid (53%) and antithrombotic (47%) drugs.
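The partial-correlation step described in the methods (Pearson's correlation with "time" as the control variable) can be sketched with the standard residual method; all numbers below are hypothetical, not the thesis data:

```python
import numpy as np

def partial_corr(x, y, z):
    """Pearson correlation between x and y after regressing the control
    variable z out of both (the residual method)."""
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# synthetic illustration: B12 drifts down over the four measurement
# points, while a cognition score tracks B12 status
rng = np.random.default_rng(1)
time = np.repeat(np.arange(4.0), 15)             # four measure points
b12 = 400.0 - 10.0 * time + rng.normal(0, 20, 60)
score = 10.0 + 0.05 * b12 + rng.normal(0, 1, 60)
r = partial_corr(b12, score, time)
```

Controlling for time in this way separates the B12-cognition association from the shared downward drift both variables show over the follow-up year.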

Relevance: 80.00%

Abstract:

This paper introduces a semantic language developed to be used in a semantic analyzer based on linguistic and world knowledge. Linguistic knowledge is provided by a Combinatorial Dictionary and several sets of rules. Extra-linguistic information is stored in an Ontology. The meaning of the text is represented by means of a series of RDF-type triples of the form predicate(subject, object). The semantic analyzer is one of the options of the multifunctional ETAP-3 linguistic processor and can be used for Information Extraction and Question Answering. We describe the semantic representation of expressions that provide an assessment of the number of objects involved and/or give a quantitative evaluation of different types of attributes. We focus on the following aspects: 1) parametric and non-parametric attributes; 2) gradable and non-gradable attributes; 3) ontological representation of different classes of attributes; 4) absolute and relative quantitative assessment; 5) punctual and interval quantitative assessment; and 6) intervals with precise and fuzzy boundaries.
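The predicate(subject, object) representation can be illustrated with a minimal triple store and pattern query (the vocabulary below is hypothetical, not ETAP-3's own):

```python
# meaning represented as RDF-type triples of the form
# (predicate, subject, object), queryable for information extraction
triples = [
    ("height", "tower", "300 m"),         # parametric attribute with a value
    ("tall", "tower", "degree:high"),     # gradable attribute
    ("cardinality", "visitors", ">=7e6"), # quantitative assessment of number
]

def query(pred=None, subj=None, obj=None):
    """Return all triples matching the given (possibly partial) pattern;
    None acts as a wildcard."""
    return [t for t in triples
            if (pred is None or t[0] == pred)
            and (subj is None or t[1] == subj)
            and (obj is None or t[2] == obj)]
```

Question answering then reduces to pattern matching: "how tall is the tower?" becomes `query(pred="height", subj="tower")`.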

Relevance: 80.00%

Abstract:

Non-parametric belief propagation (NBP) is a well-known message passing method for cooperative localization in wireless networks. However, due to the over-counting problem in networks with loops, NBP's convergence is not guaranteed and its estimates are typically less accurate. One solution to this problem is non-parametric generalized belief propagation based on a junction tree. However, this method is intractable in large-scale networks due to the high complexity of junction tree formation and the high dimensionality of the particles. Therefore, in this article, we propose non-parametric generalized belief propagation based on a pseudo-junction tree (NGBP-PJT). The main difference from the standard method is the formation of the pseudo-junction tree, which represents an approximated junction tree based on a thin graph. In addition, in order to decrease the number of high-dimensional particles, we use a more informative importance density function and reduce the dimensionality of the messages. As a by-product, we also propose NBP based on a thin graph (NBP-TG), a cheaper variant of NBP, which runs on the same graph as NGBP-PJT. According to our simulation and experimental results, the NGBP-PJT method outperforms NBP and NBP-TG in terms of accuracy, computational cost and communication cost in reasonably sized networks.
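The particle-based messages that NBP passes can be illustrated with a single one-dimensional step: an anchor builds a sample-based message for an unknown node from a noisy range measurement, and the node's prior weights those samples. This toy sketch only shows what a non-parametric message is; it is not the NGBP-PJT algorithm itself, and all the numbers are made up:

```python
import numpy as np

rng = np.random.default_rng(7)

anchor = 0.0
true_node = 4.0
sigma = 0.3                                  # ranging noise std. dev.
measured_range = abs(true_node - anchor) + rng.normal(0, sigma)

# message = particles for the node's position implied by the measurement;
# a range is unsigned, so the node could lie on either side of the anchor
signs = rng.choice([-1.0, 1.0], size=2000)
particles = anchor + signs * (measured_range + rng.normal(0, sigma, 2000))

# prior belief (e.g. from deployment knowledge): the node lies in [2, 6];
# weighting the particles by the prior fuses message and belief
weights = ((particles >= 2.0) & (particles <= 6.0)).astype(float)
weights /= weights.sum()
estimate = float(np.sum(weights * particles))
```

In a full network, each node would exchange such particle sets with all neighbours every iteration, which is why reducing the particle count and message dimensionality, as proposed above, directly cuts communication cost.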

Relevance: 80.00%

Abstract:

Cognitive wireless sensor networks (CWSNs) are a new paradigm that integrates cognitive features into traditional wireless sensor networks (WSNs) to mitigate important problems such as spectrum occupancy. Security in cognitive wireless sensor networks is an important problem because these kinds of networks manage critical applications and data. The specific constraints of WSNs make the problem even more critical, and effective solutions have not yet been implemented. The primary user emulation (PUE) attack is the most studied specific attack deriving from the new cognitive features. This work discusses a new approach, based on anomalous behavior detection and collaboration, to detect the primary user emulation attack in CWSN scenarios. Two non-parametric algorithms, suitable for low-resource networks like CWSNs, have been used in this work: the cumulative sum and data clustering algorithms. Their comparison is based on characteristics such as detection delay, learning time, scalability, resources and scenario dependency. The algorithms have been tested using a cognitive simulator that provides important results in this area. Both algorithms have proven valid for detecting PUE attacks, reaching a detection rate of 99% with less than 1% false positives when using collaboration.
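The cumulative-sum idea can be sketched in a few lines. This is the textbook one-sided CUSUM detector applied to a synthetic trace (the signal model, thresholds and attack slot are hypothetical, not the paper's simulator output):

```python
import numpy as np

def cusum(samples, target_mean, k=0.5, h=8.0):
    """One-sided cumulative-sum change detector: returns the index of the
    first sample at which the CUSUM statistic exceeds threshold h (the
    alarm), or -1 if no alarm is raised. k is the reference (slack)
    value, h the decision threshold."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean - k))
        if s > h:
            return i
    return -1

rng = np.random.default_rng(3)
# per-slot sensed signal strength: normal behaviour for 100 slots, then
# a hypothetical PUE attacker raises the mean from 0 to 2
trace = np.concatenate([rng.normal(0.0, 1.0, 100),
                        rng.normal(2.0, 1.0, 100)])
alarm = cusum(trace, target_mean=0.0)
```

The trade-off the paper measures is visible here: raising h lowers the false-positive rate but lengthens the detection delay after slot 100.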

Relevance: 80.00%

Abstract:

A spatial-color-based non-parametric background-foreground modeling strategy implemented on a GPGPU using CUDA is proposed. This strategy is suitable for augmented-reality applications, providing real-time, high-quality results in a great variety of scenarios.

Relevance: 80.00%

Abstract:

The latest generation of consumer electronic devices is endowed with Augmented Reality (AR) tools. These tools require moving object detection strategies that are fast and efficient in order to carry out higher-level object analysis tasks. We propose a lightweight spatio-temporal non-parametric background-foreground modeling strategy on a General-Purpose Graphics Processing Unit (GPGPU), which provides real-time, high-quality results in a great variety of scenarios and is suitable for AR applications.

Relevance: 80.00%

Abstract:

Mixtures of polynomials (MoPs) are a non-parametric density estimation technique especially designed for hybrid Bayesian networks with continuous and discrete variables. Algorithms to learn one- and multi-dimensional (marginal) MoPs from data have recently been proposed. In this paper we introduce two methods for learning MoP approximations of conditional densities from data. Both approaches are based on learning MoP approximations of the joint density and the marginal density of the conditioning variables, but they differ as to how the MoP approximation of the quotient of the two densities is found. We illustrate and study the methods using data sampled from known parametric distributions, and we demonstrate their applicability by learning models based on real neuroscience data. Finally, we compare the performance of the proposed methods with an approach for learning mixtures of truncated basis functions (MoTBFs). The empirical results show that the proposed methods generally yield models that are comparable to or significantly better than those found using the MoTBF-based method.
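The quotient construction at the heart of both methods, approximating a conditional density as the joint density divided by the marginal of the conditioning variable, can be illustrated with Gaussian kernel density estimates standing in for MoPs (a sketch of the idea only, not the paper's learning algorithms; the distribution is synthetic):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, 4000)
y = 2.0 * x + rng.normal(0.0, 1.0, 4000)   # true: Y | X=x ~ N(2x, 1)

joint = gaussian_kde(np.vstack([x, y]))    # estimate of f(x, y)
marginal = gaussian_kde(x)                 # estimate of f(x)

def conditional(y_vals, x0):
    """f(y | x0) approximated as f(x0, y) / f(x0) on a grid of y values."""
    pts = np.vstack([np.full_like(y_vals, x0), y_vals])
    return joint(pts) / marginal(x0)

grid = np.linspace(-6.0, 10.0, 401)
dens = conditional(grid, 1.0)              # should peak near y = 2*1 = 2
```

The paper's two methods differ precisely in how this quotient of two learned densities is turned back into a single MoP, rather than evaluated pointwise as here.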

Relevance: 80.00%

Abstract:

This article describes the development of a new non-invasive methodology for the digital documentation of petroglyphs and rock paintings belonging to the Paleolithic, based on digital image processing techniques and tools that optimize materials and time in obtaining precise, representative graphical information.

Relevance: 80.00%

Abstract:

This study estimates the efficiency and productivity of the main wheat-producing governorates in Egypt. The data are panel data at the governorate level for the period 1990-2012, obtained from the Ministry of Agriculture and Land Reclamation and the Central Agency for Public Mobilization and Statistics, Egypt. The stochastic frontier approach is applied to measure efficiency (a Cobb-Douglas production function), using the Battese and Coelli (1992) and (1995) specifications. The Malmquist total factor productivity (TFP) index is also used as a non-parametric approach (Data Envelopment Analysis) to decompose the total factor productivity of the main wheat-producing governorates into technical change and efficiency change.
The coefficient of land is positive and significant in both the Battese and Coelli (1992) and (1995) specifications, implying that increasing the wheat area could significantly increase wheat production. The coefficient of labor is positive and significant in the Battese and Coelli (1992) specification, and positive but insignificant in the (1995) specification. The coefficient of machinery is negative and insignificant in both specifications. The technical-change coefficient is positive and insignificant in the (1992) specification, and positive and significant in the (1995) specification. The variables of the inefficiency-effects model (Battese and Coelli, 1995) indicate that the location of the different governorates has no impact on wheat production in Egypt, that the technical inefficiency of wheat production tended to decrease over the study period, and that gender has no impact on wheat production in Egypt. Technical efficiency levels vary among governorates in both specifications: the minimum mean level of technical efficiency is 91.61%, in Fayoum governorate, while the maximum mean level is 98.69%, in Dakahlia governorate. Technical efficiency averages 95.37%, implying that little potential exists to improve resource-use efficiency in wheat production. The total factor productivity change (TFPCH) of wheat production in Egypt over 1990-2012 is less than one and shows a decline, due more to the technical-change component than to the efficiency-change component. The decline in TFPCH generally lessens over time: Menoufia governorate shows the smallest decline in TFPCH, 6.5%, while two governorates, Sharkia and Dakahlia, show the largest declines, 13.1% each. The smallest decline in TFPCH occurred in the period 2009-2010, 0.3%, while the largest occurred in 1990-1991, 38.9%. The decline in the TFP of wheat production in Egypt is attributed mainly to poor application of technology.
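The deterministic core of the frontier model, a Cobb-Douglas production function estimated on logs, can be sketched with ordinary least squares on synthetic data (illustrative only; the stochastic frontier estimators of Battese and Coelli additionally include a one-sided inefficiency term, which plain OLS omits):

```python
import numpy as np

rng = np.random.default_rng(11)

# synthetic Cobb-Douglas data: y = A * land^0.6 * labor^0.3, with noise
n = 200
land = rng.uniform(50.0, 500.0, n)
labor = rng.uniform(10.0, 100.0, n)
true_beta = np.array([1.0, 0.6, 0.3])        # ln A, elasticities
X = np.column_stack([np.ones(n), np.log(land), np.log(labor)])
ln_y = X @ true_beta + rng.normal(0.0, 0.05, n)

# log-linear OLS recovers the output elasticities of land and labor
beta = np.linalg.lstsq(X, ln_y, rcond=None)[0]
```

In the study's terms, the significant positive land elasticity is what implies that expanding the wheat area would raise output; the frontier version would attribute the residual partly to noise and partly to technical inefficiency.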

Relevance: 80.00%

Abstract:

This doctoral thesis presents the development, verification and application of an original statistical downscaling method for generating local future climate scenarios of daily temperature and precipitation, which combines two steps. The first step is an analogue method: the "n" days whose low-resolution atmospheric configuration is most similar to that of the day to be downscaled are selected from a reference archive of past days. In the second step, a multiple regression analysis over the "n" most analogous days is performed for temperature, whereas for precipitation the probability distribution of those "n" analogous days is used to obtain the estimate. Verification of the method has been carried out for peninsular Spain and the Balearic Islands. Results show good performance for temperature (BIAS close to 0.1ºC and mean absolute error around 1.9ºC) and acceptable skill for precipitation (reasonably low BIAS with a mean of -18%; mean absolute error lower than for a reference simulation, i.e. persistence; and a simulated probability distribution similar to the observed one according to two non-parametric tests of similarity).
To show the applicability of the method, a case study was analyzed in detail. The method was applied to four climate models under different future greenhouse-gas emission scenarios for the region of Aragón, thus producing future projections of daily precipitation and maximum and minimum temperatures. The reliability of the downscaling technique was re-assessed for the case study through a verification process. To determine the ability of the climate models to simulate the real climate, their simulations of the past (the so-called 20C3M output) were downscaled and then compared with the observed climate; the results are quite robust for temperature and less conclusive for precipitation. The downscaled future projections exhibit a significant increase throughout the entire 21st century in the maximum and minimum temperatures for all the future emission scenarios considered, whereas the precipitation simulations exhibit greater uncertainties. Furthermore, the practical applicability of the method was also demonstrated by using it to produce future climate scenarios for other case studies in different sectors and regions of the world. Special attention was paid to an application in Central America, a region that is already suffering significant climate change impacts and whose climate is very different from those where the method had previously been applied.
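The two-step scheme described above can be illustrated with a minimal sketch on synthetic data. Everything here is an assumption for illustration, not the thesis implementation: the archive sizes, the Euclidean analogue distance, and the use of the analogue median for precipitation are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic reference archive of past days: low-resolution atmospheric
# patterns plus the local temperature and precipitation observed each day.
n_days, n_features = 500, 6
archive_fields = rng.normal(size=(n_days, n_features))
archive_temp = archive_fields[:, 0] * 2 + rng.normal(scale=0.5, size=n_days)
archive_precip = rng.gamma(shape=1.2, scale=3.0, size=n_days)

target_field = rng.normal(size=n_features)  # the day to be downscaled

# Step 1 (analogue): select the n days whose low-resolution atmospheric
# configuration is closest to the target day (Euclidean distance here).
n = 30
dist = np.linalg.norm(archive_fields - target_field, axis=1)
analog_idx = np.argsort(dist)[:n]

# Step 2a (temperature): multiple linear regression fitted on the n analogues.
X = np.column_stack([np.ones(n), archive_fields[analog_idx]])
coef, *_ = np.linalg.lstsq(X, archive_temp[analog_idx], rcond=None)
temp_estimate = np.concatenate([[1.0], target_field]) @ coef

# Step 2b (precipitation): use the empirical distribution of the n
# analogues; the median is one simple summary of that distribution.
precip_estimate = np.median(archive_precip[analog_idx])
print(temp_estimate, precip_estimate)
```

The verification step of the abstract (BIAS, mean absolute error, non-parametric similarity tests) would then compare such estimates against held-out observations.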

Relevância:

80.00%

Publicador:

Resumo:

This paper presents the experimental results obtained by applying frequency-domain structural health monitoring techniques to assess the damage suffered by a special type of damper called the Web Plastifying Damper (WPD). The WPD is a hysteretic-type energy dissipator recently developed for the passive control of structures subjected to earthquakes. It consists of several I-section steel segments connected in parallel. The energy is dissipated through plastic deformations of the web of the I-sections, which constitute the dissipative parts of the damper. WPDs were subjected to successive histories of dynamically imposed cyclic deformations of increasing magnitude on the shaking table of the University of Granada. To assess the damage to the web of the I-section steel segments after each loading history, a new damage index called the Area Index of Damage (AID) was obtained from simple vibration tests. The vibration signals were acquired by means of piezoelectric sensors attached to the I-sections, and non-parametric statistical methods were applied to calculate the AID in terms of changes in frequency response functions. The AID was correlated with another, energy-based damage index, ID, which past research has proven to characterize the level of mechanical damage accurately. The ID is rooted in the decomposition of the load-displacement curve experienced by the damper into the so-called skeleton and Bauschinger parts. The ID predicts the level of damage and the proximity to failure of the damper accurately, but it requires costly instrumentation. The experiments reported in this paper demonstrate a good correlation between AID and ID in a realistic seismic loading scenario consisting of dynamically applied arbitrary cyclic loads. Based on this correlation, it is possible to estimate ID indirectly from the AID, which calls for much simpler and less expensive instrumentation.
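The idea of a damage index built from changes in frequency response functions can be sketched as follows. This is a minimal, hedged illustration on synthetic single-degree-of-freedom FRFs: the exact AID definition in the paper is not given in the abstract, so the area-based index below (integrated absolute FRF change normalized by the baseline area) is an assumption in the same spirit, not the published formula.

```python
import numpy as np

freq = np.linspace(0.0, 200.0, 1000)  # frequency axis, Hz

def frf_magnitude(fn, zeta=0.02):
    """|FRF| of a single-DOF oscillator with natural frequency fn (Hz)."""
    r = freq / fn
    return 1.0 / np.sqrt((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)

# Synthetic stand-ins for measured FRFs of the I-section web: damage
# (plastic deformation, stiffness loss) shifts the resonance downwards.
baseline = frf_magnitude(fn=80.0)   # undamaged state
damaged = frf_magnitude(fn=72.0)    # after a loading history

# Area-based index (discrete approximation on a uniform frequency grid):
# zero when the FRF is unchanged, growing with the accumulated change.
aid = np.sum(np.abs(damaged - baseline)) / np.sum(baseline)
print(round(aid, 3))
```

In the experiments described above, such an index would then be calibrated against the energy-based ID obtained from the load-displacement curve, so that damage can be estimated from vibration measurements alone.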