848 results for Native Vegetation Condition, Benchmarking, Bayesian Decision Framework, Regression, Indicators
Abstract:
Accurate quantitative estimation of exposure from retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed from published exposure databases and their corresponding exposure determinants. These models are designed to be applied to exposure determinants reported by study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might stem from the limited applicability of traditional statistical methods and concepts, this study proposed and explored the use of data analysis methods derived from computer science, predominantly machine learning approaches. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes from literature-derived databases, and to compare, using cross-validation and data-splitting techniques, their prediction capacity with that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was expressed as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure was expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride and trichloroethylene were used. When compared with regression estimates, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimates based on literature-based databases using machine learning techniques might provide an advantage when applied in methodologies that combine 'expert inputs' with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to estimate exposures more accurately from literature-based exposure databases might represent a starting point toward independence from expert judgment.
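As a rough illustration of the kind of comparison described above (not the study's actual pipeline or data), the following sketch contrasts a tree-ensemble classifier with a multinomial logistic regression for categorical exposure ratings using cross-validation; the feature matrix, rating labels and model settings are hypothetical stand-ins for literature-derived exposure determinants.

```python
# Minimal sketch: compare an ensemble classifier with a logistic regression
# for AIHA-style exposure ratings using k-fold cross-validation.
# All data below are synthetic placeholders, not the study's databases.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))        # e.g., determinants: year, task, ventilation, ...
y = rng.integers(0, 4, size=300)     # exposure rating categories 0-3

for name, model in [
    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("multinomial logistic regression", LogisticRegression(max_iter=1000)),
]:
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```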
Abstract:
This thesis addresses on-road vehicle detection and tracking with a monocular vision system. This problem has attracted the attention of the automotive industry and the research community, as it is the first step towards driver assistance and collision avoidance systems and, ultimately, autonomous driving. Although much effort has been devoted to it in recent years, no fully satisfactory solution has yet been devised and it therefore remains an open research issue. The main challenges for vision-based vehicle detection and tracking are the high variability among vehicles, the dynamically changing background due to camera motion, and the real-time processing requirement. In this thesis, a unified approach based on statistical methods is presented for vehicle detection and tracking that tackles these issues. The approach is divided into three primary tasks, i.e., vehicle hypothesis generation, hypothesis verification, and vehicle tracking, which are performed sequentially. Nevertheless, the exchange of information between processing blocks is fostered so that the maximum degree of adaptation to changes in the environment can be achieved and the computational cost is alleviated. Two complementary strategies are proposed to address the first task, hypothesis generation, based respectively on appearance and geometry analysis. To this end, the use of a rectified domain in which the perspective is removed from the original image is especially interesting, as it allows for fast image scanning and coarse hypothesis generation. The final vehicle candidates are produced using a collaborative framework between the original and the rectified domains. A supervised classification strategy is adopted for the verification of the hypothesized vehicle locations. In particular, state-of-the-art methods for feature extraction are evaluated and new descriptors are proposed by exploiting knowledge of vehicle appearance. Due to the lack of appropriate public databases, a new database has been generated and the classification performance of the descriptors is extensively tested on it. Finally, a methodology for the fusion of the different classifiers is presented and the best combinations are discussed. The core of the proposed approach is a Bayesian tracking framework using particle filters. Contributions are made to its three key elements: the inference algorithm, the dynamic model and the observation model. In particular, the use of a Markov chain Monte Carlo method is proposed for sampling, which circumvents the exponential complexity increase of traditional particle filters and thus makes joint multiple-vehicle tracking affordable. In addition, the aforementioned rectified domain allows for the definition of a constant-velocity dynamic model, since it preserves the smooth motion of vehicles on highways. Finally, a multiple-cue observation model is proposed that not only accounts for vehicle appearance but also integrates the available information from the analysis in the previous blocks. The proposed approach runs in near real time on a general-purpose PC and delivers outstanding results compared to traditional methods.
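For illustration only, the sketch below shows a standard single-vehicle bootstrap particle filter with a constant-velocity dynamic model in a rectified (bird's-eye) domain. The thesis itself proposes an MCMC-based sampler and a multiple-cue observation model for joint multi-vehicle tracking, so the Gaussian likelihood, noise levels and the observation used here are placeholder assumptions.

```python
# Simplified single-vehicle bootstrap particle filter in a rectified domain.
# State: [x, y, vx, vy] (metres, metres/second); all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N = 500                                   # number of particles
dt = 0.1                                  # frame interval (s), assumed
particles = rng.normal([0.0, 0.0, 20.0, 0.0], [1.0, 0.5, 2.0, 0.5], size=(N, 4))

F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])             # constant-velocity transition

def predict(p):
    noise = rng.normal(scale=[0.2, 0.2, 0.5, 0.5], size=p.shape)
    return p @ F.T + noise

def weight(p, z, sigma=1.0):
    # Placeholder Gaussian likelihood around a measured (x, y) position
    d2 = ((p[:, :2] - z) ** 2).sum(axis=1)
    w = np.exp(-0.5 * d2 / sigma**2)
    return w / w.sum()

z = np.array([2.1, 0.3])                  # hypothetical observation
particles = predict(particles)
w = weight(particles, z)
estimate = (particles * w[:, None]).sum(axis=0)
particles = particles[rng.choice(N, size=N, p=w)]   # multinomial resampling
print("posterior mean state:", estimate)
```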
Abstract:
The geotechnical design of stone columns involves all the uncertainties associated with a geotechnical project, and in addition the uncertainties inherent to the complex interaction between the soil and the column, the installation of the materials and the characteristics of the final (as-built) column must be considered. This is common to other ground treatments aimed, in general, at "deep" soil improvement. Since reliability methods (e.g., FORM, SORM, Monte Carlo, directional simulation) deal with uncertainty in a much more consistent and rational way than the traditional safety factor, recent interest has arisen in the application of reliability techniques to geotechnical engineering, although their specific application to ground improvement projects is not as extensive. In this thesis, reliability techniques have been applied to some aspects of stone column design (settlement estimation, consolidation times and increased bearing capacity) in order to carry out a rational analysis of the design process, considering the effects of uncertainty and variability on the safety of the design, i.e., on the probability of failure. To achieve this goal, an advanced analytical method due to Castro and Sagaseta (2009), which significantly improves the prediction of the variables involved in the design of the treatment and their evolution in time (consolidation), has been employed. The thesis studies the settlement problem (amount and consolidation time) in the context of uncertainty, analysing two failure modes: i) the first mode represents the situation in which it is possible to complete primary consolidation of the improved ground, partially or totally, before construction of the final structure, either by means of a preload or because the load can be applied gradually without affecting the structure or installation; and ii) the second mode implies that the improved ground is loaded from the initial instant with the final structure or installation, and it is checked that the final settlement (after primary consolidation) is small enough to be considered admissible. To work with realistic values of the geotechnical parameters, data were obtained from a real site improved with stone columns, thus producing a more rigorous reliability analysis. The most important conclusion obtained from the analysis of this particular case is the need to preload the soil improved with stone columns so that settlement occurs in advance, before application of the load corresponding to the final structure. Otherwise, the probability of failure is very high, even when the deterministic safety margin might appear sufficient. With respect to the bearing capacity of the columns, there are numerous calculation methods and load tests (both field and laboratory) that give differing predictions of the ultimate capacity of stone columns. For infinite column grids, the results of the reliability analysis confirmed the existing theoretical and experimental considerations that no stability failure occurs, yielding a practically negligible probability of failure for this failure mode. However, when the bearing capacity of small groups of columns under footings is analysed in the context of uncertainty, for a case with typical geotechnical parameters, the probability of failure turns out to be quite high, above the thresholds usually admitted for Ultimate Limit States. Finally, the review of calculation methods and load test results for the isolated column has generated a database large enough to allow a Bayesian updating of the methods for calculating the bearing capacity of the isolated stone column. The Bayesian updating framework has proved useful for improving predictions of the ultimate load capacity of the column, allowing the parameters of the calculation model to be "updated" as additional load tests become available for a specific project. It is also a valuable tool for decision-making under uncertainty, since it makes it possible to compare the cost of additional tests with the cost of a possible failure and, consequently, to decide whether such tests should be carried out.
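As a hedged illustration of how a probability of failure can be obtained by Monte Carlo simulation for a settlement limit state, consider the sketch below. It is not the Castro and Sagaseta (2009) model; the settlement expression is a crude one-dimensional estimate and every parameter value and distribution is invented.

```python
# Illustrative Monte Carlo reliability sketch for a settlement limit state
# of stone-column-improved ground. All distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
q = 100.0                                  # applied load (kPa), assumed
H = 8.0                                    # treated layer thickness (m), assumed
s_allow = 0.10                             # allowable settlement (m), assumed

# Hypothetical random variables: soil modulus and settlement reduction factor
E_soil = rng.lognormal(mean=np.log(8000.0), sigma=0.3, size=n)   # kPa
beta = rng.lognormal(mean=np.log(0.5), sigma=0.15, size=n)       # improvement factor

settlement = beta * q * H / E_soil         # simplified 1-D settlement estimate
pf = np.mean(settlement > s_allow)         # probability of exceeding the limit
print(f"probability of failure ~ {pf:.4f}")
```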
Abstract:
PURPOSE The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Recurrently, decision makers face the dichotomous question of whether to follow a traditional sequential decision-making process, where the output of one decision is used as the input of the next stage, or a joint decision-making approach, where several decisions are taken simultaneously. The implications of the decision-making process affect different players in the organization, and the choice of decision-making approach remains difficult, even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal for academics and practitioners. Management scientists use different techniques and approaches to improve different types of decisions; the purpose of these decisions is to use the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts may help to solve the managerial problems faced every day in different companies. As a result of this research, different decision models are presented as a contribution to the body of knowledge of management science. The first models focus on the manufacturing industry and the second group on the health care industry. Although these models are case specific, they serve to show that different approaches to the problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems; moreover, the same model may deliver good results with certain data and poor results with other data. A framework for analysing the data before selecting the model to be used is therefore presented and tested on the models developed to exemplify these ideas. METHODOLOGY As the first step of the research, a systematic literature review on joint decision making is presented, together with the opinions and suggestions of different scholars. In the next stage of the thesis, the decision-making processes of more than 50 companies from different sectors were analysed in the production planning area at the job-shop level; the data were obtained through surveys and face-to-face interviews. The following part of the research was carried out in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of the car assembly line, combining the vehicle routing problem with the inventory problem. The next step was to add the car production scheduling (car sequencing) decision and to use metaheuristics such as ant colony optimization and genetic algorithms to assess whether the behaviour holds for problems of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to perform the work and a job schedule has to be produced; for this problem, however, simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the scheduling of operating rooms was analysed. The first part of this research was carried out in a teaching hospital, and in the second part the interaction with uncertainty was added. Once the previous problem had been analysed, a general framework to characterize the instances was built. In the final chapter a general conclusion is presented. FINDINGS AND PRACTICAL IMPLICATIONS The first contribution is an update of the decision-making literature review, together with an analysis of the possible savings resulting from a change in the decision process. The survey results then reveal a lack of consistency between what managers believe and the actual degree of integration of their decisions. In the next stage of the thesis, a contribution is made to the body of knowledge of operations research with the joint solution of the replenishment, sequencing and inventory problem on the assembly line, together with parallel work on operating room scheduling, where different solution approaches are presented. Beyond the contribution of the solution methods themselves, the main contribution is the framework proposed to pre-evaluate the problem before choosing the techniques to solve it. There is, however, no straightforward answer as to whether joint or sequential solutions are better. Following the proposed framework and evaluating factors such as the flexibility of the answer, the number of actors and the tightness of the data gives important hints as to the most suitable direction for tackling the problem. RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH In the first part of the work it was very difficult to calculate the possible savings of different projects, since in many papers these quantities are not reported or the impact is based on non-quantifiable benefits; another issue is the confidentiality of many projects whose data cannot be presented. For the car assembly line problem, more computational power would allow larger instances to be solved. For the operating room problem there was a lack of historical data with which to perform a parallel analysis in the teaching hospital. In order to keep testing the decision framework it is necessary to apply it to more case studies, so as to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities since, despite the recent awareness of the need to improve the decision-making process, there is still much room for improvement; another big difference from the automotive industry is that recent improvements are not spread among all the actors. Therefore, future research will focus more on collaboration between academia and the health care sector.
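As a toy illustration of the metaheuristic idea mentioned above (it is not the thesis's ant colony or genetic algorithm implementation), the following sketch uses a simple mutation-and-selection evolutionary search on a car-sequencing-style objective; the fleet mix, adjacency rule and parameters are all hypothetical.

```python
# Toy evolutionary metaheuristic for a car-sequencing-style objective:
# spread option-cars so that no two are adjacent. Purely illustrative.
import random

random.seed(3)
base = [1] * 10 + [0] * 20                 # 1 = car needing the option

def cost(seq):
    # number of adjacent pairs of option-cars (to be minimised)
    return sum(seq[i] and seq[i + 1] for i in range(len(seq) - 1))

def mutate(seq):
    s = seq[:]
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]                # swap keeps the car mix feasible
    return s

population = [random.sample(base, len(base)) for _ in range(30)]
for _ in range(200):
    population += [mutate(random.choice(population)) for _ in range(30)]
    population.sort(key=cost)
    population = population[:30]           # keep the best sequences

print("best violation count:", cost(population[0]))
```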
Abstract:
Globally, increasing demand for biofuels has intensified the rate of land-use change (LUC) for the expansion of bioenergy crops. In Brazil, the world's largest sugarcane-ethanol producer, the sugarcane area has expanded by 35% (3.2 Mha) in the last decade. Sugarcane expansion has resulted in extensive pastures being subjected to intensive mechanization and large inputs of agrochemicals, with direct implications for soil quality (SQ). We hypothesized that the LUC supporting sugarcane expansion leads to overall SQ degradation. To test this hypothesis we conducted a field study at three sites in the central-southern region to assess the SQ response to the primary LUC sequence (i.e., native vegetation to pasture to sugarcane) associated with sugarcane expansion in Brazil. At each land-use site, undisturbed and disturbed soil samples were collected from the 0-10, 10-20 and 20-30 cm depths. Soil chemical and physical attributes were measured through on-farm and laboratory analyses, and a dataset of soil biological attributes was also included in this study. Initially, the LUC effects on each individual soil indicator were quantified; afterwards, the LUC effects on overall SQ were assessed using the Soil Management Assessment Framework (SMAF). Furthermore, six SQ indices (SQI) were developed using approaches of increasing complexity. Our results showed that long-term conversion from native vegetation to extensive pasture led to soil acidification, significant depletion of soil organic carbon (SOC) and macronutrients [especially phosphorus (P)] and severe soil compaction, which creates an unbalanced ratio between water- and air-filled pore space within the soil and increases mechanical resistance to root growth. Conversion from pasture to sugarcane improved soil chemical quality by correcting acidity and increasing macronutrient levels. Despite those improvements, most of the P added by fertilizer accumulated in less plant-available P forms, confirming the key role organic P plays in providing available P to plants in Brazilian soils. Long-term sugarcane production subsequently led to further SOC depletion. Sugarcane production had slight negative impacts on soil physical attributes compared with pasture land. Although the tillage performed for sugarcane planting and replanting alleviates soil compaction, our data suggest that the effects are short-lived, with persistent, recurring soil consolidation that increases erosion risk over time. These soil physical changes induced by LUC were detected by quantitative soil physical properties as well as by visual evaluation of soil structure (VESS), an on-farm, user-friendly method for evaluating SQ. The SMAF efficiently detected the overall SQ response to LUC and can be reliably used under Brazilian soil conditions. Furthermore, all of the SQI values developed in this study were able to rank SQ among land uses. We therefore recommend that simpler and more cost-effective SQI strategies, using a small number of carefully chosen soil indicators (such as pH, P, K, VESS and SOC) and proportional weighting within each soil sector (chemical, physical and biological), be used as a protocol for SQ assessment in Brazilian sugarcane areas. The SMAF and SQI scores suggested that long-term conversion from native vegetation to extensive pasture depleted overall SQ, driven by decreases in chemical, physical and biological indicators. In contrast, conversion from pasture to sugarcane had no negative impact on overall SQ, mainly because chemical improvements offset negative impacts on biological and physical indicators. Therefore, our findings can be used as a scientific basis by farmers, extension agents and public policy makers to adopt and develop management strategies that sustain and/or improve SQ and the sustainability of sugarcane production in Brazil.
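A minimal sketch of the simpler weighted-additive SQI strategy recommended above is given below. The scoring curves, thresholds and measured values are hypothetical, and the assignment of SOC to the biological sector is an assumption made purely for illustration.

```python
# Minimal weighted-additive soil quality index (SQI): indicators are scored
# 0-1 and weighted proportionally within chemical, physical and biological
# sectors. All thresholds and measured values below are hypothetical.
def more_is_better(x, x_min, x_max):
    return max(0.0, min(1.0, (x - x_min) / (x_max - x_min)))

def less_is_better(x, x_min, x_max):
    return 1.0 - more_is_better(x, x_min, x_max)

measured = {"pH": 5.8, "P": 12.0, "K": 80.0, "VESS": 3.2, "SOC": 14.0}

scores = {
    "pH":   more_is_better(measured["pH"], 4.5, 6.5),
    "P":    more_is_better(measured["P"], 5.0, 30.0),     # mg dm-3
    "K":    more_is_better(measured["K"], 30.0, 120.0),   # mg dm-3
    "SOC":  more_is_better(measured["SOC"], 5.0, 25.0),   # g kg-1
    "VESS": less_is_better(measured["VESS"], 1.0, 5.0),   # structural score, lower is better
}

sectors = {"chemical": ["pH", "P", "K"], "physical": ["VESS"], "biological": ["SOC"]}
sqi = sum((1 / len(sectors)) * sum(scores[i] for i in inds) / len(inds)
          for inds in sectors.values())
print(f"SQI = {sqi:.2f}  (0 = degraded, 1 = high quality)")
```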
Abstract:
The aim of this thesis was to evaluate the historical change of the landscape of Madeira Island and to assess spatial and temporal vegetation dynamics. In the current research, diverse "retrospective techniques", such as repeat landscape photography, dendrochronology and research of historical records, were used. These, combined with vegetation relevés, aimed to gather information about landscape change, disturbance history and vegetation successional patterns. It was found that landscape change over 125 years was greatest in the last five decades, mainly driven by farming abandonment, building growth and an increase in exotic vegetation cover. Pristine vegetation has been greatly destroyed since early settlement, and by the end of the nineteenth century native vegetation was highly devastated due to recurrent anthropogenic disturbances. These actions also helped to block plant succession and to modify floristic assemblages, affecting species richness as well. In places with less hemeroby, although significant growth of vegetation of lower seral stages was detected, the vegetation of the most mature stages headed towards an imbalance between recovery and loss, being also very vulnerable to encroachment by exotic species. Recovery by native vegetation also occurred in areas formerly occupied by exotic plants and agriculture, but it was almost negligible. Vegetation recovery followed the successional model currently proposed, thereby supporting the model itself; yet succession was slower than expected, due to the lack of favourable conditions and to recurrent disturbances. The probable duration of each seral stage was obtained from growth rates of woody taxa estimated through dendrochronology. The exotic trees that were dominant in the past (Castanea sativa and Pinus pinaster) have almost vanished. Eucalyptus globulus, currently the main tree of the exotic forest, is being replaced by other cover types such as Acacia mearnsii. The latter, along with Arundo donax, Cytisus scoparius and Pittosporum undulatum, are currently the exotic species with the most invasive behaviour. However, many other exotic species have also proved to be highly invasive and, together with those referred to above, prevent native vegetation regeneration, diminish biological diversity, and block early successional phases, delaying native forest recovery.
Abstract:
Bayesian decision theory is increasingly applied to support decision-making processes under environmental variability and uncertainty. Researchers from application areas like psychology and biomedicine have applied these techniques successfully. However, in the area of software engineering, and specifically in the area of self-adaptive systems (SASs), little progress has been made in the application of Bayesian decision theory. We believe that techniques based on Bayesian Networks (BNs) are useful for systems that dynamically adapt themselves at runtime to a changing environment, which is usually uncertain. In this paper, we discuss the case for the use of BNs, specifically Dynamic Decision Networks (DDNs), to support the decision-making of self-adaptive systems. We present how such a probabilistic model can be used to support decision making in SASs and justify its applicability. We have applied our DDN-based approach to the case of an adaptive remote data mirroring system. We discuss results, implications and potential benefits of the DDN in enhancing the development and operation of self-adaptive systems, by providing mechanisms to cope with uncertainty and automatically make the best decision.
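A minimal one-step sketch of the decision logic behind such a model is shown below in plain Python (no BN library): a belief over a hidden environment state is updated with Bayes' rule, and the adaptation with the highest expected utility is chosen. The states, observation model and utility values are hypothetical and merely echo the remote data mirroring example.

```python
# One-step decision sketch in the spirit of a dynamic decision network.
# Prior belief over the hidden environment state (hypothetical numbers).
belief = {"link_reliable": 0.7, "link_degraded": 0.3}

# P(observation = "packet_loss_high" | state), assumed sensor model
likelihood = {"link_reliable": 0.1, "link_degraded": 0.8}

# Bayesian update after observing high packet loss
evidence = sum(likelihood[s] * belief[s] for s in belief)
belief = {s: likelihood[s] * belief[s] / evidence for s in belief}

# Utility of each adaptation in each state (hypothetical trade-off between
# data protection and bandwidth cost)
utility = {
    "synchronous_mirroring":  {"link_reliable": 6.0, "link_degraded": 8.0},
    "asynchronous_mirroring": {"link_reliable": 9.0, "link_degraded": 2.0},
}

expected = {a: sum(utility[a][s] * belief[s] for s in belief) for a in utility}
best = max(expected, key=expected.get)
print(belief, expected, "->", best)
```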
Abstract:
An inference task is one in which some known set of information is used to produce an estimate about an unknown quantity. Existing theories of how humans make inferences include specialized heuristics that allow people to make these inferences in familiar environments quickly and without unnecessarily complex computation. Specialized heuristic processing may be unnecessary, however; other research suggests that the same patterns in judgment can be explained by existing patterns in encoding and retrieving memories. This dissertation compares and attempts to reconcile three alternative explanations of human inference. After justifying three hierarchical Bayesian versions of existing inference models, the three models are compared on simulated, observed, and experimental data. The results suggest that the three models capture different patterns in human behavior but, based on posterior prediction using laboratory data, potentially ignore important determinants of the decision process.
Abstract:
The main goal of the LISA Pathfinder (LPF) mission is to estimate the acceleration noise models of the overall LISA Technology Package (LTP) experiment on board. This will be of crucial importance for future space-based gravitational-wave (GW) detectors, such as eLISA. Here, we present the Bayesian analysis framework used to process the planned system identification experiments designed for that purpose. In particular, we focus on the analysis strategies to predict the accuracy of the parameters that describe the system in all degrees of freedom. The data sets were generated during the latest operational simulations organised by the data analysis team, and this work is part of the LTPDA Matlab toolbox.
Abstract:
Nitrogen is one of the nutrients most demanded by plant species; its presence in the soil, in organic or mineral forms available to plants, is linked to the quality and quantity of the plant residues added to the soil. This study aimed to evaluate the influence of eucalyptus and acacia cultivation on the composition of organic and inorganic N forms and on the natural abundance of 15N in an Argissolo Amarelo. To this end, soil and litter samples were collected in short-rotation (seven-year) monocultures of Eucalyptus urograndis (a clone of Eucalyptus urophylla S. T. Blake x Eucalyptus grandis W. Hill ex Spreng), in rotation systems with acacia (Acacia mangium Willd.) following eucalyptus monoculture, in a long-rotation (24-year) eucalyptus monoculture, and in native forest (Mata Atlântica) representing the original soil condition of the northern coast of Espírito Santo. Total organic C, total N, N-NH4+, N-NO3-, the C/N ratio, organic N fractionation and the natural abundance of 15N in soil and litter were evaluated. Among the hydrolysed organic N forms, amino-N was the fraction with the largest contribution (39%), followed by unidentified N (27%), amide-N (18%) and hexosamine-N (15%). The acacia stand showed lower natural abundance of 15N and higher total N and organic C contents in the soil, and increased the hydrolysed organic N forms, compared with the short-rotation eucalyptus stands. This indicates an increase in labile organic N forms available to plants and a reduction in the humification of soil organic matter (SOM) under acacia. In this sense, the rotation of forest crops with acacia after short-rotation eucalyptus contributed to an increase in soil organic N forms that are important for plant nutrition, as they are potential sources of nutrients to plants over a short period of time.
Abstract:
The influences of clearing native vegetation (Caatinga) in contour strips at a 25 cm vertical interval on evaporation losses in the cleared strips, annual runoff efficiency and annual soil loss on gently sloping micro-watersheds in the arid zones of Northeast Brazil are reported. The alternating strips of native vegetation (Caatinga) function very effectively as windbreaks, substantially reducing evaporation losses in the leeward cleared strips. The runoff measured at the micro-watershed with cleared strips was many-fold lower than the runoff obtained at a completely denuded watershed, even when the latter was protected by narrow-based channel terraces. However, the annual runoff efficiency can be significantly increased in a strip-cleared watershed if narrow-based channel terraces are provided on the lower side of the cleared strips. The annual soil losses in strip-cleared watersheds, as well as in the completely denuded watershed of gentle slope, were negligible. Thus, clearing land in alternating contour strips on a micro-watershed should substantially improve crop water-use efficiency without creating any significant erosion problems. Additionally, this treatment will increase runoff for water harvesting for irrigation purposes.
Abstract:
Master's dissertation, Universidade de Brasília, Centro de Desenvolvimento Sustentável, Programa de Pós-Graduação em Desenvolvimento Sustentável, 2016.
Evaluation of the risk factors contributing to the African swine fever occurrence in Sardinia, Italy
Abstract:
This study assesses the relationship between hypothesized risk factors and African swine fever virus (ASFV) distribution in Sardinia (Italy) after the beginning of the eradication program in 1993, using a Bayesian multivariable logistic regression mixed model. Results indicate that the probability of ASFV occurrence in Sardinia was associated with particular socio-cultural, productive and economic factors found in the region, particularly a large number of confined (i.e., closed) farms (most of them backyard), high road density, high mean altitude, a large number of open fattening farms, and a large number of pigs per commune. Conversely, a large proportion of open farms with at least one census and a large proportion of open farms per commune were found to be protective factors against ASFV. Results suggest that basic preventive and control strategies, such as a yearly census or registration of the pigs on each farm and better control of the public lands where pigs are usually raised, together with enhanced outreach and communication efforts with pig producers, should contribute to the success of the eradication program for ASF on the island. The methods and results presented here will inform decision-making to better control and eradicate ASF in Sardinia and in all those areas with similar management and epidemiological conditions.
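For readers unfamiliar with the model class, the following sketch fits a Bayesian logistic regression with a commune-level random intercept (a mixed model) to simulated data using PyMC; it is not the study's actual specification, covariates, priors or dataset.

```python
# Illustrative Bayesian logistic regression mixed model with a commune-level
# random intercept, on simulated data. Covariates and priors are assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(4)
n, n_communes = 400, 30
commune = rng.integers(0, n_communes, size=n)
X = rng.normal(size=(n, 3))                # e.g., road density, altitude, farm counts
true_u = rng.normal(0, 0.5, size=n_communes)
logit_p = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] + true_u[commune]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

with pm.Model():
    beta = pm.Normal("beta", 0.0, 1.0, shape=3)          # fixed effects (risk factors)
    sigma_u = pm.HalfNormal("sigma_u", 1.0)
    u = pm.Normal("u", 0.0, sigma_u, shape=n_communes)   # commune random intercepts
    eta = pm.math.dot(X, beta) + u[commune]
    pm.Bernoulli("asf_present", logit_p=eta, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=4)

print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```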