890 results for phylogeography, consensus approach, ensemble modeling, Pleistocene, ENM, ecological niche modeling


Relevance:

30.00%

Publisher:

Abstract:

This thesis covers various aspects of the modeling and analysis of finite-mean time series with symmetric stable innovations. Time series analysis based on Box-Jenkins methods is the most popular approach, in which the models are linear and the errors are Gaussian. We highlight the limitations of classical time series tools, explore some generalized tools, and organize the approach in parallel to the classical setup. The thesis mainly studies the estimation and prediction of signal-plus-noise models, where both the signal and the noise are assumed to follow models with symmetric stable innovations. The thesis opens with motivating examples and application areas of alpha-stable time series models. Classical time series analysis and the corresponding theory based on finite-variance models are discussed extensively in the second chapter, which also surveys the existing theory and methods for infinite-variance models. The third chapter presents a linear filtering method for computing the filter weights assigned to the observations when estimating an unobserved signal in a general noisy environment; here both the signal and the noise are treated as stationary processes with infinite-variance innovations. Semi-infinite, doubly infinite and asymmetric signal-extraction filters are derived under a minimum-dispersion criterion. Finite-length filters based on Kalman-Levy filtering are developed, and the pattern of the filter weights is identified. Simulation studies show that the proposed methods are competitive for signal extraction in processes with infinite variance. Parameter estimation for autoregressive signals observed in symmetric stable noise is discussed in the fourth chapter, using higher-order Yule-Walker-type estimation based on the auto-covariation function; the methods are illustrated by simulation and by an application to sea surface temperature data.
We increase the number of Yule-Walker equations and propose an ordinary least squares estimate of the autoregressive parameters. The singularity problem of the auto-covariation matrix is addressed, and a modified version of the generalized Yule-Walker method is derived using singular value decomposition. The fifth chapter introduces the partial covariation function as a tool for stable time series analysis where the covariance or partial covariance is ill defined. Asymptotic results for the partial auto-covariation are studied, and its application to model identification for stable autoregressive models is discussed. We generalize the Durbin-Levinson algorithm to infinite-variance models in terms of the partial auto-covariation function and introduce a new information criterion for consistent order estimation of stable autoregressive models. Chapter six explores the application of these techniques in signal processing. Frequency estimation of sinusoidal signals observed in symmetric stable noise is discussed in this context. We introduce a parametric spectrum analysis and a frequency estimate based on the power transfer function, which is estimated using the modified generalized Yule-Walker approach. Another important problem in statistical signal processing is to identify the number of sinusoidal components in an observed signal; a modified version of the proposed information criterion is used for this purpose.
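The covariation-based Yule-Walker idea can be sketched for the simplest AR(1) case: the snippet below simulates a symmetric alpha-stable AR(1) series with the Chambers-Mallows-Stuck generator and recovers the coefficient with a signed-ratio (fractional-moment) estimator. The estimator and all names here are illustrative, not the exact formulation of the thesis.

```python
import math
import random

def sas_sample(alpha, rng):
    """One standard symmetric alpha-stable variate (Chambers-Mallows-Stuck)."""
    V = rng.uniform(-math.pi / 2, math.pi / 2)
    W = rng.expovariate(1.0)
    if abs(alpha - 1.0) < 1e-9:
        return math.tan(V)
    t = math.sin(alpha * V) / (math.cos(V) ** (1.0 / alpha))
    s = (math.cos(V - alpha * V) / W) ** ((1.0 - alpha) / alpha)
    return t * s

def simulate_ar1(phi, alpha, n, seed=0):
    """AR(1) series x_t = phi*x_{t-1} + e_t with SaS(alpha) innovations."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n + 100):            # 100 burn-in steps
        x = phi * x + sas_sample(alpha, rng)
        xs.append(x)
    return xs[100:]

def covariation_yw_ar1(xs):
    """Covariation-type ratio estimate of the AR(1) coefficient:
    phi ~ sum x_t*sign(x_{t-1}) / sum |x_{t-1}|,
    which stays finite even when the variance does not exist."""
    num = sum(x1 * math.copysign(1.0, x0) for x0, x1 in zip(xs, xs[1:]))
    den = sum(abs(x0) for x0 in xs[:-1])
    return num / den

xs = simulate_ar1(phi=0.6, alpha=1.5, n=20000)
print(round(covariation_yw_ar1(xs), 3))   # close to 0.6
```

For alpha > 1 the innovation mean exists, so the ratio converges to phi even though the classical autocovariance is undefined.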

Relevance:

30.00%

Publisher:

Abstract:

In safety-critical software, failure can have a high price; such software should be free of errors before it is put into operation. Applying formal methods in the software development life cycle helps ensure that software for safety-critical missions is ultra-reliable. The PVS theorem prover, a formal-methods tool, can be used for the formal verification of software written in the ADA Language for Flight Software Application (ALFA). This paper describes the modeling of ALFA programs for the PVS theorem prover. An ALFA2PVS translator is developed which automatically converts ALFA software into a PVS specification. With this approach the software can be formally verified with respect to underflow/overflow errors and divide-by-zero conditions without actually executing the code.
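The class of properties checked, possible overflow and division by zero established without executing the code, can be illustrated by a toy interval analysis in Python. This is only a hypothetical sketch of static range checking, not PVS or the ALFA2PVS translator itself.

```python
# Illustrative only: a toy interval analysis that flags potential overflow and
# division by zero in straight-line arithmetic without executing it, the kind
# of property an ALFA2PVS-style translation lets a theorem prover discharge.
INT16 = (-32768, 32767)                    # assumed machine integer range

def add(a, b):
    """Interval sum of two (lo, hi) ranges."""
    return (a[0] + b[0], a[1] + b[1])

def div(a, b):
    """Interval quotient; rejects any divisor range containing zero."""
    if b[0] <= 0 <= b[1]:
        raise ValueError("possible division by zero")
    cands = [a[0] / b[0], a[0] / b[1], a[1] / b[0], a[1] / b[1]]
    return (min(cands), max(cands))

def check_overflow(rng, machine=INT16):
    """Raise if the computed range can leave the machine range."""
    if rng[0] < machine[0] or rng[1] > machine[1]:
        raise ValueError("possible overflow")
    return rng

x = (0, 32000)          # declared range of input x
y = (1, 1000)           # declared range of input y
try:
    check_overflow(add(x, y))      # x + y may reach 33000 > 32767
except ValueError as e:
    print(e)                       # possible overflow
```

A prover does the same reasoning symbolically over the full program, with soundness guarantees this sketch does not claim.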

Relevance:

30.00%

Publisher:

Abstract:

This study assesses the Parambikulam Tiger Reserve of the Western Ghats using geospatial technology. The major objectives of the study are land use/land cover (LULC) mapping and phytodiversity analysis. Satellite data were used to map land use/land cover with supervised classification techniques in Erdas Imagine. Change over a 32-year period was assessed using multi-temporal satellite datasets from Landsat MSS (1973), Landsat TM (1990) and IRS P6 LISS III (2005). A geospatial approach was used for the land cover analysis, with digital elevation models, satellite imagery and SOI toposheets as the input datasets. Vegetation sampling plots distributed over the different forest types were enumerated and studied for the phytodiversity analysis.
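Post-classification change detection of the kind described reduces to cross-tabulating two classified rasters into a change matrix. A minimal sketch, with hypothetical class labels standing in for the real Landsat/LISS classifications:

```python
from collections import Counter

# Hypothetical 1973 and 2005 class maps (flattened rasters of class labels);
# cross-tabulating them gives the standard post-classification change matrix.
lulc_1973 = ["forest", "forest", "grass", "forest", "water", "grass"]
lulc_2005 = ["forest", "grass",  "grass", "crop",   "water", "crop"]

change = Counter(zip(lulc_1973, lulc_2005))
for (old, new), n in sorted(change.items()):
    print(f"{old:>6} -> {new:<6} {n} cells")
```

Off-diagonal entries (e.g. forest -> crop) quantify conversion; diagonal entries are persistence.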

Relevance:

30.00%

Publisher:

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land use patterns. An essential methodology for studying and quantifying such interactions is the use of land-use models, which make it possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of the driving forces. Modeling land use and land use change has a long tradition; on the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be used and manipulated by modeling applications; the scripting-language interpreter is embedded in SITE.
Sub-models can be integrated via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period 1981 to 2002; in addition, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. This case study showed that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be considered unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Because of its importance, model calibration was explicitly addressed in the SITE architecture through a dedicated component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map-comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, for which respective reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible; nevertheless, selecting the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
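The figure-of-merit measure used as the objective function can be sketched as follows. The three-way accounting (hits, misses, false alarms over change cells only) is the standard definition; the toy maps are hypothetical, and cells where the observed change was predicted but with the wrong category are counted among the misses here.

```python
def figure_of_merit(initial, observed, simulated):
    """Figure of merit for land-use change maps: agreement on *change* cells,
    hits / (misses + hits + false alarms). Persistence cells are ignored."""
    misses = hits = false_alarms = 0
    for ini, obs, sim in zip(initial, observed, simulated):
        obs_chg, sim_chg = obs != ini, sim != ini
        if obs_chg and sim_chg and obs == sim:
            hits += 1
        elif obs_chg:
            misses += 1          # observed change missed or wrongly predicted
        elif sim_chg:
            false_alarms += 1    # simulated change where none occurred
    denom = misses + hits + false_alarms
    return hits / denom if denom else 1.0

# Hypothetical maps: F = forest, A = agriculture
initial   = ["F", "F", "F", "A", "A", "F"]
observed  = ["A", "F", "A", "A", "A", "F"]
simulated = ["A", "A", "F", "A", "A", "F"]
print(figure_of_merit(initial, observed, simulated))   # 1 hit, 1 miss, 1 false alarm -> 1/3
```

A genetic algorithm then searches the model's parameter space for the set that maximizes this score against the reference map.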

Relevance:

30.00%

Publisher:

Abstract:

Enterprise Modeling (EM) is currently used either as a technique to represent and understand the structure and behavior of the enterprise, or as a technique to analyze business processes, and in many cases as a supporting technique for business process reengineering. However, EM architectures and methods for Enterprise Engineering can also be used to support newer management techniques such as SIX SIGMA, because these techniques need a clear, transparent and integrated definition and description of the enterprise's business activities in order to build up, optimize and operate a successful enterprise. The main goal of SIX SIGMA is to optimize the performance of processes. A still-open question is: what are adequate quality criteria and methods to ensure such performance, and what must we do to achieve quality governance? This paper describes a method combining an Enterprise Engineering method with a SIX SIGMA strategy to reach quality governance.

Relevance:

30.00%

Publisher:

Abstract:

This study describes a combined empirical/modeling approach to assess the possible impact of climate variability on rice production in the Philippines. We collated climate data for the last two decades (1985-2002) as well as yield statistics for six provinces of the Philippines, selected along a North-South gradient. Data from NASA's climate information system were used as input parameters of the model ORYZA2000 to determine potential yields and, in a next step, the yield gaps, defined as the difference between potential and actual yields. Both simulated and actual yields of irrigated rice varied strongly between years. However, no climate-driven trends were apparent, and the variability in actual yields showed no correlation with climatic parameters. The observed variation in simulated yields was attributable to seasonal variations in climate (dry/wet season) and to climatic differences between provinces and agro-ecological zones. The actual yield variation between provinces was related not to differences in the climatic yield potential but rather to soil and management factors. The resulting yield gap was largest in remote and infrastructurally disfavored provinces (low external input use) with a high production potential (high solar radiation and day-night temperature differences). In turn, the yield gap was lowest in central provinces with good market access but a relatively low climatic yield potential. We conclude that neither long-term trends nor the variability of the climate can explain current rice yield trends, and that agroecological, seasonal and management effects override any possible climatic variations. On the other hand, the lack of a climate-driven trend in the present situation may be superseded by ongoing climate change in the future.
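The yield-gap computation itself is simple arithmetic on simulated and observed yields. A minimal sketch with hypothetical province values, not the study's data:

```python
def yield_gap(potential, actual):
    """Yield gap in t/ha and as a share of the potential yield."""
    gap = potential - actual
    return gap, gap / potential

# Hypothetical provinces: potential yield from a crop-model run (t/ha),
# actual yield from agricultural statistics (t/ha).
provinces = {"remote province": (9.8, 3.1), "central province": (7.2, 5.0)}
for name, (pot, act) in provinces.items():
    gap, share = yield_gap(pot, act)
    print(f"{name}: gap {gap:.1f} t/ha ({share:.0%} of potential)")
```

The pattern in the study follows the same logic: a high potential with low actual yields gives a large gap, as in the remote high-potential provinces.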

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the strategies and techniques researched and implemented by the International Union for Conservation of Nature (IUCN) in villages in the vicinity of Doi Mae Salong in Chiang Rai Province, Thailand. The strategies revolve around the paradigm linking poverty alleviation, conservation and landscape restoration. IUCN and its partners specifically researched and implemented schemes directed toward diversification of the household economy through alternative and sustainable intensified agriculture techniques based on balancing conservation and livelihood objectives. The projects aimed to reduce poverty and build the resilience of smallholders through decentralised governance arrangements including land use planning schemes and stakeholder negotiation. Considering the agro-ecological system on a catchment-wide scale enhances the conceptual understanding of each component, collectively forming a landscape matrix with requisite benefits for biodiversity, smallholder livelihoods and ecosystem services. In particular, the role of enhancing ecosystem services and functions in building socio-ecological resilience to vulnerabilities such as climate and economic variability is paramount in the process.

Relevance:

30.00%

Publisher:

Abstract:

Worldwide, water managers are increasingly challenged to allocate sufficient and affordable water supplies to different water use sectors without further degrading river ecosystems and their valuable services to mankind. Since 1950, the human population has almost tripled, water abstractions have increased by a factor of four, and the number of large dams is about eight times higher today. From a hydrological perspective, the alteration of river flows (temporally and spatially) is one of the main consequences of global change, and further impairments can be expected given growing population pressure and projected climate change. The implications have been addressed in numerous hydrological studies, but with a clear focus on human water demands; ecological water requirements have often been neglected or addressed in a very simplistic manner, particularly from the large-scale perspective. With this PhD thesis, Christof Schneider took up the challenge of assessing the direct (dam operation and water abstraction) and indirect (climate change) impacts of human activities on river flow regimes and of evaluating the consequences for river ecosystems using a modeling approach. The global hydrology model WaterGAP3 (developed at CESR) was applied and further developed within the thesis to carry out several model experiments and assess anthropogenic river flow regime modifications and their effects on river ecosystems. To address the complexity of ecological water requirements, the assessment is based on three main ideas: (i) the natural flow paradigm, (ii) the perception that different flows have different ecological functions, and (iii) the flood pulse concept. The thesis shows that WaterGAP3 performs well in representing ecologically relevant flow characteristics on a daily time step, which justifies its application within this research field.
For the first time, a methodology was established to estimate bankfull flow globally on a 5 by 5 arc-minute grid cell raster; bankfull flow is a key parameter in eFlow assessments, as it marks the point where rivers hydraulically connect to adjacent floodplains. Dam management and water consumption pose a risk to floodplains and riparian wetlands as flood volumes are significantly reduced. The thesis highlights that almost one-third of 93 selected Ramsar sites are seriously affected by modified inundation patterns today, and inundation patterns are very likely to be further impaired in the future as a result of new major dam initiatives and climate change. Global warming has been identified as a major threat to river flow regimes, as rising temperatures, declining snow cover, changing precipitation patterns and increasing climate variability are expected to seriously modify river flow regimes in the future. Flow regimes in all climate zones will be affected, in particular the polar zone (northern Scandinavia), with higher river flows during the year and higher flood peaks in spring. On the other hand, river flows in the Mediterranean are likely to become even more intermittent because of strong reductions in mean summer precipitation as well as a decrease in winter precipitation, leading to an increasing number of zero-flow events that create isolated pools along the river and transitions from lotic to lentic waters. As a result, strong impacts on river ecosystem integrity can be expected. Already today, large amounts of water are withdrawn in this region for agricultural irrigation, and climate change is likely to exacerbate the current situation of water shortages.
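A simple indicator in the spirit of the natural flow paradigm is the percent deviation of mean monthly flows from the natural regime. A minimal sketch with hypothetical flows, not WaterGAP3 output:

```python
def flow_alteration(natural, altered):
    """Percent deviation of mean monthly flow from the natural regime,
    a simple flow-regime indicator (illustrative, not WaterGAP3 itself)."""
    return [100.0 * (a - n) / n for n, a in zip(natural, altered)]

# Hypothetical mean monthly flows (m^3/s), Jan..Dec: a reservoir stores the
# spring flood peak and releases it gradually through the rest of the year.
natural = [50, 55, 80, 140, 120, 70, 40, 30, 28, 35, 42, 48]
altered = [60, 62, 75,  90,  85, 65, 45, 38, 34, 40, 46, 52]
dev = flow_alteration(natural, altered)
print(f"spring (Apr) peak reduced by {-dev[3]:.0f}%")
```

Negative deviations in the flood months and positive ones in the low-flow months are the typical dam signature: flood pulses that connect river and floodplain are dampened.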

Relevance:

30.00%

Publisher:

Abstract:

This thesis develops an approach to the construction of multidimensional stochastic models for intelligent systems exploring an underwater environment. It describes methods for building models by a three-dimensional spatial decomposition of stochastic, multisensor feature vectors. New sensor information is incrementally incorporated into the model by stochastic backprojection. Error and ambiguity are explicitly accounted for by blurring a spatial projection of remote sensor data before incorporation. The stochastic models can be used to derive surface maps or other representations of the environment. The methods are demonstrated on data sets from multibeam bathymetric surveying, towed sidescan bathymetry, towed sidescan acoustic imagery, and high-resolution scanning sonar aboard a remotely operated vehicle.
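The incremental update, blurring a sensor return before accumulating it into the spatial grid, can be sketched in two dimensions (the thesis works in three); all names and values here are illustrative:

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalized 2-D Gaussian blur kernel modelling sensor uncertainty."""
    ks = range(-radius, radius + 1)
    k = [[math.exp(-(i * i + j * j) / (2 * sigma * sigma)) for j in ks] for i in ks]
    s = sum(map(sum, k))
    return [[v / s for v in row] for row in k]

def backproject(grid, hit, value, kernel):
    """Blur a single sensor return and accumulate it into the spatial grid:
    the stochastic-backprojection step, sketched in 2-D for brevity."""
    r = len(kernel) // 2
    hi, hj = hit
    for di in range(-r, r + 1):
        for dj in range(-r, r + 1):
            i, j = hi + di, hj + dj
            if 0 <= i < len(grid) and 0 <= j < len(grid[0]):
                grid[i][j] += value * kernel[di + r][dj + r]

grid = [[0.0] * 9 for _ in range(9)]
for hit in [(4, 4), (4, 5), (4, 4)]:   # repeated returns reinforce a cell
    backproject(grid, hit, 1.0, gaussian_kernel(2, 1.0))
print(round(grid[4][4], 3))
```

Consistent returns accumulate into sharp peaks while isolated outliers stay diffuse, which is how error and ambiguity are absorbed before a surface map is extracted.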

Relevance:

30.00%

Publisher:

Abstract:

Biological systems exhibit rich and complex behavior through the orchestrated interplay of a large array of components. It is hypothesized that separable subsystems with some degree of functional autonomy exist; deciphering their independent behavior and functionality would greatly facilitate understanding the system as a whole. Discovering and analyzing such subsystems are hence pivotal problems in the quest to gain a quantitative understanding of complex biological systems. In this work, methods for the identification and analysis of such subsystems were developed using approaches from machine learning, physics and graph theory. A novel methodology, based on a recent machine learning algorithm known as non-negative matrix factorization (NMF), was developed to discover such subsystems in a set of large-scale gene expression data. This set of subsystems was then used to predict functional relationships between genes, and the approach was shown to score significantly higher than conventional methods when benchmarked against existing databases. Moreover, a mathematical treatment was developed for simple network subsystems based only on their topology (independent of particular parameter values). Application to a problem of experimental interest demonstrated the need for extensions to the conventional model to fully explain the experimental data. Finally, the notion of a subsystem was evaluated from a topological perspective. A number of different protein networks were examined for their topological properties with respect to separability, seeking to find separable subsystems. These networks were shown to exhibit separability in a nonintuitive fashion, and the separable subsystems were of strong biological significance. It was demonstrated that the separability property found was not due to incomplete or biased data, but is likely to reflect biological structure.
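NMF itself can be sketched with the classical Lee-Seung multiplicative updates; the toy expression matrix below is hypothetical and far smaller than the gene expression data of the study.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - WH||_F.
    For a genes-x-samples matrix V, the k factors act as candidate
    'subsystems' (metagenes): columns of W group co-expressed genes."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1     # positive init keeps updates well-defined
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Toy data: two blocks of co-expressed genes across four samples
V = np.array([[5, 5, 0, 0],
              [5, 5, 0, 0],
              [0, 0, 4, 4],
              [0, 0, 4, 4]], dtype=float)
W, H = nmf(V, k=2)
print(np.round(W @ H, 1))   # close to V: the two blocks are recovered
```

Because all factors are non-negative, each factor describes an additive part of the data, which is what makes the recovered parts interpretable as subsystems.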

Relevance:

30.00%

Publisher:

Abstract:

This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the fast development of the approach over the last months, which makes previous drafts appear obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there might still be errors or unclear aspects, which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments.

Relevance:

30.00%

Publisher:

Abstract:

This paper sets out to identify the initial positions of the different decision makers who intervene in a group decision making process with a reduced number of actors, and to establish possible consensus paths between these actors. As methodological support, it employs one of the most widely known multicriteria decision techniques, the Analytic Hierarchy Process (AHP). Assuming that the judgements elicited by the decision makers follow the so-called multiplicative model (Crawford and Williams, 1985; Altuzarra et al., 1997; Laininen and Hämäläinen, 2003) with log-normal errors and unknown variance, a Bayesian approach is used to estimate the relative priorities of the alternatives being compared. These priorities, estimated by way of the median of the posterior distribution and normalised in a distributive manner (priorities add up to one), are a clear example of compositional data, and they are used in the search for consensus between the actors involved in the resolution of the problem through the use of multidimensional scaling tools.
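Under the multiplicative model with log-normal errors, a common point estimate of the priorities is the normalized row geometric mean of the pairwise-comparison matrix. The sketch below uses that classical estimate rather than the paper's full Bayesian posterior, and the judgement values are hypothetical.

```python
import math

def priorities(pairwise):
    """Row geometric means of a pairwise-comparison matrix, normalized so
    the priorities add to one: the classical point estimate under the
    multiplicative (log-normal error) model, not the paper's posterior."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical judgements of one actor: A vs B = 3, A vs C = 5, B vs C = 2
A = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     2.0],
     [1.0 / 5, 1.0 / 2, 1.0]]
p = priorities(A)
print([round(x, 3) for x in p])   # priorities sum to one
```

Because each actor's priority vector sums to one, the vectors are compositional data, which is why the consensus analysis proceeds in log-ratio coordinates.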

Relevance:

30.00%

Publisher:

Abstract:

We present a new approach to modeling and classifying breast parenchymal tissue. Given a mammogram, we first discover the distribution of the different tissue densities in an unsupervised manner, and then use this tissue distribution to perform the classification. We achieve this with a classifier based on local descriptors and probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature. We studied the influence of different descriptors, such as texture and SIFT features, at the classification stage, showing that textons outperform SIFT in all cases. Moreover, we demonstrate that pLSA automatically extracts meaningful latent aspects, generating a compact tissue representation based on densities that is useful for discriminating mammograms during classification. We report tissue classification results on the MIAS and DDSM datasets and compare our method with approaches that classified the same datasets, showing the better performance of our proposal.
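The pLSA component can be sketched with its standard EM updates on a toy image-by-texton count matrix. The data and dimensions are hypothetical, and this is generic pLSA, not the paper's full pipeline.

```python
import numpy as np

def plsa(counts, n_topics, iters=200, seed=0):
    """EM for pLSA on a documents-x-words count matrix: learns P(z|d) and
    P(w|z). In the mammogram setting, 'documents' are images and 'words'
    are quantized local descriptors (textons)."""
    rng = np.random.default_rng(seed)
    D, W = counts.shape
    p_z_d = rng.random((D, n_topics)); p_z_d /= p_z_d.sum(1, keepdims=True)
    p_w_z = rng.random((n_topics, W)); p_w_z /= p_w_z.sum(1, keepdims=True)
    for _ in range(iters):
        # E-step: responsibilities P(z|d,w), shape (D, W, Z)
        joint = p_z_d[:, None, :] * p_w_z.T[None, :, :]
        resp = joint / (joint.sum(2, keepdims=True) + 1e-12)
        # M-step: re-estimate both distributions from weighted counts
        weighted = counts[:, :, None] * resp
        p_z_d = weighted.sum(1); p_z_d /= p_z_d.sum(1, keepdims=True) + 1e-12
        p_w_z = weighted.sum(0).T; p_w_z /= p_w_z.sum(1, keepdims=True) + 1e-12
    return p_z_d, p_w_z

# Toy texton histograms: the first two images use textons 0-2, the last two 3-5
X = np.array([[8, 5, 7, 0, 0, 0],
              [6, 9, 4, 0, 0, 0],
              [0, 0, 0, 7, 6, 8],
              [0, 0, 0, 5, 9, 6]], dtype=float)
p_z_d, p_w_z = plsa(X, n_topics=2)
print(np.round(p_z_d, 2))   # each image concentrates on one latent aspect
```

The per-image topic mixture P(z|d) is exactly the compact representation that a downstream classifier consumes in place of the raw histogram.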