887 results for "probability of occurrence"


Relevance: 100.00%

Abstract:

The enzymatically catalyzed template-directed extension of the ssDNA/primer complex is an important reaction of extraordinary complexity. The DNA polymerase does not merely facilitate the insertion of dNMP; it also performs rapid screening of substrates to ensure a high degree of fidelity. Several kinetic studies have determined rate constants and equilibrium constants for the elementary steps that make up the overall pathway. The information is used to develop a macroscopic kinetic model, using an approach described by Ninio [Ninio J., 1987. Alternative to the steady-state method: derivation of reaction rates from first-passage times and pathway probabilities. Proc. Natl. Acad. Sci. U.S.A. 84, 663–667]. The principal idea of the Ninio approach is to track a single template/primer complex over time and to identify the expected behavior. The average time to insert a single nucleotide is a weighted sum of several terms, including the actual time to insert a nucleotide plus delays due to polymerase detachment from either the ternary (template-primer-polymerase) or quaternary (+nucleotide) complexes, and time delays associated with the identification and ultimate rejection of an incorrect nucleotide from the binding site. The passage times of all events and their probability of occurrence are expressed in terms of the rate constants of the elementary steps of the reaction pathway. The model accounts for variations in the average insertion time with different nucleotides as well as the influence of the G+C content of the sequence in the vicinity of the insertion site. Furthermore, the model provides estimates of error frequencies. If nucleotide extension is recognized as a competition between successful insertions and time-delaying events, it can be described as a binomial process with a probability distribution. The distribution gives the probability of extending a primer/template complex by a certain number of base pairs and, in general, it maps annealed complexes into extension products.
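A minimal sketch of the bookkeeping described above, assuming hypothetical probabilities and passage times: the expected single-nucleotide insertion time is a probability-weighted sum over the competing events, and extension over many template sites is then treated as a binomial process. None of the numbers below are measured rate constants.

    from math import comb

    def expected_insertion_time(events):
        """events: list of (probability, passage_time) pairs for the competing
        outcomes (successful insertion, polymerase detachment, rejection of an
        incorrect nucleotide, ...). Probabilities must sum to 1."""
        return sum(p * t for p, t in events)

    def extension_distribution(n_sites, p_success):
        """Binomial probability of extending the primer by k base pairs
        out of n_sites insertion opportunities."""
        return [comb(n_sites, k) * p_success**k * (1 - p_success)**(n_sites - k)
                for k in range(n_sites + 1)]

    # Illustrative placeholder values only:
    t_avg = expected_insertion_time([(0.90, 0.05), (0.07, 0.50), (0.03, 2.0)])
    probs = extension_distribution(10, 0.90)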

Relevance: 100.00%

Abstract:

We developed a stochastic lattice model to describe vector-borne diseases (such as yellow fever or dengue). The model is spatially structured and its dynamical rules take into account the diffusion of vectors. We consider a bipartite lattice, formed by one sub-lattice occupied by humans and another occupied by mosquitoes. With each lattice site we associate a stochastic variable that describes the occupation and the health state of a single individual (mosquito or human). Disease transmission in the human population follows dynamics similar to the Susceptible-Infected-Recovered (SIR) model, while disease transmission in the mosquito population follows dynamics analogous to the Susceptible-Infected-Susceptible (SIS) model, with mosquito diffusion. The occurrence of an epidemic is directly related to the conditional probability of occurrence of infected mosquitoes (humans) in the presence of susceptible humans (mosquitoes) in the neighborhood. The diffusion of mosquitoes can facilitate the formation of Susceptible-Infected pairs, enabling an increase in the size of the epidemic. Using asynchronous dynamical updates, we study disease transmission in a population initially formed by susceptible individuals after the introduction of a single infected mosquito (human). We find that this model exhibits a continuous phase transition related to the existence or non-existence of an epidemic. By means of mean-field approximations and Monte Carlo simulations we investigate the epidemic threshold and the phase diagram in terms of the diffusion probability and the infection probability.
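A minimal sketch of one asynchronous Monte Carlo update of the model described above, simplified to a one-dimensional bipartite lattice for brevity (the paper's lattice is more general); all probabilities are illustrative placeholders.

    import random

    S, I, R = 0, 1, 2
    N = 100
    humans = [S] * N
    mosquitoes = [S] * N
    p_inf, p_rec, p_diff = 0.3, 0.1, 0.5  # placeholder probabilities

    def step():
        i = random.randrange(N)
        if random.random() < 0.5:                       # update a human site (SIR)
            if humans[i] == S and mosquitoes[i] == I and random.random() < p_inf:
                humans[i] = I                           # infected by a local mosquito
            elif humans[i] == I and random.random() < p_rec:
                humans[i] = R                           # recovery is permanent
        else:                                           # update a mosquito site (SIS)
            if random.random() < p_diff:                # diffusion: swap with a neighbour
                j = (i + random.choice((-1, 1))) % N
                mosquitoes[i], mosquitoes[j] = mosquitoes[j], mosquitoes[i]
            elif mosquitoes[i] == S and humans[i] == I and random.random() < p_inf:
                mosquitoes[i] = I
            elif mosquitoes[i] == I and random.random() < p_rec:
                mosquitoes[i] = S                       # SIS: no immune state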

Relevance: 100.00%

Abstract:

We combine phytoplankton occurrence data for 119 species from the continuous plankton recorder with climatological environmental variables in the North Atlantic to obtain ecological response functions of each species using the MaxEnt statistical method. These response functions describe how the probability of occurrence of each species changes as a function of environmental conditions and can be reduced to a simple description of phytoplankton realized niches using the mean and standard deviation of each environmental variable, weighted by its response function. Although there was substantial variation in the realized niche among species within groups, the envelope of the realized niches of North Atlantic diatoms and dinoflagellates are mostly separate in niche space.
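A minimal sketch of the niche reduction described above: the realized niche along one environmental axis is summarized by the mean and standard deviation of that variable weighted by the species' response function. The gradient and the Gaussian-shaped response below are placeholders for a fitted MaxEnt output.

    import numpy as np

    temperature = np.linspace(0, 25, 100)                      # environmental gradient
    response = np.exp(-0.5 * ((temperature - 12) / 3) ** 2)    # hypothetical response function

    w = response / response.sum()                              # normalize to weights
    niche_mean = np.sum(w * temperature)
    niche_sd = np.sqrt(np.sum(w * (temperature - niche_mean) ** 2))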

Relevance: 100.00%

Abstract:

During the last 40 years, decision support for species selection in vegetation restoration in Spain has been based mainly on species distribution models (also called ecological niche models), which estimate the probability of occurrence of a species as a function of environmental conditions (climate, soil, etc.). This thesis proposes methodological improvements that contribute to better predictive performance of such models, given the data currently available in Spain, with a focus on applying the models to species selection for restoration. Species occurrence data are not always available at a spatial resolution appropriate to the scale of restoration projects, but coarse-grained data exist for almost every plant species in Spain. A recalibration method is proposed that updates a coarse-grained logistic regression model with a new fine-grained sample. The method yields acceptable predictive performance with relatively small updating samples (25 occurrences of the species), in contrast with the much larger samples (more than 100 occurrences) required by a conventional modelling strategy that discards the previous model. The choice of statistical method can decisively affect predictive performance, and comparisons of methods have therefore received much attention over the last decade. Previous studies regarded logistic regression as inferior to more modern techniques such as maximum entropy. The results of this thesis show that the observed difference arises because maximum entropy models include regularization techniques while the versions of logistic regression used in those comparisons do not. Once regularization is added to logistic regression through penalization, the differences in predictive performance disappear. Penalized logistic regression is therefore a further alternative for fitting species distribution models, on a par with the best-performing modern methods such as maximum entropy.
Species distribution models often omit soil variables because direct measurements of soil physical or chemical properties are rarely available. Incorporating coarse-grained data from national or continental soil maps could be an alternative; the results of this thesis suggest that fine-grained species distribution models improve their predictive performance slightly but statistically significantly when such soil variables are included. Validation is a key stage in the development of any empirical model, including species distribution models. Models are usually validated species by species, i.e., by comparing observed presence or absence with the model's predictions across a set of sites. This kind of evaluation does not answer a key question in vegetation restoration: which n species are the most suitable for the site to be restored? An evaluation method adapted to this question is proposed, which estimates the ability of a set of models to discriminate between the species present at and absent from a specific site. The method was successfully applied to validate 188 distribution models of woody species oriented to species selection for vegetation restoration in Spain. The proposed methodological improvements enhance the predictive performance of species distribution models applied to species selection in vegetation restoration and broaden the number of species for which a model can support decision making.
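The thesis's central comparison point, penalized (regularized) logistic regression as a peer of maximum entropy methods, can be sketched as below; the data, predictors and penalty strength are illustrative placeholders, not the thesis's actual setup.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))     # stand-ins for climate/soil predictors
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)  # presence/absence

    # The L1 penalty plays the role of the regularization that, per the thesis,
    # closes the performance gap with maximum entropy methods.
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    model.fit(X, y)
    p_occurrence = model.predict_proba(X)[:, 1]   # predicted probability of presence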

Relevance: 100.00%

Abstract:

A multivariate analysis of flood variables is needed to design some hydraulic structures, such as dams, since the complexity of the routing process in a reservoir requires a representation of the full hydrograph. In this work, a bivariate copula model was used to obtain the joint distribution of flood peak and volume, in order to estimate the probability of occurrence of a given inflow hydrograph. However, the risk of dam overtopping is given by the maximum water elevation reached during the routing process, which depends on the hydrograph variables, the reservoir volume and the spillway crest length. Consequently, an additional bivariate return period, the so-called routed return period, was defined in terms of the risk of dam overtopping based on the maximum water elevation obtained after routing the inflow hydrographs. The theoretical return periods, which give the probability of occurrence of a hydrograph prior to accounting for the reservoir routing, were compared with the routed return period, since in both cases hydrographs with the same probability draw a curve in the peak-volume space. The procedure was applied to the case study of the Santillana reservoir in Spain. Different reservoir volumes and spillway lengths were considered to investigate the influence of the dam and reservoir characteristics on the results. The methodology improves the estimation of the Design Flood Hydrograph and can be applied to assess the risk of dam overtopping.
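A minimal sketch of the joint-probability step described above, assuming a Gumbel copula purely for illustration (the paper's copula family is not restated here); u and v are the marginal non-exceedance probabilities of peak and volume, and mu is the mean interarrival time of flood events in years.

    from math import exp, log

    def gumbel_copula(u, v, theta):
        """Gumbel copula C(u, v) with dependence parameter theta >= 1."""
        return exp(-(((-log(u)) ** theta + (-log(v)) ** theta) ** (1 / theta)))

    def or_return_period(u, v, theta, mu=1.0):
        """'OR' joint return period: peak or volume (or both) exceeded."""
        return mu / (1 - gumbel_copula(u, v, theta))

    T = or_return_period(u=0.99, v=0.98, theta=2.5)   # illustrative values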

Relevance: 100.00%

Abstract:

Accurate design flood estimates associated with high return periods are necessary to design and manage hydraulic structures such as dams. In practice, such quantiles are usually estimated via univariate flood frequency analyses, mostly based on the study of peak flows. Nevertheless, floods are multivariate by nature, and representative flood characteristics, such as flood peak, hydrograph volume and hydrograph duration, must be considered to carry out an appropriate analysis; this is especially true when the inflow hydrograph is transformed into a different outflow hydrograph during the routing process in a reservoir or floodplain. Multivariate flood frequency analyses have traditionally been performed with standard bivariate distributions to model correlated variables, yet these entail shortcomings such as requiring the same kind of marginal distribution for all variables and assuming a linear dependence relation between them. Recently, the use of copulas has spread in hydrology because of their advantages in the multivariate context, as they overcome the drawbacks of the traditional approach. A copula is a function that represents the dependence structure of the studied variables and allows their multivariate frequency distribution to be obtained from the marginal distributions, regardless of the kind of marginal distributions considered. The way copulas are formulated also facilitates the estimation of multivariate return periods and, therefore, of multivariate quantiles. This doctoral thesis seeks to provide methodologies that improve the traditional techniques used by practitioners, in order to estimate more appropriate flood quantiles for dam design, dam management and flood risk assessment, through bivariate flood frequency analyses based on copulas. The flood variables considered are peak flow and hydrograph volume. To accomplish a complete study, the research addresses: (i) a bivariate local flood frequency analysis focused on examining and comparing theoretical return periods, based on the natural probability of occurrence of a flood, with the return period associated with the risk of overtopping the dam under analysis, in order to estimate quantiles at a given gauged site; (ii) the extension of the local approach to the regional scale, supplying a complete procedure for a bivariate regional flood frequency analysis that either estimates quantiles at ungauged sites or improves at-site estimates at gauged sites; (iii) the use of copulas to investigate bivariate flood trends due to increasing urbanisation levels in a catchment; and (iv) the extension of observed flood series by combining the benefits of a copula-based model and a hydro-meteorological model.
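For reference, the standard "OR" and "AND" bivariate return periods discussed above can be written in terms of the copula C and the marginals F_Q, F_V of peak and volume, with mu the mean interarrival time of flood events (a textbook formulation, not quoted from the thesis):

    T_{\mathrm{OR}} = \frac{\mu}{1 - C\left(F_Q(q),\, F_V(v)\right)},
    \qquad
    T_{\mathrm{AND}} = \frac{\mu}{1 - F_Q(q) - F_V(v) + C\left(F_Q(q),\, F_V(v)\right)}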

Relevance: 100.00%

Abstract:

There is growing concern about natural catastrophes to come, and studies are being carried out in almost every branch of science. The main, though not the only, motive is the fear that future events may jeopardize human activities. As a consequence, there is considerable dispersion even in the most elementary concepts, such as what should be considered or how a given element should be named and catalogued; accordingly, the methods for understanding natural risks also differ greatly, and genuinely multidisciplinary approaches are rarely followed. Some efforts have been made to create a common framework of understanding, the "Floods Directive" or, more recently, the Inspire Directive being two examples. Insurance and reinsurance companies are an important actor among the many involved in risk studies. Their interest lies in the fact that they end up paying most of the bill, if not all of it. But how large that bill can be is not an easy question to answer even in very specific cases, and yet it is the question constantly posed by decision makers at all levels. This document summarizes research activities carried out to lay down a frame of reference, implementing numerical approaches capable of coping with some of the most relevant issues found in almost all natural risk studies and testing concepts pragmatically. An experimental site was selected according to several criteria, such as population density, the ease of providing clear geographical boundaries, the presence of three of the most important geological processes (floods, earthquakes and volcanism), and data availability. The model proposed here exploits very diverse data sources for assessing natural hazards, highlighting the need for a multidisciplinary approach, and uses a single, unified, independent (not objective-driven), consistent and homogeneous data catalogue for estimating property value. The data are exploited differently for each hazard type, while the underlying concepts remain unchanged. During this research, a wide gap was found between actual losses and hazard probabilities, contrary to what has been thought to be the most likely behaviour of natural hazards, showing that risk studies have a very limited lifespan. Partly because of this finding, the model proposed in this study works with scenarios of fixed probability of occurrence, as opposed to the classical approach of evaluating continuous hazard functions. Another reason for addressing the question through scenarios is to force the model to provide credible figures of maximum damage once parameters such as the spatial location of an event and its probability are fixed, offering a new view of the "worst-case scenario" of known probability.
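A detail implicit in the fixed-probability scenario approach described above: for an event with return period T years, the probability of at least one occurrence during an exposure window of n years follows from the standard relation (a general formula, not specific to this study),

    p = 1 - \left(1 - \frac{1}{T}\right)^{n}

so, for example, a 100-year scenario has roughly a 22% chance of occurring at least once over 25 years.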

Relevance: 100.00%

Abstract:

Engineering is the science that turns the knowledge of the basic disciplines into real-world applications. Our world is surrounded by these engineering achievements, and people need to feel comfortable and safe in them. Safety therefore becomes an important factor that must be considered in any design. In naval engineering, an appropriate level of safety and, consequently, a correct structural design are currently based on deterministic studies aimed at obtaining structures able to withstand the worst possible loading scenario over a given period of time. Most of the loads on a ship's structure come from the action of nature (winds, waves, currents and storms) or from human error (internal explosions, external explosions and collisions). Given the randomness of these events, the structural reliability of a ship should be treated as a stochastic problem under well-characterized environmental conditions. The probabilistic methodology, based on statistics and uncertainty, offers a better perspective on the real phenomena acting on ship structures. This research presents structural reliability results for the design and maintenance planning of the bottom plating of ship hulls, which is subjected to variable loads from wave action and to corrosion. Statistical models were studied for the assessment of the hull girder and of the bottom-plate structural detail. For the hull-girder assessment, the model determines the probabilities of occurrence of the loads on the structure, accounting for corrosion deterioration, based on a statistical investigation of the variation of wave-induced loads and on a standard corrosion rate recommended by DET NORSKE VERITAS (DNV). The time-dependent reliability assessment is based on resistance and load (R-S) curves determined with the Monte Carlo method. A long-term statistical variation of the loading conditions is determined from a long-term wave statistics study and fitted to a distribution based on a known design life. The work presents results for the variation of the reliability of an oil tanker over time. The case study was simplified to ease data acquisition, with the goal of corroborating the proposed methodology.
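A minimal sketch of the time-dependent R-S reliability computation described above: the failure probability P_f(t) = P(R(t) < S) is estimated by Monte Carlo, with plate resistance degraded by a constant corrosion rate. The distributions and numbers are illustrative placeholders, not DNV values.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    corrosion_rate = 0.02   # fraction of resistance lost per year (placeholder)

    def failure_probability(years):
        R0 = rng.normal(loc=100.0, scale=10.0, size=n)   # initial resistance
        S = rng.gumbel(loc=60.0, scale=8.0, size=n)      # wave-induced load effect
        R = R0 * (1 - corrosion_rate * years)            # corroded resistance
        return np.mean(R < S)                            # Monte Carlo estimate of P_f

    for t in (0, 10, 20):
        print(t, failure_probability(t))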

Relevance: 100.00%

Abstract:

The use of presence/absence data in wildlife management and biological surveys is widespread. There is a growing interest in quantifying the sources of error associated with these data. We show, using simulated data, that false-negative errors (failure to record a species when in fact it is present) can have a significant impact on statistical estimation of habitat models. We then introduce an extension of logistic modeling, the zero-inflated binomial (ZIB) model, that permits the estimation of the rate of false-negative errors and the correction of estimates of the probability of occurrence for false-negative errors by using repeated visits to the same site. Our simulations show that even relatively low rates of false negatives bias statistical estimates of habitat effects. The method with three repeated visits eliminates the bias, but estimates are relatively imprecise. Six repeated visits improve the precision of estimates to levels comparable to those achieved with conventional statistics in the absence of false-negative errors. In general, when error rates are less than or equal to 50%, greater efficiency is gained by adding more sites, whereas when error rates are >50% it is better to increase the number of repeated visits. We highlight the flexibility of the method with three case studies, clearly demonstrating the effect of false-negative errors for a range of commonly used survey methods.
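A minimal sketch of the site-level likelihood behind the ZIB model described above, with psi the probability of occurrence, p the per-visit detection probability and k repeated visits; the parameter names are generic rather than the paper's notation.

    from math import comb

    def zib_likelihood(y, k, psi, p):
        """Likelihood contribution of one site with y detections in k visits."""
        if y == 0:
            # Either truly absent, or present but missed on every visit
            # (the false-negative pathway the model corrects for).
            return (1 - psi) + psi * (1 - p) ** k
        return psi * comb(k, y) * p ** y * (1 - p) ** (k - y)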

Relevance: 100.00%

Abstract:

Effective clinical decision making depends upon identifying possible outcomes for a patient, selecting relevant cues, and processing the cues to arrive at accurate judgements of each outcome's probability of occurrence. These activities can be considered as classification tasks. This paper describes a new model of psychological classification that explains how people use cues to determine class or outcome likelihoods. It proposes that clinicians respond to conditional probabilities of outcomes given cues and that these probabilities compete with each other for influence on classification. The model explains why people appear to respond to base rates inappropriately, thereby overestimating the occurrence of rare categories, and a clinical example is provided for predicting suicide risk. The model makes an effective representation for expert clinical judgements and its psychological validity enables it to generate explanations in a form that is comprehensible to clinicians. It is a strong candidate for incorporation within a decision support system for mental-health risk assessment, where it can link with statistical and pattern recognition tools applied to a database of patients. The symbiotic combination of empirical evidence and clinical expertise can provide an important web-based resource for risk assessment, including multi-disciplinary education and training.
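As a quick illustration of the base-rate point above (this is plain Bayes' rule, not the paper's classification model): a cue can be strongly associated with an outcome, yet the outcome remains unlikely once its base rate is folded in, which is why responding to conditional probabilities of outcomes given cues alone overestimates rare categories.

    def posterior(prior, p_cue_given_outcome, p_cue_given_other):
        """Bayes' rule for P(outcome | cue observed)."""
        evidence = prior * p_cue_given_outcome + (1 - prior) * p_cue_given_other
        return prior * p_cue_given_outcome / evidence

    # Hypothetical rare outcome (1% base rate) with a sensitive but
    # unspecific cue: the posterior stays far below the 0.9 hit rate.
    print(posterior(0.01, 0.9, 0.2))   # ~0.04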

Relevance: 100.00%

Abstract:

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as the military, fishing, transportation and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before environmental impacts are assessed. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks present the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species with weights for OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify the months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but it is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e., ports and entrance locations to study areas. Varying a multiplier on the cost surface enables the calculation of multiple routes with different costs to cetacean conservation versus cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
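A minimal sketch of the least-cost routing step described above on a generic grid graph; the cost surface, unit step cost, multiplier values and use of networkx are all illustrative assumptions, not the dissertation's actual tooling.

    import networkx as nx
    import numpy as np

    cost = np.random.default_rng(2).random((50, 50))   # stand-in conservation cost surface

    def least_cost_route(cost, multiplier, start=(0, 0), end=(49, 49)):
        G = nx.grid_2d_graph(*cost.shape)
        for u, v in G.edges:
            # distance term (1 per step) plus weighted conservation term
            G.edges[u, v]["w"] = 1.0 + multiplier * (cost[u] + cost[v]) / 2
        return nx.shortest_path(G, start, end, weight="w")

    # Sweeping the multiplier traces out the conservation-vs-distance tradeoff.
    routes = {m: least_cost_route(cost, m) for m in (0.0, 1.0, 10.0)}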

Essential inputs to these decision frameworks are the species distributions. The two preceding chapters comprise species distribution models for the two case study areas, the U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the necessary parameters, especially the distance and angle of observation, are less readily available across publicly mined datasets.

To predict cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to the continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing easy navigation of models by taxon, region, season, and data provider.
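A sketch of the thresholding step described above: choosing the probability cut-off that balances false-positive and false-negative rates, here operationalized as Youden's J on the ROC curve (one common way to implement the stated criterion); the labels and scores below are synthetic stand-ins for a fitted model's output.

    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(3)
    y_true = rng.integers(0, 2, 1000)                          # observed presence/absence
    y_score = np.clip(y_true * 0.3 + rng.random(1000) * 0.7, 0, 1)  # predicted P(occurrence)

    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    best = thresholds[np.argmax(tpr - fpr)]    # maximizes TPR - FPR (Youden's J)
    presence_map = y_score >= best             # binary presence/absence map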

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven to be useful in cases where there are fewer observations available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, novel reporting for spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise issues associated with increases of container ship and oil tanker traffic in British Columbia’s continental shelf waters.
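For reference, a minimal sketch of the Conventional Distance Sampling (CDS) stratum estimate mentioned above, assuming a half-normal detection function with a known scale parameter; in practice sigma is fitted to the observed perpendicular distances, and the numbers here are placeholders.

    from math import pi, sqrt

    def cds_density(n_detections, total_line_km, sigma_km):
        """Stratum density (animals per km^2) from line-transect counts,
        assuming a half-normal detection function g(x) = exp(-x^2 / (2 sigma^2))."""
        esw = sigma_km * sqrt(pi / 2)                     # effective strip half-width
        return n_detections / (2 * esw * total_line_km)

    D = cds_density(n_detections=42, total_line_km=500.0, sigma_km=0.8)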

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance that can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework with interchangeable map and tradeoff plot views. These products make complex processes transparent, allowing conservation, industry and other stakeholders to game scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management and dynamic ocean management.

Relevance: 100.00%

Abstract:

This article results from the research project "Design of a model to improve cost-estimation processes for software development companies". A review of the international literature is presented to identify trends and methods for producing more accurate software cost estimates. Using the Delphi predictive method, a panel of experts from the software sector of Barranquilla classified and rated five realistic estimation scenarios according to their probability of occurrence. A completely randomized experiment was designed, whose results pointed to two qualitatively similar scenarios, from which an analysis model was built around three agents: methodology, team capability and technological products, each with three compliance levels for achieving more precise estimates.

Relevance: 100.00%

Abstract:

Doctoral thesis, Universidade de Brasília, Universidade Federal da Paraíba, Universidade Federal do Rio Grande do Norte, Multi-Institutional and Inter-Regional Graduate Program in Accounting Sciences (Programa Multi-Institucional e Inter-Regional de Pós-Graduação em Ciências Contábeis), 2016.

Relevance: 100.00%

Abstract:

Glossy buckthorn (Rhamnus frangula L.) is an exotic species invading several regions of southern Quebec, particularly the Estrie administrative region. Little is currently known about the species' ecology in the Quebec context, and no overall picture of its distribution in the temperate forests of this region exists. In this context, the first objective of the project was to map the distribution of glossy buckthorn in two sectors of Estrie by remote sensing. A second objective was to identify the environmental variables that best explain glossy buckthorn cover. The phenology of glossy buckthorn differs from that of most native tree species, since its leaves fall later in autumn. This characteristic made it possible to map the probability of occurrence of glossy buckthorn by spectral unmixing of a time series of Landsat 8 OLI images. Glossy buckthorn cover was measured in 119 field plots. The resulting map showed 69% agreement with the field data. A SPOT-7 image with finer spatial resolution was then used but did not improve the mapping, since the acquisition date of the image was not optimal owing to limited availability. Regarding the second objective, the most significant variable explaining the presence of glossy buckthorn was stand density, which suggests that opening of the forest cover could favour invasion. Nevertheless, the results tend to show that glossy buckthorn is a "generalist" species that adapts well to a wide range of environmental conditions.
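A minimal sketch of the linear spectral unmixing step described above: each pixel's spectrum is decomposed into fractional abundances of known endmember spectra by non-negative least squares, with the buckthorn fraction serving as a proxy for its probability of occurrence. The endmember and pixel reflectances below are placeholders, not values from the study.

    import numpy as np
    from scipy.optimize import nnls

    # Columns: buckthorn, native canopy, soil; one row per spectral band.
    endmembers = np.array([
        [0.10, 0.05, 0.20],
        [0.45, 0.30, 0.25],
        [0.50, 0.20, 0.30],
        [0.30, 0.40, 0.35],
    ])

    pixel = np.array([0.20, 0.35, 0.33, 0.34])     # observed reflectances
    fractions, residual = nnls(endmembers, pixel)  # non-negative abundances
    buckthorn_fraction = fractions[0]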