935 results for structured-pragmatic-situational (SPS) approach
Abstract:
Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems: the Glanville fritillary butterfly in the Åland Islands and a system of two interacting aphid species in the Tvärminne archipelago, both located in south-western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is on the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. I parameterize the evolutionary model using a pattern-oriented approach and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
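As a generic point of reference for the stochastic patch-occupancy view of metapopulations summarized above, the sketch below simulates extinction-colonization dynamics over a patch network. It is an illustrative sketch only, not any of the thesis's fitted models: the patch network, dispersal kernel, and all rates are assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative patch network: areas and coordinates are random placeholders.
n_patches = 50
areas = rng.lognormal(mean=0.0, sigma=0.5, size=n_patches)
coords = rng.uniform(0.0, 10.0, size=(n_patches, 2))
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

alpha = 1.0      # inverse of mean dispersal distance (assumed)
ext_rate = 0.2   # baseline extinction rate, scaled by 1/area (assumed)
col_scale = 0.5  # colonization scaling constant (assumed)

occupied = rng.random(n_patches) < 0.5   # initial occupancy pattern

for year in range(100):
    # Connectivity: distance-weighted contribution of occupied patches (self excluded).
    kernel = np.exp(-alpha * dists) * areas[None, :]
    np.fill_diagonal(kernel, 0.0)
    connectivity = (kernel * occupied[None, :]).sum(axis=1)

    p_col = 1.0 - np.exp(-col_scale * connectivity)   # colonization probability of empty patches
    p_ext = np.minimum(1.0, ext_rate / areas)         # extinction probability of occupied patches

    u = rng.random(n_patches)
    occupied = np.where(occupied, u > p_ext, u < p_col)

print(f"fraction of patches occupied after 100 years: {occupied.mean():.2f}")
```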
Abstract:
This thesis uses semi-structured interviews and documentary analysis to explore the impact of carbon sequestration rights on rural land in Queensland and to determine whether current rural valuation knowledge and practice are equipped to deal with these rights. The carbon right in Queensland is complex and subject to significant individual variation. The nature and form of this right will determine whether it has a positive or negative impact on Queensland rural land. Significant gaps in the knowledge of industry stakeholders, including rural valuers, concerning carbon rights were found, and recommendations for valuation practice were made.
Abstract:
This paper addresses the problem of discovering business process models from event logs. Existing approaches to this problem strike various tradeoffs between accuracy and understandability of the discovered models. With respect to the second criterion, empirical studies have shown that block-structured process models are generally more understandable and less error-prone than unstructured ones. Accordingly, several automated process discovery methods generate block-structured models by construction. These approaches however intertwine the concern of producing accurate models with that of ensuring their structuredness, sometimes sacrificing the former to ensure the latter. In this paper we propose an alternative approach that separates these two concerns. Instead of directly discovering a structured process model, we first apply a well-known heuristic technique that discovers more accurate but sometimes unstructured (and even unsound) process models, and then transform the resulting model into a structured one. An experimental evaluation shows that our “discover and structure” approach outperforms traditional “discover structured” approaches with respect to a range of accuracy and complexity measures.
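To make the notion of block-structuredness concrete, a block-structured process model can be viewed as a process tree built from single-entry, single-exit operators. The sketch below is only illustrative (the activity names and the nested-tuple encoding are assumptions, not the paper's representation):

```python
# A process tree encoded as nested tuples: (operator, child, child, ...).
# Operators: 'seq' (sequence), 'xor' (exclusive choice), 'and' (parallel), 'loop'.
tree = ('seq',
        'register request',
        ('xor',
         ('and', 'check ticket', 'examine claim'),
         'skip checks'),
        ('loop', 'request more info', 'provide info'),
        'decide')

def activities(node):
    """Collect the leaf activities of a process tree."""
    if isinstance(node, str):
        return [node]
    op, *children = node
    return [a for child in children for a in activities(child)]

print(activities(tree))
```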
Abstract:
A new approach for unwrapping phase maps, obtained during the measurement of 3-D surfaces using the sinusoidal structured light projection technique, is proposed. "Takeda's method" is used to obtain the wrapped phase map. The proposed method of unwrapping makes use of an additional image of the object captured under the illumination of a specifically designed color-coded pattern. The new approach demonstrates, for the first time, a method of producing reliable unwrapping of objects even with surface discontinuities from a single phase map. It is shown to be significantly faster and more reliable than the temporal phase unwrapping procedure that uses a complete exponential sequence. For example, if a measurement with the accuracy obtained by interrogating the object with S fringes in the projected pattern is carried out with both methods, the new method requires only 2 frames, as compared to the (log₂S + 1) frames required by the latter method.
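For a concrete comparison of acquisition cost using the frame counts stated above, take S = 32 projected fringes:

```latex
\text{temporal method (complete exponential sequence): } \log_2 S + 1 = \log_2 32 + 1 = 6 \text{ frames},
\qquad
\text{proposed method: } 2 \text{ frames}.
```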
Abstract:
For structured-light scanners, the projective geometry between a projector-camera pair is identical to that of a camera-camera pair. Consequently, in conjunction with calibration, a variety of geometric relations are available for three-dimensional Euclidean reconstruction. In this paper, we use projector-camera epipolar properties and the projective invariance of the cross-ratio to solve for 3D geometry. A key contribution of our approach is the use of homographies induced by reference planes, along with a calibrated camera, resulting in a simple parametric representation for projector and system calibration. Compared to existing solutions that require an elaborate calibration process, our method is simple while ensuring geometric consistency. Our formulation using the invariance of the cross-ratio is also extensible to multiple estimates of 3D geometry that can be analysed in a statistical sense. The performance of our system is demonstrated on some cultural artifacts and geometric surfaces.
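The projective invariant underlying this formulation is the cross-ratio of four collinear points, which is preserved by any homography (this is the standard definition, not a result specific to the paper):

```latex
(A, B;\, C, D) \;=\; \frac{\overline{AC}\cdot\overline{BD}}{\overline{BC}\cdot\overline{AD}},
\qquad
(HA, HB;\, HC, HD) \;=\; (A, B;\, C, D) \quad \text{for any homography } H.
```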
Abstract:
The method of structured programming or program development using a top-down, stepwise refinement technique provides a systematic approach for the development of programs of considerable complexity. The aim of this paper is to present the philosophy of structured programming through a case study of a nonnumeric programming task. The problem of converting a well-formed formula in first-order logic into prenex normal form is considered. The program has been coded in the programming language PASCAL and implemented on a DEC-10 system. The program has about 500 lines of code and comprises 11 procedures.
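As an illustration of the transformation the program performs (a standard textbook example, not one taken from the paper), a formula is brought into prenex normal form by moving quantifiers outward, renaming bound variables if they would clash:

```latex
(\forall x\, P(x)) \rightarrow (\exists y\, Q(y))
\;\equiv\; \exists x\,\bigl(P(x) \rightarrow \exists y\, Q(y)\bigr)
\;\equiv\; \exists x\,\exists y\,\bigl(P(x) \rightarrow Q(y)\bigr).
```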
Abstract:
A nonlinear adaptive approach is presented to achieve rest-to-rest attitude maneuvers for spacecraft in the presence of parameter uncertainties and unknown disturbances. A nonlinear controller, designed on the principle of dynamic inversion, achieves the goals for the nominal model but suffers performance degradation in the presence of off-nominal parameter values and unwanted inputs. To address this issue, a model-following neuro-adaptive control design is carried out with the help of neural networks. Due to the structured approach followed here, the adaptation is restricted to the momentum-level equations. The adaptive technique presented is computationally nonintensive and hence can be implemented in real time. Because of these features, this new approach is named the structured model-following adaptive real-time technique (SMART). From simulation studies, this SMART approach is found to be very effective in achieving precision attitude maneuvers in the presence of parameter uncertainties and unknown disturbances.
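The dynamic-inversion principle mentioned above can be summarized generically (the textbook control-affine form, not the paper's specific attitude-dynamics equations): for

```latex
\dot{x} = f(x) + g(x)\,u,
\qquad
u = g(x)^{-1}\bigl(v - f(x)\bigr)
\;\;\Rightarrow\;\;
\dot{x} = v,
```

so the closed-loop behaviour is assigned directly through the pseudo-control v; the neuro-adaptive element then compensates for the mismatch between the nominal f, g and the true dynamics.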
Abstract:
The problem of identifying user intent has received considerable attention in recent years, particularly in the context of improving the search experience via query contextualization. Intent can be characterized by multiple dimensions, which are often not observed from query words alone. Accurate identification of intent from query words remains a challenging problem, primarily because it is extremely difficult to discover these dimensions. The problem is often significantly compounded by the lack of representative training samples. We present a generic, extensible framework for learning the multi-dimensional representation of user intent from the query words. The approach models the latent relationships between facets using a tree-structured distribution, which leads to an efficient and convergent algorithm, FastQ, for identifying the multi-faceted intent of users based on just the query words. We also incorporate WordNet to extend the system's capabilities to queries that contain words not appearing in the training data. Empirical results show that FastQ yields accurate identification of intent when compared to a gold standard.
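The tree-structured distribution over intent facets amounts to factorizing the joint over facets along a tree, so inference scales with the number of tree edges rather than with all facet combinations. Written generically (the notation below is an assumption; the paper's exact parameterization may differ):

```latex
P(f_1, \dots, f_K \mid q) \;=\; P\bigl(f_r \mid q\bigr)\,\prod_{i \neq r} P\bigl(f_i \mid f_{\mathrm{pa}(i)}, q\bigr),
```

where f_r is the root facet and pa(i) denotes the parent of facet i in the tree.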
Abstract:
We revisit a problem studied by Padakandla and Sundaresan [SIAM J. Optim., August 2009] on the minimization of a separable convex function subject to linear ascending constraints. The problem arises as the core optimization in several resource allocation problems in wireless communication settings. It is also a special case of an optimization of a separable convex function over the bases of a specially structured polymatroid. We give an alternative proof of the correctness of the algorithm of Padakandla and Sundaresan. In the process, we relax some of the restrictions they placed on the objective function.
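For reference, a common statement of a separable convex program with linear ascending constraints is the following (the notation is assumed here; see the cited paper for the exact formulation):

```latex
\min_{x \ge 0}\; \sum_{i=1}^{n} f_i(x_i)
\quad \text{s.t.} \quad
\sum_{j=1}^{k} x_j \;\ge\; \alpha_k,\;\; k = 1, \dots, n-1,
\qquad
\sum_{j=1}^{n} x_j \;=\; \alpha_n,
```

with each f_i convex and 0 ≤ α_1 ≤ · · · ≤ α_n.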
Abstract:
What are the scope and responsibilities of design? This work partially answers this question by employing a normative approach to the design of a biomass cook stove. The study debates the sufficiency of existing design methodologies in the light of the capability approach. A case study of the Astra Ole biomass cook stove elaborates the theoretical constructs of the capability approach, which, in turn, structure insights from the field to evaluate the product. A capability-approach-based methodology is also used prescriptively to design the mould for rapid dissemination of the Astra Ole.
Abstract:
International fisheries agencies recommend exploitation paths that satisfy two features. First, for precautionary reasons, exploitation paths should avoid high fishing mortality in those fisheries where the biomass is depleted to a degree that jeopardises the stock's capacity to produce the Maximum Sustainable Yield (MSY). Second, for economic and social reasons, captures should be as stable (smooth) as possible over time. In this article we show that a conflict between these two interests may occur when seeking optimal exploitation paths with an age-structured bioeconomic approach. Our results show that this conflict can be overcome by using non-constant discount factors that value future stocks according to their relative intertemporal scarcity.
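The role of non-constant discounting can be seen by comparing the two objective forms below (a generic illustration, not the paper's exact bioeconomic model):

```latex
\max \sum_{t=0}^{\infty} \beta^{t}\, \pi(h_t, x_t)
\qquad \text{versus} \qquad
\max \sum_{t=0}^{\infty} \Bigl(\prod_{s=1}^{t} \beta_s\Bigr)\, \pi(h_t, x_t),
```

where h_t is the harvest and x_t the stock; the time-varying factors β_s can be chosen to weight periods in which the stock is relatively scarce, reconciling smooth captures with precautionary fishing mortality.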
Abstract:
We describe an age-structured statistical catch-at-length analysis (A-SCALA) based on the MULTIFAN-CL model of Fournier et al. (1998). The analysis is applied independently to both the yellowfin and the bigeye tuna populations of the eastern Pacific Ocean (EPO). We model the populations from 1975 to 1999, based on quarterly time steps. Only a single stock is assumed for each species in each analysis, but multiple, spatially separate fisheries are modeled to allow for spatial differences in catchability and selectivity. The analysis allows for error in the effort-fishing mortality relationship, temporal trends in catchability, temporal variation in recruitment, relationships between the environment and recruitment and between the environment and catchability, and differences in selectivity and catchability among fisheries. The model is fit to total catch data and proportional catch-at-length data conditioned on effort. The A-SCALA method is a statistical approach, and therefore recognizes that the data collected from the fishery do not perfectly represent the population. Also, there is uncertainty in our knowledge about the dynamics of the system and uncertainty about how the observed data relate to the real population. The use of likelihood functions allows us to model the uncertainty in the data collected from the population, and the inclusion of estimable process error allows us to model the uncertainties in the dynamics of the system. The statistical approach allows for the calculation of confidence intervals and the testing of hypotheses. We use a Bayesian version of the maximum likelihood framework that includes distributional constraints on temporal variation in recruitment, the effort-fishing mortality relationship, and catchability. Curvature penalties for selectivity parameters and penalties on extreme fishing mortality rates are also included in the objective function. The mode of the joint posterior distribution is used as an estimate of the model parameters. Confidence intervals are calculated using the normal approximation method. It should be noted that the estimation method includes constraints and priors, and therefore the confidence intervals differ from traditionally calculated confidence intervals. Management reference points are calculated, and forward projections are carried out to provide advice for making management decisions for the yellowfin and bigeye populations.
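In the Bayesian maximum-likelihood framework described above, the parameter estimate is the mode of the joint posterior, i.e. the maximizer of a penalized log-likelihood. Written generically (the specific likelihood components, priors, and penalties are those listed in the abstract; the weights λ_j and the grouping below are assumptions for illustration):

```latex
\hat{\theta} \;=\; \arg\max_{\theta}\;
\Bigl[\, \log L(\text{catch and catch-at-length data} \mid \theta)
\;+\; \log p(\theta)
\;-\; \sum_{j} \lambda_j\, \mathcal{P}_j(\theta) \,\Bigr],
```

where p(θ) collects the distributional constraints (recruitment variation, effort-fishing mortality, catchability) and the P_j are the curvature and extreme-fishing-mortality penalties.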
Abstract:
An assessment of the total biomass of shortbelly rockfish (Sebastes jordani) off the central California coast is presented that is based on a spatially extensive but temporally restricted ichthyoplankton survey conducted during the 1991 spawning season. Contemporaneous samples of adults were obtained by trawl sampling in the study region. Daily larval production (7.56 × 10^10 larvae/d) and the larval mortality rate (Z = 0.11/d) during the cruise were estimated from a larval "catch curve," wherein the logarithm of total age-specific larval abundance was regressed against larval age. For this analysis, larval age compositions at each of the 150 sample sites were determined by examination of otolith microstructure from subsampled larvae (n=2203), which were weighted by the polygonal Sette-Ahlstrom area surrounding each station. Female population weight-specific fecundity was estimated through a life table analysis that incorporated sex-specific differences in adult growth rate, female maturity, fecundity, and natural mortality (M). The resulting statistic (102.17 larvae/g) was insensitive to errors in estimating M and to the pattern of recruitment. Together, the two analyses indicated that a total biomass equal to 1366 metric tons (t)/d of age-1+ shortbelly rockfish (sexes combined) was needed to account for the observed level of spawning output during the cruise. Given the long-term seasonal distribution of spawning activity in the study area, as elucidated from a retrospective examination of California Cooperative Oceanic Fisheries Investigation (CalCOFI) ichthyoplankton samples from 1952 to 1984, the "daily" total biomass was expanded to an annual total of 67,392 t. An attempt to account for all sources of error in the derivation of this estimate was made by application of the delta method, which yielded a coefficient of variation of 19%. The relatively high precision of this larval production method, and the rapidity with which an absolute biomass estimate can be obtained, establish that, for some species of rockfish (Sebastes spp.), it is an attractive alternative to traditional age-structured stock assessments.
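The core expansion from larval production to spawning biomass uses only the two statistics quoted above; the step from this figure to the reported 1366 t/d of age-1+ biomass (sexes combined) additionally draws on the life-table sex-ratio and age-structure results.

```latex
\frac{7.56 \times 10^{10}\ \text{larvae/d}}{102.17\ \text{larvae/g}}
\;\approx\; 7.4 \times 10^{8}\ \text{g/d}
\;\approx\; 740\ \text{t/d of spawning female biomass}.
```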
Abstract:
Many data are naturally modeled by an unobserved hierarchical structure. In this paper we propose a flexible nonparametric prior over unknown data hierarchies. The approach uses nested stick-breaking processes to allow for trees of unbounded width and depth, where data can live at any node and are infinitely exchangeable. One can view our model as providing infinite mixtures where the components have a dependency structure corresponding to an evolutionary diffusion down a tree. By using a stick-breaking approach, we can apply Markov chain Monte Carlo methods based on slice sampling to perform Bayesian inference and simulate from the posterior distribution on trees. We apply our method to hierarchical clustering of images and topic modeling of text data.
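A truncated sketch of the nested (tree-structured) stick-breaking construction described above may help fix ideas. The depth and branching truncations, the hyperparameters, and the dictionary representation are illustrative choices, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

alpha, gamma = 1.0, 1.0          # stop / branch concentration parameters (illustrative)
max_depth, max_children = 3, 3   # truncation of the (in principle unbounded) tree

def sample_node_masses(prefix=(), mass=1.0, depth=0):
    """Assign probability mass to tree nodes by nested stick-breaking.

    `mass` is the probability of reaching this node. A Beta(1, alpha) draw
    decides how much of it stays here; Beta(1, gamma) draws split what remains
    among the children (truncated to `max_children` per node and `max_depth` levels).
    """
    nu = rng.beta(1.0, alpha)
    masses = {prefix: mass * nu}        # probability that a datum lives at this node
    remaining = mass * (1.0 - nu)
    if depth == max_depth:
        masses[prefix] += remaining     # absorb the tail at the truncation depth
        return masses
    stick = 1.0
    for i in range(max_children):
        psi = rng.beta(1.0, gamma)
        child_mass = remaining * stick * psi
        masses.update(sample_node_masses(prefix + (i,), child_mass, depth + 1))
        stick *= 1.0 - psi
    return masses

node_masses = sample_node_masses()
print(f"nodes: {len(node_masses)}, total mass (< 1 only because of truncation): "
      f"{sum(node_masses.values()):.4f}")
```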
Abstract:
Structured precision modelling is an important approach to improving the intra-frame correlation modelling of the standard HMM, where Gaussian mixture models with diagonal covariances are used. Previous work has focused on direct structured representation of the precision matrices. In this paper, a new framework is proposed in which the structure of the Cholesky square root of the precision matrix is investigated, referred to as Cholesky Basis Superposition (CBS). Each Cholesky matrix associated with a particular Gaussian distribution is represented as a linear combination of a set of Gaussian-independent basis upper-triangular matrices. Efficient optimization methods are derived for both combination weights and basis matrices. Experiments on a Chinese dictation task showed that the proposed approach can significantly outperform direct structured precision modelling with a similar number of parameters, as well as full covariance modelling.
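A minimal numerical sketch of the basis-superposition idea: each Gaussian's Cholesky-style factor is a weighted sum of shared upper-triangular bases, from which its precision matrix and log-determinant follow. Dimensions, basis size, and all values below are illustrative placeholders, not the paper's trained system.

```python
import numpy as np

rng = np.random.default_rng(2)

dim, n_basis, n_gauss = 4, 6, 3   # feature dimension, basis size, number of Gaussians (illustrative)

# Shared basis of upper-triangular matrices (learned in the paper; random placeholders here).
basis = np.stack([np.triu(rng.standard_normal((dim, dim))) for _ in range(n_basis)])

# Per-Gaussian combination weights (also learned; random placeholders here).
weights = rng.standard_normal((n_gauss, n_basis))

for g in range(n_gauss):
    factor = np.einsum('k,kij->ij', weights[g], basis)   # upper-triangular Cholesky-style factor
    precision = factor.T @ factor                        # structured precision matrix for Gaussian g
    # log|precision| = 2 * sum(log|diag(factor)|), the term needed in the Gaussian log-likelihood
    logdet = 2.0 * np.log(np.abs(np.diag(factor))).sum()
    print(f"Gaussian {g}: log|precision| = {logdet:.3f}")
```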