823 results for Graph-based approach


Relevance:

90.00%

Publisher:

Abstract:

In the North Pacific Ocean, an ecosystem-based fishery management approach has been adopted. A significant objective of this approach is to reduce interactions between fishery-related activities and protected species. We review management measures developed by the North Pacific Fishery Management Council and the National Marine Fisheries Service to reduce effects of the groundfish fisheries off Alaska on marine mammals and seabirds, while continuing to provide economic opportunities for fishery participants. Direct measures have been taken to mitigate known fishery impacts, and precautionary measures have been taken for species with potential (but no documented) interactions with the groundfish fisheries. Area closures limit disturbance to marine mammals at rookeries and haulouts, protect sensitive benthic habitat, and reduce potential competition for prey resources. Temporal and spatial dispersion of catches reduces the localized impact of fishery removals. Seabird avoidance measures have been implemented through collaboration with fishery participants and have been highly successful in reducing seabird bycatch. Finally, a comprehensive observer monitoring program provides data on the location and extent of bycatch of marine mammals and seabirds. These measures provide managers with the flexibility to adapt to changes in the status of protected species and evolving conditions in the fisheries. This review should be useful to fishery managers as an example of an ecosystem-based approach to protected species management that is adaptive and accounts for multiple objectives.

Relevance:

90.00%

Publisher:

Abstract:

In the face of dramatic declines in groundfish populations and a lack of sufficient stock assessment information, a need has arisen for new methods of assessing groundfish populations. We describe the integration of seafloor transect data gathered by a manned submersible with high-resolution sonar imagery to produce a habitat-based stock assessment system for groundfish. The data sets used in this study were collected from Heceta Bank, Oregon, and were derived from 42 submersible dives (1988–90) and a multibeam sonar survey (1998). The submersible habitat survey investigated seafloor topography and groundfish abundance along 30-minute transects over six predetermined stations and found a statistical relationship between habitat variability and groundfish distribution and abundance. These transects were analyzed in a geographic information system (GIS) by using dynamic segmentation to display changes in habitat along the transects. We used the submersible data to extrapolate fish abundance within uniform habitat patches over broad areas of the bank by means of a habitat classification based on the sonar imagery. After a navigation correction was applied to the submersible-based habitat segments, a good correlation between those segments and the major backscatter and topographic boundaries on the imagery was apparent. The extent of uniform habitats was extrapolated in the vicinity of the dive stations, and a preliminary stock assessment of several species of demersal fish was calculated. Such a habitat-based approach will allow researchers to characterize marine communities over large areas of the seafloor.
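
The extrapolation step reduces to simple arithmetic: mean fish density per habitat class observed along the transects, scaled by the total area of each habitat class in the sonar classification. A minimal sketch with hypothetical function names and illustrative values, not the Heceta Bank data:

```python
# Hypothetical sketch of the habitat-based extrapolation step:
# transect observations give a mean fish density per habitat class,
# and sonar-classified patch areas scale those densities bank-wide.
from collections import defaultdict

def estimate_stock(transect_obs, patch_areas):
    """transect_obs: list of (habitat_class, fish_count, area_surveyed_m2).
    patch_areas: dict habitat_class -> total sonar-classified area (m2).
    Returns abundance estimates per habitat class and the bank-wide total."""
    counts = defaultdict(float)
    areas = defaultdict(float)
    for habitat, n_fish, area in transect_obs:
        counts[habitat] += n_fish
        areas[habitat] += area
    estimates = {}
    for habitat, total_area in patch_areas.items():
        if areas[habitat] > 0:
            density = counts[habitat] / areas[habitat]  # fish per m2
            estimates[habitat] = density * total_area
    return estimates, sum(estimates.values())

# Illustrative values only -- not data from the Heceta Bank surveys.
obs = [("rock ridge", 120, 3000.0), ("boulder", 45, 2500.0), ("mud", 8, 4000.0)]
areas = {"rock ridge": 2.1e6, "boulder": 1.4e6, "mud": 6.8e6}
print(estimate_stock(obs, areas))
```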

Relevance:

90.00%

Publisher:

Abstract:

We present a new haplotype-based approach for inferring local genetic ancestry of individuals in an admixed population. Most existing approaches for local ancestry estimation ignore the latent genetic relatedness between ancestral populations and treat them as independent. In this article, we exploit such information by building an inheritance model that describes both the ancestral populations and the admixed population jointly in a unified framework. Based on an assumption that the common hypothetical founder haplotypes give rise to both the ancestral and the admixed population haplotypes, we employ an infinite hidden Markov model to characterize each ancestral population and further extend it to generate the admixed population. Through an effective utilization of the population structural information under a principled nonparametric Bayesian framework, the resulting model is significantly less sensitive to the choice and the amount of training data for ancestral populations than state-of-the-art algorithms. We also improve the robustness under deviation from common modeling assumptions by incorporating population-specific scale parameters that allow variable recombination rates in different populations. Our method is applicable to an admixed population from an arbitrary number of ancestral populations and also performs competitively in terms of spurious ancestry proportions under a general multiway admixture assumption. We validate the proposed method by simulation under various admixing scenarios and present empirical analysis results from a worldwide-distributed dataset from the Human Genome Diversity Project.
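
The paper's model is a nonparametric Bayesian infinite HMM built over shared founder haplotypes; as a much-simplified sketch of the underlying decoding idea, a finite HMM whose hidden states are ancestral populations and whose emissions are alleles drawn with population-specific frequencies can be decoded with Viterbi to give a local ancestry track. All frequencies and parameters below are illustrative assumptions:

```python
# Much-simplified illustration of HMM-based local ancestry decoding.
# Hidden states = ancestral populations; emissions = observed alleles with
# population-specific frequencies; Viterbi gives the most likely ancestry
# track along the chromosome. (The paper's model is an *infinite* HMM with
# shared founder haplotypes -- this sketch only shows the decoding idea.)
import numpy as np

def viterbi_ancestry(alleles, allele_freq, switch_prob=0.01):
    """alleles: 0/1 array over SNPs; allele_freq[k, j] = frequency of
    allele 1 in population k at SNP j. Returns a population index per SNP."""
    K, L = allele_freq.shape
    log_emit = np.where(alleles == 1,
                        np.log(allele_freq),
                        np.log(1.0 - allele_freq))        # shape (K, L)
    log_trans = np.full((K, K), np.log(switch_prob / (K - 1)))
    np.fill_diagonal(log_trans, np.log(1.0 - switch_prob))
    score = log_emit[:, 0] - np.log(K)                    # uniform prior
    back = np.zeros((K, L), dtype=int)
    for j in range(1, L):
        cand = score[:, None] + log_trans                 # (from, to)
        back[:, j] = np.argmax(cand, axis=0)
        score = cand[back[:, j], np.arange(K)] + log_emit[:, j]
    path = np.empty(L, dtype=int)
    path[-1] = int(np.argmax(score))
    for j in range(L - 1, 0, -1):
        path[j - 1] = back[path[j], j]
    return path

# Hypothetical two-population example over 8 SNPs.
freq = np.array([[0.9, 0.8, 0.9, 0.2, 0.1, 0.2, 0.1, 0.2],
                 [0.1, 0.2, 0.1, 0.8, 0.9, 0.8, 0.9, 0.8]])
print(viterbi_ancestry(np.array([1, 1, 1, 1, 1, 0, 0, 0]), freq))
```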

Relevance:

90.00%

Publisher:

Abstract:

In this article we call for a new approach to patient safety improvement, one based on the emerging field of evidence-based healthcare risk management (EBHRM). We explore EBHRM in the broader context of the evidence-based healthcare movement, assess the benefits and challenges that might arise in adopting an evidence-based approach, and make recommendations for meeting those challenges and realizing the benefits of a more scientific approach.

Relevance:

90.00%

Publisher:

Abstract:

Space heating accounts for a large portion of the world's carbon dioxide emissions. Ground Source Heat Pumps (GSHPs) are a technology which can reduce carbon emissions from heating and cooling. GSHP system performance is, however, highly sensitive to deviation of the actual annual energy extraction/rejection rates from/to the ground from their design values. In order to prevent failure and/or performance deterioration of GSHP systems, it is possible to incorporate a safety factor in the design of the GSHP by over-sizing the ground heat exchanger (GHE). A methodology to evaluate the financial risk involved in over-sizing the GHE is proposed in this paper. A probability-based approach is used to evaluate the economic feasibility of a hypothetical full-size GSHP system as compared to four alternative Heating Ventilation and Air Conditioning (HVAC) system configurations. The model of the GSHP system is developed in the TRNSYS energy simulation platform and calibrated with data from an actual hybrid GSHP system installed in the Department of Earth Science, University of Oxford, UK. Results of the analysis show that potential savings from a full-size GSHP system largely depend on projected HVAC system efficiencies and gas and electricity prices. Results of the risk analysis also suggest that a full-size GSHP with auxiliary back up is potentially the most economical system configuration. © 2012 Elsevier Ltd.
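
The finding that savings hinge on projected efficiencies and energy prices is naturally expressed as a Monte Carlo comparison. A minimal sketch with placeholder prices, efficiencies, and capital costs, not the paper's calibrated TRNSYS model:

```python
# Hedged Monte Carlo sketch of a probability-based feasibility comparison:
# sample uncertain prices and efficiencies, then report the fraction of
# draws in which the full-size GSHP's annualized cost beats a gas boiler.
# All numbers are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
heat_demand_kwh = 50_000                            # assumed annual load

elec_price = rng.normal(0.15, 0.03, n).clip(0.05)   # GBP/kWh (assumed)
gas_price = rng.normal(0.05, 0.01, n).clip(0.01)
gshp_cop = rng.normal(4.0, 0.5, n).clip(2.0)        # seasonal COP (assumed)
boiler_eff = rng.normal(0.90, 0.05, n).clip(0.6)

gshp_capex, boiler_capex, years = 60_000, 10_000, 20  # simple annualization
gshp_cost = gshp_capex / years + heat_demand_kwh / gshp_cop * elec_price
gas_cost = boiler_capex / years + heat_demand_kwh / boiler_eff * gas_price

print(f"P(GSHP cheaper than gas boiler) = {(gshp_cost < gas_cost).mean():.2f}")
print(f"mean annual saving = {np.mean(gas_cost - gshp_cost):.0f} GBP")
```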

Relevance:

90.00%

Publisher:

Abstract:

Statistical approaches for building non-rigid deformable models, such as the Active Appearance Model (AAM), have enjoyed great popularity in recent years, but typically require tedious manual annotation of training images. In this paper, a learning based approach for the automatic annotation of visually deformable objects from a single annotated frontal image is presented and demonstrated on the example of automatically annotating face images that can be used for building AAMs for fitting and tracking. This approach employs the idea of initially learning the correspondences between landmarks in a frontal image and a set of training images with a face in arbitrary poses. Using this learner, virtual images of unseen faces at any arbitrary pose for which the learner was trained can be reconstructed by predicting the new landmark locations and warping the texture from the frontal image. View-based AAMs are then built from the virtual images and used for automatically annotating unseen images, including images of different facial expressions, at any random pose within the maximum range spanned by the virtually reconstructed images. The approach is experimentally validated by automatically annotating face images from three different databases. © 2009 IEEE.
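
A rough sketch of the virtual-view synthesis step: once the learner has predicted where the frontal landmarks move under a new pose, the frontal texture is warped onto the predicted landmark configuration. The sketch below uses a piecewise-affine warp from scikit-image and synthetic landmark coordinates; the landmark predictor itself is abstracted away:

```python
# Hedged sketch of virtual-view synthesis: predicted landmark positions for
# a new pose drive a piecewise-affine warp of the frontal texture. All
# landmark coordinates here are synthetic placeholders.
import numpy as np
from skimage.transform import PiecewiseAffineTransform, warp

def synthesize_view(frontal_image, frontal_landmarks, predicted_landmarks):
    """Warp the frontal texture so frontal_landmarks move to
    predicted_landmarks (both (N, 2) arrays of (x, y) points)."""
    tform = PiecewiseAffineTransform()
    # warp() expects the inverse map: output (posed) coords -> input (frontal).
    tform.estimate(predicted_landmarks, frontal_landmarks)
    return warp(frontal_image, tform, output_shape=frontal_image.shape)

# Toy example: a synthetic "face" with 4 landmarks nudged to mimic a yaw turn.
img = np.zeros((100, 100))
img[30:70, 30:70] = 1.0
frontal = np.array([[30, 30], [70, 30], [30, 70], [70, 70]], dtype=float)
posed = np.array([[35, 30], [65, 28], [35, 70], [65, 72]], dtype=float)
virtual = synthesize_view(img, frontal, posed)
```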

Relevance:

90.00%

Publisher:

Abstract:

Infrastructure project sustainability assessment typically entails the use of specialised assessment tools to measure and rate project performance against a set of criteria. This paper looks beyond the prevailing approaches to sustainability assessments and explores sustainability principles in terms of project risks and opportunities. Taking a risk management approach to applying sustainability concepts to projects has the potential to reconceptualise decision structures for sustainability from bespoke assessments to a standard part of the project decision-making process. By integrating issues of sustainability into project risk management for project planning, design and construction, sustainability is considered within a more traditional business and engineering language. Currently, there is no widely practised approach for objectively considering the environmental and social context of projects alongside the more traditional project risk assessments of time, cost and quality. A risk-based approach would not solve all the issues associated with existing sustainability assessments, but it would place sustainability concerns alongside other key risks and opportunities, integrating sustainability with other project decisions.

Relevance:

90.00%

Publisher:

Abstract:

A simple, sensitive fluorescent method for detecting cyanide has been developed based on the inner filter effect (IFE) of silver nanoparticles (Ag NPs). With a high extinction coefficient and tunable plasmon absorption feature, Ag NPs are expected to be a powerful absorber to tune the emission of the fluorophore in IFE-based fluorescent assays. In the present work, we developed a turn-on fluorescent assay for cyanide based on the strong absorption of Ag NPs at both the excitation and emission wavelengths of an isolated fluorescence indicator. In the presence of cyanide, the absorber Ag NPs dissolve gradually, which then leads to recovery of the IFE-decreased emission of the fluorophore. The concentration of Ag NPs in the detection system was found to greatly affect the fluorescence response toward cyanide. Under the optimum conditions, the present IFE-based approach can detect cyanide ranging from 5.0 × 10⁻⁷ to 6.0 × 10⁻⁴ M with a detection limit of 2.5 × 10⁻⁷ M, which is much lower than the corresponding absorbance-based approach and compares favorably with other reported fluorescent methods.
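
As a hedged illustration of how a detection limit such as 2.5 × 10⁻⁷ M is conventionally derived from a linear calibration (three standard deviations of the blank divided by the slope); the calibration values below are synthetic placeholders, not the paper's measurements:

```python
# Hedged illustration of a turn-on calibration fit and a 3*sigma/slope
# detection limit. The numbers are synthetic placeholders.
import numpy as np

conc = np.array([5e-7, 1e-6, 5e-6, 1e-5, 5e-5, 1e-4, 6e-4])        # mol/L
signal = np.array([12.0, 18.5, 55.0, 98.0, 430.0, 820.0, 4700.0])  # a.u.

slope, intercept = np.polyfit(conc, signal, 1)  # linear calibration
sigma_blank = 2.0                               # std. dev. of blank (assumed)
lod = 3 * sigma_blank / slope
print(f"slope = {slope:.3e} a.u. per M, LOD ~ {lod:.1e} M")
```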

Relevance:

90.00%

Publisher:

Abstract:

This thesis introduces elements of a theory of design activity and a computational framework for developing design systems. The theory stresses the opportunistic nature of designing and the complementary roles of focus and distraction, the interdependence of evaluation and generation, the multiplicity of ways of seeing over the history of a design session versus the exclusivity of a given way of seeing over an arbitrarily short period, and the incommensurability of criteria used to evaluate a design. The thesis argues for a principle-based rather than rule-based approach to designing documents. The Discursive Generator is presented as a computational framework for implementing specific design systems, and a simple system for arranging blocks according to a set of formal principles is developed by way of illustration. Both shape grammars and constraint-based systems are used to contrast current trends in design automation with the discursive approach advocated in the thesis. The Discursive Generator is shown to have some important properties lacking in other types of systems, such as dynamism, robustness and the ability to deal with partial designs. When studied in terms of a search metaphor, the Discursive Generator is shown to exhibit behavior which is radically different from some traditional search techniques, and to avoid some of the well-known difficulties associated with them.

Relevance:

90.00%

Publisher:

Abstract:

We present an image-based approach to infer 3D structure parameters using a probabilistic "shape+structure" model. The 3D shape of a class of objects may be represented by sets of contours from silhouette views simultaneously observed from multiple calibrated cameras. Bayesian reconstructions of new shapes can then be estimated using a prior density constructed with a mixture model and probabilistic principal components analysis. We augment the shape model to incorporate structural features of interest; novel examples with missing structure parameters may then be reconstructed to obtain estimates of these parameters. Model matching and parameter inference are done entirely in the image domain and require no explicit 3D construction. Our shape model enables accurate estimation of structure despite segmentation errors or missing views in the input silhouettes, and works even with only a single input view. Using a dataset of thousands of pedestrian images generated from a synthetic model, we can perform accurate inference of the 3D locations of 19 joints on the body based on observed silhouette contours from real images.
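
A minimal sketch of the inference idea: treat the concatenated [shape features; structure parameters] vector as jointly Gaussian (a deliberate simplification of the paper's mixture-of-PPCA prior) and estimate the missing structure parameters by Gaussian conditioning on the observed shape. All data below are synthetic:

```python
# Hedged sketch of "shape + structure" inference: a single joint Gaussian
# stands in for the paper's mixture/PPCA prior, and missing structure
# parameters are recovered as the conditional mean given the shape.
import numpy as np

rng = np.random.default_rng(1)
n, d_shape, d_struct = 500, 20, 3
# Synthetic training set: structure parameters linearly drive the contours.
struct = rng.normal(size=(n, d_struct))
A = rng.normal(size=(d_struct, d_shape))
shape = struct @ A + 0.1 * rng.normal(size=(n, d_shape))
X = np.hstack([shape, struct])

mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)
Sss = S[:d_shape, :d_shape]          # shape/shape covariance block
Sts = S[d_shape:, :d_shape]          # structure/shape covariance block

def infer_structure(observed_shape):
    """Conditional mean E[structure | shape] under the joint Gaussian."""
    return mu[d_shape:] + Sts @ np.linalg.solve(Sss, observed_shape - mu[:d_shape])

test_struct = rng.normal(size=d_struct)
print(infer_structure(test_struct @ A))  # should be close to test_struct
```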

Relevance:

90.00%

Publisher:

Abstract:

Recognizing standard computational structures (cliches) in a program can help an experienced programmer understand the program. We develop a graph parsing approach to automating program recognition in which programs and cliches are represented in an attributed graph grammar formalism and recognition is achieved by graph parsing. In studying this approach, we evaluate our representation's ability to suppress many common forms of variation which hinder recognition. We investigate the expressiveness of our graph grammar formalism for capturing programming cliches. We empirically and analytically study the computational cost of our recognition approach with respect to two medium-sized, real-world simulator programs.
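
The paper's recognizer is a graph parser over an attributed graph grammar; a far simpler stand-in that still conveys the flavor is attributed subgraph matching of a cliche against a program's flow graph, sketched here with networkx:

```python
# Hedged illustration with a simpler mechanism than the paper's graph
# parsing: a cliche is recognized as an attributed subgraph match in a
# toy program's dataflow graph.
import networkx as nx
from networkx.algorithms import isomorphism

# Dataflow graph of a toy program; node attribute 'op' labels operations.
program = nx.DiGraph()
program.add_edges_from([("a", "b"), ("b", "c"), ("c", "b"), ("x", "c")])
nx.set_node_attributes(program, {"a": "init", "b": "test",
                                 "c": "accumulate", "x": "input"}, "op")

# The "running total" cliche: init -> test <-> accumulate.
cliche = nx.DiGraph()
cliche.add_edges_from([("i", "t"), ("t", "acc"), ("acc", "t")])
nx.set_node_attributes(cliche, {"i": "init", "t": "test",
                                "acc": "accumulate"}, "op")

matcher = isomorphism.DiGraphMatcher(
    program, cliche,
    node_match=isomorphism.categorical_node_match("op", None))
for mapping in matcher.subgraph_isomorphisms_iter():
    print("cliche instance:", mapping)   # program node -> cliche role
```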

Relevance:

90.00%

Publisher:

Abstract:

X. Wang, J. Yang, X. Teng, W. Xia, and R. Jensen. Feature Selection based on Rough Sets and Particle Swarm Optimization. Pattern Recognition Letters, vol. 28, no. 4, pp. 459-471, 2007.
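
The cited paper scores candidate feature subsets with a rough-set dependency measure inside a particle swarm search. A hedged sketch of the binary-PSO loop, with cross-validated classifier accuracy standing in for the rough-set fitness (a plain substitution):

```python
# Hedged sketch of binary PSO for feature selection; classifier accuracy
# replaces the cited paper's rough-set dependency measure as the fitness.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
X, y = load_wine(return_X_y=True)
n_particles, n_feat, iters = 12, X.shape[1], 20

def fitness(bits):
    mask = bits.astype(bool)
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=3).mean()

pos = (rng.random((n_particles, n_feat)) < 0.5).astype(float)  # bit vectors
vel = rng.normal(0, 1, (n_particles, n_feat))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[np.argmax(pbest_fit)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_feat))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (rng.random((n_particles, n_feat)) < 1 / (1 + np.exp(-vel))).astype(float)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmax(pbest_fit)].copy()

print("selected features:", np.flatnonzero(gbest), "cv acc:", pbest_fit.max())
```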

Relevance:

90.00%

Publisher:

Abstract:

A method for reconstruction of 3D polygonal models from multiple views is presented. The method uses sampling techniques to construct a texture-mapped semi-regular polygonal mesh of the object in question. Given a set of views and segmentation of the object in each view, constructive solid geometry is used to build a visual hull from silhouette prisms. The resulting polygonal mesh is simplified and subdivided to produce a semi-regular mesh. Regions of model fit inaccuracy are found by projecting the reference images onto the mesh from different views. The resulting error images for each view are used to compute a probability density function, and several points are sampled from it. Along the epipolar lines corresponding to these sampled points, photometric consistency is evaluated. The mesh surface is then pulled towards the regions of higher photometric consistency using free-form deformations. This sampling-based approach produces a photometrically consistent solution in much less time than possible with previous multi-view algorithms given arbitrary camera placement.
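
A minimal sketch of the sampling step: the per-view error image is normalized into a probability density, and pixel locations are drawn from it, concentrating the photometric-consistency tests where the current mesh fits worst. All inputs below are synthetic placeholders:

```python
# Hedged sketch: turn an error image (reference photo vs. current mesh
# reprojection) into a probability density and sample pixels from it to
# decide where to test photometric consistency along epipolar lines.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random((120, 160))
glitch = (rng.random((120, 160)) > 0.95) * 0.3   # localized model-fit error
reprojection = np.clip(reference + glitch, 0, 1)

error = (reference - reprojection) ** 2          # per-pixel squared error
pdf = error.ravel() / error.sum()                # normalize to a density
idx = rng.choice(error.size, size=50, p=pdf)     # 50 error-prone pixels
rows, cols = np.unravel_index(idx, error.shape)
# (rows, cols) would seed the epipolar photometric-consistency search.
print(list(zip(rows[:5], cols[:5])))
```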

Relevance:

90.00%

Publisher:

Abstract:

(This Technical Report revises TR-BUCS-2003-011) The Transmission Control Protocol (TCP) has been the protocol of choice for many Internet applications requiring reliable connections. The design of TCP has been challenged by the extension of connections over wireless links. In this paper, we investigate a Bayesian approach to infer at the source host the reason of a packet loss, whether congestion or wireless transmission error. Our approach is "mostly" end-to-end since it requires only one long-term average quantity (namely, long-term average packet loss probability over the wireless segment) that may be best obtained with help from the network (e.g. wireless access agent). Specifically, we use Maximum Likelihood Ratio tests to evaluate TCP as a classifier of the type of packet loss. We study the effectiveness of short-term classification of packet errors (congestion vs. wireless), given stationary prior error probabilities and distributions of packet delays conditioned on the type of packet loss (measured over a larger time scale). Using our Bayesian-based approach and extensive simulations, we demonstrate that congestion-induced losses and losses due to wireless transmission errors produce sufficiently different statistics upon which an efficient online error classifier can be built. We introduce a simple queueing model to underline the conditional delay distributions arising from different kinds of packet losses over a heterogeneous wired/wireless path. We show how Hidden Markov Models (HMMs) can be used by a TCP connection to infer efficiently conditional delay distributions. We demonstrate how estimation accuracy is influenced by different proportions of congestion versus wireless losses and penalties on incorrect classification.
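
A minimal sketch of the classification rule being evaluated: with delay densities conditioned on each loss cause and a prior for wireless loss, a likelihood ratio test labels each loss. The Gamma delay models and every parameter below are assumptions, not the paper's measured distributions:

```python
# Hedged sketch of a likelihood-ratio packet-loss classifier: compare the
# posterior-weighted likelihoods of the observed delay under each loss
# cause. Gamma models and all parameters are illustrative assumptions.
from scipy.stats import gamma

congestion_delay = gamma(a=9.0, scale=12.0)  # delays under queue buildup (ms)
wireless_delay = gamma(a=3.0, scale=10.0)    # delays around wireless errors
p_wireless = 0.2                             # long-term average, network-assisted

def classify_loss(observed_delay_ms):
    """Return 'wireless' if the posterior odds favor a transmission error."""
    lr = (wireless_delay.pdf(observed_delay_ms) /
          congestion_delay.pdf(observed_delay_ms))
    prior_odds = p_wireless / (1.0 - p_wireless)
    return "wireless" if lr * prior_odds > 1.0 else "congestion"

for d in (25.0, 60.0, 140.0):
    print(f"delay {d:5.1f} ms -> {classify_loss(d)}")
```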

Relevance:

90.00%

Publisher:

Abstract:

There has been considerable work done in the study of Web reference streams: sequences of requests for Web objects. In particular, many studies have looked at the locality properties of such streams, because of the impact of locality on the design and performance of caching and prefetching systems. However, a general framework for understanding why reference streams exhibit given locality properties has not yet emerged. In this work we take a first step in this direction, based on viewing the Web as a set of reference streams that are transformed by Web components (clients, servers, and intermediaries). We propose a graph-based framework for describing this collection of streams and components. We identify three basic stream transformations that occur at nodes of the graph: aggregation, disaggregation and filtering, and we show how these transformations can be used to abstract the effects of different Web components on their associated reference streams. This view allows a structured approach to the analysis of why reference streams show given properties at different points in the Web. Applying this approach to the study of locality requires good metrics for locality. These metrics must meet three criteria: 1) they must accurately capture temporal locality; 2) they must be independent of trace artifacts such as trace length; and 3) they must not involve manual procedures or model-based assumptions. We describe two metrics meeting these criteria that each capture a different kind of temporal locality in reference streams. The popularity component of temporal locality is captured by entropy, while the correlation component is captured by interreference coefficient of variation. We argue that these metrics are more natural and more useful than previously proposed metrics for temporal locality. We use this framework to analyze a diverse set of Web reference traces. We find that this framework can shed light on how and why locality properties vary across different locations in the Web topology. For example, we find that filtering and aggregation have opposing effects on the popularity component of the temporal locality, which helps to explain why multilevel caching can be effective in the Web. Furthermore, we find that all transformations tend to diminish the correlation component of temporal locality, which has implications for the utility of different cache replacement policies at different points in the Web.
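
Both metrics are easy to state concretely: entropy of the empirical popularity distribution for the popularity component, and the coefficient of variation of interreference gaps for the correlation component. A minimal sketch over a toy reference stream (the paper's exact per-object normalization may differ):

```python
# Hedged sketch of the two temporal-locality metrics described above:
# entropy of reference popularity, and coefficient of variation (CV) of
# interreference distances. The trace is a synthetic placeholder.
import numpy as np
from collections import Counter, defaultdict

trace = ["a", "b", "a", "a", "c", "b", "a", "d", "a", "b"]  # reference stream

# Popularity component: entropy of object reference frequencies.
counts = np.array(list(Counter(trace).values()), dtype=float)
p = counts / counts.sum()
entropy = -(p * np.log2(p)).sum()

# Correlation component: CV of gaps between successive references
# to the same object.
last_seen, gaps = {}, defaultdict(list)
for t, obj in enumerate(trace):
    if obj in last_seen:
        gaps[obj].append(t - last_seen[obj])
    last_seen[obj] = t
all_gaps = np.array([g for gs in gaps.values() for g in gs], dtype=float)
cv = all_gaps.std() / all_gaps.mean()

print(f"entropy = {entropy:.2f} bits, interreference CV = {cv:.2f}")
```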