999 results for natural constraints


Relevance: 30.00%

Abstract:

The paradigm of computational vision hypothesizes that any visual function -- such as recognizing your grandparent -- can be replicated by computational processing of the visual input. What are the computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which we attempt to learn suitable computations from natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and seven peer-reviewed publications; the purpose of the introduction is to present the area of study to a reader who is not familiar with computational vision research. In the introduction, we briefly review the primary challenges of visual processing and recall some current views on visual processing in the early visual systems of animals. Next, we describe the methodology used in our research and discuss the presented results, including some additional remarks, speculations and conclusions that were not featured in the original publications. The publications of this thesis present the following results. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories that luminance and contrast are processed separately in natural systems because they are independent in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence. Further, we provide the first reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent-orientation) processing from nonlinear projection pursuit with simple objective functions related to sparseness and response-energy optimization. Then, we show that attempting to extract independent components of nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable to priming, i.e., the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that for artificial systems, some processing mechanisms could be selected from the environment without optimizing the mechanisms themselves. In summary, this thesis explores learning visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives, and estimation algorithms. The presented work adds to the growing body of evidence that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms, and it also offers some predictions and ideas regarding biological visual processing.
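A minimal sketch of the classic linear version of this statistical approach, assuming scikit-learn (this is illustrative, not the thesis's code): maximizing independence over whitened natural image patches typically yields Gabor-like components resembling simple-cell receptive fields. The sample image, patch size, and component count are arbitrary choices.

```python
# Learn ICA basis functions from natural image patches; the components that
# emerge are typically localized, oriented, Gabor-like filters.
import numpy as np
from sklearn.datasets import load_sample_image
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
img = load_sample_image("china.jpg").mean(axis=2)   # grayscale natural image
patches = extract_patches_2d(img, (12, 12), max_patches=20000, random_state=rng)
X = patches.reshape(len(patches), -1).astype(float)
X -= X.mean(axis=1, keepdims=True)                  # remove per-patch mean (DC)

ica = FastICA(n_components=64, whiten="unit-variance", random_state=rng, max_iter=500)
ica.fit(X)
filters = ica.components_.reshape(64, 12, 12)       # Gabor-like receptive fields
print(filters.shape)
```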

Relevance: 30.00%

Abstract:

Semi-natural grasslands are the most important agricultural areas for biodiversity. The present study investigates the effects of traditional livestock grazing and mowing on plant species richness, with the main emphasis on cattle grazing in mesic semi-natural grasslands. The two reviews provide a thorough assessment of the multifaceted impacts and importance of grazing and mowing management for plant species richness. It is emphasized that livestock grazing and mowing have partially compensated for the human suppression of major natural disturbances and mitigated the negative effects of eutrophication. This hypothesis has important consequences for nature conservation: a large proportion of European species originally adapted to natural disturbances may at present depend on livestock grazing and/or mowing. Furthermore, grazing and mowing are key management methods for mitigating the effects of nutrient enrichment. The species composition and richness in old (continuously grazed), new (grazing restarted 3-8 years ago) and abandoned (over 10 years) pastures differed consistently across a range of spatial scales, and were intermediate in new pastures compared with old and abandoned pastures. In mesic grasslands most plant species were shown to benefit from cattle grazing. Indicator species of biologically valuable grasslands and rare species were more abundant in grazed than in abandoned grasslands. Steep S-SW-facing slopes are the most suitable sites for many grassland plants and should be prioritized in grassland restoration. The proportion of species trait groups benefiting from grazing was higher in mesic semi-natural grasslands than in dry and wet grasslands; consequently, species trait responses to grazing and the effectiveness of the natural factors limiting plant growth may be intimately linked. The high plant species richness of traditionally mowed and grazed areas is explained by numerous factors operating on different spatial scales. Particularly important for maintaining large-scale plant species richness are evolutionary and mitigation factors: grazing and mowing shift conditions towards those that occurred during the evolutionary history of European plant species by modifying key ecological factors (nutrients, pH and light). The results of this dissertation suggest that restoration of semi-natural grasslands by private farmers is potentially a useful method for managing biodiversity in the agricultural landscape. However, the quality of management is often inadequate, particularly because of financial constraints. For restoration to succeed more often, the management regulations in the agri-environment scheme need to be defined more explicitly, and the scheme should be revised to encourage the management of biodiversity.

Relevance: 30.00%

Abstract:

Human activities extract and displace different substances and materials from the earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between the modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies look at the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking life cycle impacts into account as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. Moreover, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case-study symbiosis (Paper I). The environmental performance of the case-study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own. The research methods used were process-based LCA (Papers II and III) and hybrid LCA, which combines process-based and input-output LCA (Paper IV). The results showed that the environmental impacts caused by the extraction and processing of the materials and the energy used by the symbiosis were considerable. If only the direct emissions and resource use of the symbiosis had been considered, less than half of the total environmental impacts of the system would have been taken into account. When the results were compared with the counterfactual reference scenarios, the net environmental impacts of the symbiosis were smaller than those of the reference scenarios. The reduction in environmental impacts was mainly due to changes in the way energy was produced. However, the results are sensitive to the way the reference scenarios are defined. LCA is a useful tool for assessing the overall environmental performance of industrial symbioses. It is recommended that, in addition to the direct effects, the upstream impacts should be taken into account when assessing the environmental performance of industrial symbioses. Industrial symbiosis should be seen as part of the process of improving the environmental performance of a system. In some cases, it may be more efficient, from an environmental point of view, to focus on supply chain management instead.
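For readers unfamiliar with the mechanics of process-based LCA, the following is a minimal sketch of its standard matrix formulation (technology matrix A, intervention matrix B, scaling vector s, inventory g); the two-process system and all numbers are hypothetical, not data from the study.

```python
# Standard matrix algebra of process-based LCA: solve A s = f for the
# process scaling vector, then g = B s gives the life cycle inventory.
import numpy as np

# Technology matrix A: columns are processes, rows are product flows.
# Process 1: pulp production (t); Process 2: electricity generation (MWh).
A = np.array([[1.0,  0.0],    # pulp: produced by process 1
              [-0.8, 1.0]])   # electricity: consumed by pulp, produced by plant

# Intervention matrix B: emissions per unit operation of each process.
B = np.array([[0.2, 0.9]])    # t CO2 per unit of each process (hypothetical)

f = np.array([1.0, 0.0])      # functional unit: 1 t pulp, no net electricity
s = np.linalg.solve(A, f)     # how much each process must run
g = B @ s                     # total inventory, including upstream electricity
print(s, g)                   # s = [1.0, 0.8]; g = 0.92 t CO2
```

The point the abstract makes about upstream impacts is visible here: the direct emission of the pulp process alone (0.2 t) is less than a quarter of the life cycle total (0.92 t).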

Relevance: 30.00%

Abstract:

Two atmospheric inversions (one fine-resolution and one process-discriminating) and a process-based model for land surface exchanges are brought together to analyse the variations of methane emissions from 1990 to 2009. The focus is on the role of natural wetlands and on the years 2000-2006, a period of stable atmospheric concentrations. From 1990 to 2000, the top-down and bottom-up approaches agree on the time-phasing of global total and wetland emission anomalies. The process-discriminating inversion indicates that wetlands dominate the time-variability of methane emissions (90% of the total variability). The contribution of tropical wetlands to the anomalies is found to be large, especially during the post-Pinatubo years (global negative anomalies with minima between -41 and -19 Tg yr⁻¹ in 1992) and during the alternating 1997-1998 El Niño / 1998-1999 La Niña (maximal anomalies in tropical regions between +16 and +22 Tg yr⁻¹ for the inversions, and anomalies due to tropical wetlands between +12 and +17 Tg yr⁻¹ for the process-based model). Between 2000 and 2006, during the stagnation of methane concentrations in the atmosphere, the top-down and bottom-up approaches agree that South America is the main region contributing to anomalies in natural wetland emissions, but they disagree on the sign and magnitude of the flux trend in the Amazon basin. A negative trend (-3.9 ± 1.3 Tg yr⁻¹) is inferred by the process-discriminating inversion, whereas a positive trend (+1.3 ± 0.3 Tg yr⁻¹) is found by the process model. Although process-based models have their own caveats and may not take all processes into account, the positive trend found by the bottom-up approach is considered more likely because it is a robust feature of the process-based model, consistent with analysed precipitation and the satellite-derived extent of inundated areas. In contrast, the surface-data-based inversions lack constraints for South America. This result suggests the need for a re-interpretation of the large increase found in anthropogenic methane inventories after 2000.
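As a small illustration of the bookkeeping behind figures like the anomalies quoted above (in Tg yr⁻¹), the sketch below, using entirely synthetic data, shows the usual convention: annual emission totals expressed as departures from the long-term mean of the study period.

```python
# Annual emission anomalies from a monthly flux series (synthetic numbers):
# anomaly(year) = annual total - mean annual total over 1990-2009.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1990, 2010)
monthly = 45.0 + rng.normal(0, 3, size=(len(years), 12))  # Tg per month, synthetic

annual = monthly.sum(axis=1)             # Tg yr^-1 for each year
anomaly = annual - annual.mean()         # departure from the period mean
for y, a in zip(years, anomaly):
    print(y, round(a, 1))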

Relevance: 30.00%

Abstract:

We address the problem of separating a speech signal into its excitation and vocal-tract filter components, which falls within the framework of blind deconvolution. Typically, the excitation in the case of voiced speech is assumed to be sparse and the vocal-tract filter stable. We develop an alternating ℓp-ℓ2 projections algorithm (ALPA) to perform deconvolution taking these constraints into account. The algorithm is iterative and alternates between two solution spaces. The initialization is based on the standard linear prediction decomposition of a speech signal into an autoregressive filter and a prediction residue. In every iteration, a sparse excitation is estimated by optimizing an ℓp-norm-based cost, and the vocal-tract filter is derived as the solution to a standard least-squares minimization problem. We validate the algorithm on voiced segments of natural speech signals and show applications to epoch estimation. We also present comparisons with state-of-the-art techniques and show that ALPA gives a sparser, impulse-like excitation in which the impulses directly denote the epochs, or instants of significant excitation.
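A simplified sketch of the alternating structure just described, not the authors' ALPA implementation: linear prediction initializes the autoregressive filter, after which a sparsity-promoting update of the excitation alternates with a least-squares update of the filter. Hard thresholding is used below as a simple stand-in for the paper's ℓp-norm-based step, and the test signal is a toy AR(2) resonator driven by an impulse train.

```python
# Alternating sparse-excitation / least-squares-filter deconvolution sketch.
import numpy as np

def lagged_matrix(y, p):
    """Rows: [y[n-1], ..., y[n-p]] for n = p .. len(y)-1."""
    return np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])

def alternating_deconv(y, p=10, k_sparse=8, iters=20):
    Y, t = lagged_matrix(y, p), y[p:]
    a, *_ = np.linalg.lstsq(Y, t, rcond=None)     # linear prediction init
    for _ in range(iters):
        e = t - Y @ a                             # prediction residual
        keep = np.argsort(np.abs(e))[-k_sparse:]
        e_sparse = np.zeros_like(e)
        e_sparse[keep] = e[keep]                  # sparsity-promoting projection
        a, *_ = np.linalg.lstsq(Y, t - e_sparse, rcond=None)  # least-squares filter
    return a, e_sparse                            # AR coefficients, sparse excitation

# Toy voiced-speech-like signal: impulse train through a stable AR(2) resonator.
exc = np.zeros(400); exc[::80] = 1.0
y = np.zeros(400)
for n in range(2, 400):
    y[n] = 1.6 * y[n - 1] - 0.81 * y[n - 2] + exc[n]

a, e = alternating_deconv(y, p=2, k_sparse=5)
print(a)                         # recovered coefficients, approx. [1.6, -0.81]
print(np.nonzero(e)[0] + 2)      # impulse locations in y, approx. the epochs
```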

Relevance: 30.00%

Abstract:

The evolution of flight is the most important feature of birds, and this ability has helped them become one of the most successful groups of vertebrates. However, some species have independently lost the ability to fly. The degeneration of flight ability…

Relevance: 30.00%

Abstract:

The constraint paradigm is a model of computation in which values are deduced whenever possible, under the limitation that deductions be local in a certain sense. One may visualize a constraint 'program' as a network of devices connected by wires. Data values may flow along the wires, and computation is performed by the devices. A device computes using only locally available information (with a few exceptions) and places newly derived values on other, locally attached wires. In this way computed values are propagated. An advantage of the constraint paradigm (not unique to it) is that a single relationship can be used in more than one direction: the connections to a device are not labelled as inputs and outputs; a device will compute with whatever values are available and produce as many new values as it can. General theorem provers are capable of such behavior but tend to suffer from combinatorial explosion; it is not usually useful to derive all the possible consequences of a set of hypotheses. The constraint paradigm places a certain kind of limitation on the deduction process. The limitations imposed by the constraint paradigm are not the only ones possible. It is argued, however, that they are restrictive enough to forestall combinatorial explosion in many interesting computational situations, yet permissive enough to allow useful computations in practical situations. Moreover, the paradigm is intuitive: it is easy to visualize the computational effects of these particular limitations, and the paradigm is a natural way of expressing programs for certain applications, in particular the relationships arising in computer-aided design. A number of implementations of constraint-based programming languages are presented. A progression of ever more powerful languages is described, complete implementations are presented, and design difficulties and alternatives are discussed. The goal approached, though not quite reached, is a complete programming system that will implicitly support the constraint paradigm to the same extent that LISP, say, supports automatic storage management.
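A minimal sketch of the devices-and-wires picture described above, in Python for illustration (it is not code from the dissertation): a device computes with whatever values arrive on its wires and propagates newly derived values, so the same constraint runs in any direction.

```python
# Local propagation in a tiny constraint network.
class Wire:
    def __init__(self):
        self.value, self.devices = None, []
    def set(self, v):
        if self.value is None:
            self.value = v
            for d in self.devices:     # notify locally attached devices
                d.propagate()
        elif self.value != v:
            raise ValueError("contradiction")

class Adder:
    """Constraint a + b = c, usable in any direction."""
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
        for w in (a, b, c):
            w.devices.append(self)
        self.propagate()
    def propagate(self):
        a, b, c = self.a.value, self.b.value, self.c.value
        if a is not None and b is not None: self.c.set(a + b)
        elif a is not None and c is not None: self.b.set(c - a)
        elif b is not None and c is not None: self.a.set(c - b)

# The connections are not inputs or outputs: given c and b, the Adder deduces a.
x, y, z = Wire(), Wire(), Wire()
Adder(x, y, z)
z.set(10); y.set(4)
print(x.value)   # -> 6
```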

Relevance: 30.00%

Abstract:

In decision-making problems where we need to choose a particular decision or alternative from a set of possible choices, we often have preferences that determine whether we prefer one decision over another. When these preferences give us a complete ordering of the decisions, it is easy to choose the best decision, or one of the best. However, it often occurs that the preference relation is only a partial order, and there is no single best decision. In this thesis we look at what happens when we have such a partial order over a set of decisions, in particular when we have multiple orderings on a set of decisions, and we present a framework for qualitative decision making. We examine the different natural notions of optimal decision that arise in this framework, which give us different optimality classes, and we study the relationships between these classes. We then look in particular at a qualitative preference relation called Sorted-Pareto dominance, an extension of Pareto dominance, and we give a semantics for this relation as one that is compatible with any order-preserving mapping of an ordinal preference scale to a numerical one. We apply Sorted-Pareto dominance in a soft constraints setting, solving problems in which the soft constraints associate qualitative preferences with the decisions in a decision problem. We also examine the Sorted-Pareto dominance relation in the context of our qualitative decision-making framework, identifying the relevant optimality classes for the Sorted-Pareto case; these comprise decisions that are necessarily optimal, and decisions that are optimal for some choice of mapping from the ordinal scale to a quantitative one. Finally, we provide an empirical analysis of Sorted-Pareto constraint problems and examine the optimality classes that result.
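A minimal sketch of the dominance check itself, assuming costs on an ordinal scale where smaller is better; the three decisions and the 0-3 scale below are hypothetical. Sorting each decision's vector of constraint degrees and comparing componentwise is what makes the relation compatible with every order-preserving mapping of the scale.

```python
# Sorted-Pareto dominance for cost vectors over an ordinal scale.
def sorted_pareto_dominates(a, b):
    """True if decision a weakly dominates b: after sorting both cost
    vectors in increasing order, a is componentwise <= b."""
    return all(x <= y for x, y in zip(sorted(a), sorted(b)))

# Three decisions, each assigned a degree by three soft constraints (0-3).
decisions = {"d1": [0, 2, 1], "d2": [2, 1, 1], "d3": [0, 3, 1]}
for p in decisions:
    for q in decisions:
        if p != q and sorted_pareto_dominates(decisions[p], decisions[q]):
            print(p, "dominates", q)   # d1 dominates d2 and d3
```

Note that plain Pareto dominance would not relate d1 = [0, 2, 1] and d2 = [2, 1, 1] at all; sorting first makes them comparable, which is what makes the relation an extension.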

Relevance: 30.00%

Abstract:

Current global inventories of ammonia emissions identify the ocean as the largest natural source. This source depends on seawater pH, temperature, and the concentration of total seawater ammonia (NHx(sw)), which reflects a balance between remineralization of organic matter, uptake by plankton, and nitrification. Here we compare [NHx(sw)] from two global ocean biogeochemical models (BEC and COBALT) against extensive ocean observations. Simulated [NHx(sw)] are generally biased high. Improved simulation can be achieved in COBALT by increasing the plankton affinity for NHx within observed ranges. The resulting global ocean emission is 2.5 Tg N a⁻¹, much lower than current literature values (7-23 Tg N a⁻¹), including the widely used Global Emissions InitiAtive (GEIA) inventory (8 Tg N a⁻¹). Such a weak ocean source implies that continental sources contribute more than half of atmospheric NHx over most of the ocean in the Northern Hemisphere. Ammonia emitted from oceanic sources is insufficient to neutralize sulfate aerosol acidity, consistent with observations. There is evidence over the Equatorial Pacific for a missing source of atmospheric ammonia that could be due to photolysis of marine organic nitrogen at the ocean surface or in the atmosphere. Accommodating this possible missing source yields a global ocean emission of ammonia in the range 2-5 Tg N a⁻¹, comparable in magnitude to other natural sources from open fires and soils.
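To illustrate why the emission depends on seawater pH, the sketch below computes the un-ionized NH3 fraction of total NHx, which is the part available for sea-air exchange. The dissociation constant used (pKa ≈ 9.25 at 25 °C, a freshwater round number) is an assumption for illustration, not a value from the paper.

```python
# Fraction of total NHx present as volatile NH3 (Henderson-Hasselbalch).
def nh3_fraction(pH, pKa=9.25):   # pKa assumed, ~25 C freshwater value
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

for pH in (7.8, 8.1, 8.4):        # plausible open-ocean surface values
    print(pH, round(nh3_fraction(pH), 3))   # only a few percent is NH3
```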

Relevance: 30.00%

Abstract:

We present griz_P1 light curves of 146 spectroscopically confirmed Type Ia supernovae (SNe Ia; 0.03 < z < 0.65) discovered during the first 1.5 yr of the Pan-STARRS1 Medium Deep Survey. The Pan-STARRS1 natural photometric system is determined by a combination of on-site measurements of the instrument response function and observations of spectrophotometric standard stars. We find that the systematic uncertainties in the photometric system are currently 1.2%, without accounting for the uncertainty in the Hubble Space Telescope Calspec definition of the AB system. A Hubble diagram is constructed with a subset of 113 of the 146 SNe Ia that pass our light-curve quality cuts. The cosmological fit to 310 SNe Ia (113 PS1 SNe Ia + 222 light curves from 197 low-z SNe Ia), using only supernovae (SNe) and assuming a constant dark energy equation of state and flatness, yields w = -1.120 +0.360/-0.206 (stat) +0.269/-0.291 (sys). When combined with BAO + CMB (Planck) + H0, the analysis yields Ω_M = 0.280 +0.013/-0.012 and w = -1.166 +0.072/-0.069, including all identified systematics. The value of w is inconsistent with the cosmological constant value of -1 at the 2.3σ level. The tension endures after removing either the baryon acoustic oscillation (BAO) or the H0 constraint, though it is strongest when the H0 constraint is included. If we include WMAP9 cosmic microwave background (CMB) constraints instead of those from Planck, we find w = -1.124 +0.083/-0.065, which diminishes the discord to <2σ. We cannot conclude whether the tension with flat ΛCDM is a feature of dark energy, new physics, or a combination of chance and systematic errors. The full Pan-STARRS1 SN sample, with roughly three times as many SNe, should provide more conclusive results.
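For context, the quantity being fit in such a Hubble diagram is the distance modulus μ(z) under a flat cosmology with constant dark energy equation of state w. The sketch below uses standard textbook formulas; the parameter values are illustrative defaults, not the paper's best fit.

```python
# Distance modulus mu(z) = 5 log10(d_L / 10 pc) for flat wCDM.
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light, km/s

def distance_modulus(z, H0=70.0, Om=0.28, w=-1.0):
    # E(z) for flat wCDM: matter + dark energy with constant w
    E = lambda zp: np.sqrt(Om * (1 + zp) ** 3 + (1 - Om) * (1 + zp) ** (3 * (1 + w)))
    dc, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)    # comoving distance / (c/H0)
    d_l = (1 + z) * (C_KM_S / H0) * dc              # luminosity distance, Mpc
    return 5.0 * np.log10(d_l) + 25.0               # +25 converts Mpc to 10 pc

for z in (0.03, 0.3, 0.65):                         # the survey's redshift range
    print(z, round(distance_modulus(z), 2))
```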

Relevance: 30.00%

Abstract:

In this paper we study the optimal natural gas commitment for a known demand scenario. This involves finding the best location of GSUs to supply all demands and the optimal allocation from sources to gas loads through an appropriate transportation mode, so as to minimize total system costs. Our emphasis is on the formulation and use of a suitable optimization model reflecting real-world operations and the constraints of natural gas systems. The mathematical model is based on a Lagrangean heuristic using Lagrangean relaxation, an efficient approach to solving the problem. Computational results are presented for the Iberian and American natural gas systems, geographically organized into 65 and 88 load nodes, respectively. The location model results, supported by the computational application GasView, show the optimal location and allocation solution and the total system costs, and suggest a suitable gas transportation mode, presented in both numerical and graphical form.
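A schematic illustration of the kind of Lagrangean heuristic described, not the paper's model or GasView: for a facility-location-style supply problem, the "each load is served" constraints are relaxed with multipliers that are updated by subgradient steps, and a greedy repair produces a feasible solution at every iteration. All costs and sizes below are random, illustrative data.

```python
# Lagrangean relaxation sketch for a location/allocation supply problem.
import numpy as np

rng = np.random.default_rng(0)
n_loads, n_sites = 8, 4
c = rng.uniform(1, 10, size=(n_loads, n_sites))   # supply cost, site -> load
f = rng.uniform(5, 15, size=n_sites)              # fixed cost of opening a site

lam = np.zeros(n_loads)                           # multipliers for sum_j x_ij = 1
best_ub = np.inf
for it in range(200):
    red = c - lam[:, None]                        # reduced assignment costs
    benefit = f + np.minimum(red, 0).sum(axis=0)  # net cost of opening each site
    open_sites = benefit < 0                      # relaxed problem splits by site
    lb = benefit[open_sites].sum() + lam.sum()    # Lagrangean lower bound
    # Feasibility repair: serve every load from the cheapest open site.
    if not open_sites.any():
        open_sites[np.argmin(benefit)] = True
    assign = np.argmin(np.where(open_sites[None, :], c, np.inf), axis=1)
    ub = f[open_sites].sum() + c[np.arange(n_loads), assign].sum()
    best_ub = min(best_ub, ub)
    # Subgradient step on the relaxed serve-every-load constraints.
    x_served = (red < 0) & open_sites[None, :]
    g = 1.0 - x_served.sum(axis=1)
    lam = lam + (1.0 / (it + 1)) * g

print("lower bound:", round(lb, 2))
print("best feasible cost:", round(best_ub, 2))
```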

Relevance: 30.00%

Abstract:

This paper develops a model of the regulator-regulated firm relationship in a regional natural gas commodity market that can be linked to a competitive market by a pipeline. We characterize normative policies under which the regulator, in addition to setting the capacity of the pipeline, regulates the price of gas under asymmetric information about the firm's technology, and may (or may not) operate (two-way) transfers between consumers and the firm. We then focus on capacity and investigate how its level responds when the regulator takes account of the firm's incentive compatibility constraints. The analysis yields some insights into the role that transport capacity investments may play as an instrument for improving the efficiency of geographically isolated markets.