968 results for Bi-level approaches


Relevance:

30.00%

Publisher:

Abstract:

This Factor Markets Working Paper describes and highlights the key issues of farm capital structures, the dynamics of investments and accumulation of farm capital, and the financial leverage and borrowing rates on farms in selected European countries. Data collected from the Farm Accountancy Data Network (FADN) suggest that the European farming sector uses quite different farm business strategies, capabilities to generate capital revenues, and segmented agricultural loan market regimes. Such diverse business strategies have substantial, and perhaps more substantial than expected, implications for the financial leverage and performance of farms. Different countries adopt different approaches to evaluating agricultural assets, or the agricultural asset markets simply differ substantially depending on the country in question. This has implications for most of the financial indicators. In those countries that have seen rapidly increasing asset prices at the margin, which were revised accordingly in the accounting systems for the whole stock of assets, firm values increased significantly, even though the firms had been disinvesting. If there is an asset price bubble and it bursts, there may be serious knock-on effects for some countries.

Relevance:

30.00%

Publisher:

Abstract:

What is ‘the’ EU internal market, as economists see it? The present BEER paper attempts to survey and help readers understand various ‘economic’ approaches to the internal market idea. The paper starts with a conceptual discussion of what ‘the’ internal market is (in an economic perspective). Six different economic meanings of the internal market are presented, with the sixth one being the economic benchmark in an ideal setting. Subsequently, the question is asked what the internal market (i.e. its proper functioning) is good for. Put differently, the internal market in the EU Treaty is a means, but a means to what? Beyond the typical economic growth objectives of the Rome Treaty (still valid today, with some qualifications), other Treaty objectives have emerged. Economists typically think in means-end relationships, and the instrumental role of the internal market for Treaty objectives is far from clear. The ‘new’ Commission internal market strategy of 2007 proposes a more goal-oriented internal market policy. Such a vision is more selective in picking intermediate objectives to which ‘the’ internal market should be instrumental, but it risks ignoring the major deficits in today’s internal market: services and labour! The means-end relationships become even more problematic once one begins to scrutinise all the socio-economic objectives of the current (Amsterdam/Nice) Treaty or still other intermediate objectives. The internal market (explicitly including the relevant common regulation) then becomes a ‘jack of all trades’ for the environment, a high level of social protection, innovation or ‘Social Europe’. These means-end relationships are often ill-specified.
The final section considers the future of the internal market, by distinguishing three strategies: incremental strategies (including the new internal market strategy of November 2007); the internal market as the core of the Economic Union serving the ‘proper functioning of the monetary union’; and deepening and widening of the internal market as justified by the functional subsidiarity test. Even though the latter two would seem to be preferable from an economic point of view, they currently lack political legitimacy and are therefore unlikely to be pursued in the near future.

Relevance:

30.00%

Publisher:

Abstract:

Sister chromatid cohesion, mediated by the cohesin complex, is essential for faithful mitosis. Nevertheless, evidence suggests that the surveillance mechanism that governs mitotic fidelity, the spindle assembly checkpoint (SAC), is not robust enough to halt cell division when cohesion loss occurs prematurely. The mechanism behind this poor response is not properly understood. Using developing Drosophila brains, we show that full sister chromatid separation elicits a weak checkpoint response resulting in abnormal mitotic exit after a short delay. Quantitative live-cell imaging approaches combined with mathematical modeling indicate that weak SAC activation upon cohesion loss is caused by weak signal generation. This is further attenuated by several feedback loops in the mitotic signaling network. We propose that multiple feedback loops involving cyclin-dependent kinase 1 (Cdk1) gradually impair error-correction efficiency and accelerate mitotic exit upon premature loss of cohesion. Our findings explain how cohesion defects may escape SAC surveillance.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

The C2 domain is one of the most frequent and widely distributed calcium-binding motifs. Its structure comprises an eight-stranded beta-sandwich with two structural types, as if the result of a circular permutation. Combining sequence, structural and modelling information, we have explored, at different levels of granularity, the functional characteristics of several families of C2 domains. At the coarsest level, the similarity correlates with key structural determinants of the C2 domain fold and, at the finest level, with the domain architecture of the proteins containing them, highlighting the functional diversity between the various subfamilies. The functional diversity appears as different conserved surface patches throughout this common fold. In some cases, these patches are related to substrate-binding sites, whereas in others they correspond to interfaces of presumably permanent interaction with other domains within the same polypeptide chain. For those related to substrate-binding sites, the predictions overlap with biochemical data in addition to providing some novel observations. For those acting as protein-protein interfaces, our modelling analysis suggests that slight variations between families are a result of not only complementary adaptations in the interfaces involved but also different domain architecture. In the light of the sequence and structural genomic projects, the work presented here shows that modelling approaches along with careful sub-typing of protein families will be a powerful combination for a broader coverage in proteomics. © 2003 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Upstream AUGs (uAUGs) and upstream open reading frames (uORFs) are common features of mRNAs that encode regulatory proteins and have been shown to profoundly influence translation of the main ORF. In this study, we employed a series of artificial 5'-untranslated regions (5'-UTRs) containing one or more uAUGs/uORFs to systematically assess translation initiation at the main AUG by leaky scanning and reinitiation mechanisms. Constructs containing either one or two uAUGs in varying contexts but without an in-frame stop codon upstream of the main AUG were used to analyse the leaky scanning mechanism. This analysis largely confirmed the ranking of different AUG contextual sequences that was determined previously by Kozak. In addition, this ranking was the same for both the first and second uAUGs, although the magnitude of initiation efficiency differed. Moreover, approximately 10% of ribosomes exhibited leaky scanning at uAUGs in the most favourable context and initiated at a downstream AUG. A second group of constructs containing different numbers of uORFs, each with optimal uAUGs, were used to measure the capacity for reinitiation. We found significant levels of initiation at the main ORF even in constructs containing four uORFs, with nearly 10% of ribosomes capable of reinitiating five times. This study shows that for mRNAs containing multiple uORFs/uAUGs, ribosome reinitiation and leaky scanning are efficient mechanisms for initiation at their main AUGs.

Relevance:

30.00%

Publisher:

Abstract:

Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtain a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
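To make the front-tracking idea concrete, the sketch below evolves a level set function over a small raster grid with a first-order upwind (Godunov) scheme, in the spirit of the map-algebra implementation described above. The grid, the speed field, and the function names are illustrative assumptions, not the MapScript API.

```python
# A minimal level set sketch: solve phi_t + F|grad phi| = 0 on a grid.
# Cells with phi < 0 lie inside the moving front.

def evolve_front(phi, speed, dt, steps):
    """Advance phi with a first-order upwind (Godunov) scheme, spacing = 1."""
    rows, cols = len(phi), len(phi[0])
    for _ in range(steps):
        new = [row[:] for row in phi]
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                dmx = phi[i][j] - phi[i - 1][j]   # backward difference in x
                dpx = phi[i + 1][j] - phi[i][j]   # forward difference in x
                dmy = phi[i][j] - phi[i][j - 1]
                dpy = phi[i][j + 1] - phi[i][j]
                f = speed[i][j]
                if f > 0:  # outward-moving front: upwind combination
                    grad = (max(dmx, 0.0) ** 2 + min(dpx, 0.0) ** 2
                            + max(dmy, 0.0) ** 2 + min(dpy, 0.0) ** 2) ** 0.5
                else:
                    grad = (min(dmx, 0.0) ** 2 + max(dpx, 0.0) ** 2
                            + min(dmy, 0.0) ** 2 + max(dpy, 0.0) ** 2) ** 0.5
                new[i][j] = phi[i][j] - dt * f * grad
        phi = new
    return phi

# toy example: a circular front expanding from the grid centre at unit speed
n = 21
phi0 = [[((i - 10) ** 2 + (j - 10) ** 2) ** 0.5 - 3.0 for j in range(n)]
        for i in range(n)]
speed = [[1.0] * n for _ in range(n)]
phi1 = evolve_front(phi0, speed, dt=0.5, steps=8)

inside0 = sum(v < 0 for row in phi0 for v in row)
inside1 = sum(v < 0 for row in phi1 for v in row)  # region should have grown
```

In a GIS setting, `phi` and `speed` would be raster layers and the update would be expressed as map algebra operators rather than explicit loops.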

Relevance:

30.00%

Publisher:

Abstract:

In this paper we summarise key elements of retail change in Britain over a twenty-year period. The time period is that covered by a funded study into long-term change in grocery shopping habits in Portsmouth, England. The major empirical findings—to which we briefly allude—are reported elsewhere: the present task is to assess the wider context underlying that change. For example, it has frequently been stated that retailing in the UK is not as competitive as in other leading economies. As a result, the issue of consumer choice has become increasingly important politically. Concerns over concentration in the industry, new format development and market definition have been expressed by local planners, competition regulators and consumer groups. Macro-level changes over time have also created market inequality in consumer opportunities at a local level—hence our decision to attempt a local-level study. Situational factors affecting consumer experiences over time at the local level involve the changing store choice sets available to particular consumers. Using actual consumer experiences thus becomes a yardstick for assessing the practical effectiveness of policy making. The paper demonstrates that choice at the local level is driven by store use and that different levels of provision reflect real choice at the local level. Macro-level policy and ‘one size fits all’ approaches to regulation, it is argued, do not reflect the changing reality of grocery shopping. Accordingly, arguments for a more local and regional approach to regulation are made.

Relevance:

30.00%

Publisher:

Abstract:

Satellite-borne scatterometers are used to measure backscattered microwave radiation from the ocean surface. This data may be used to infer surface wind vectors where no direct measurements exist. Inherent in this data are outliers owing to aberrations on the water surface and measurement errors within the equipment. We present two techniques for identifying outliers using neural networks; the outliers may then be removed to improve models derived from the data. Firstly, the generative topographic mapping (GTM) is used to create a probability density model; data with low probability under the model may be classed as outliers. In the second part of the paper, a sensor model with input-dependent noise is used and outliers are identified based on their probability under this model. GTM was successfully modified to incorporate prior knowledge of the shape of the observation manifold; however, GTM could not learn the double-skinned nature of the observation manifold. To learn this double-skinned manifold necessitated the use of a sensor model which imposes strong constraints on the mapping. The results using GTM with a fixed noise level suggested the noise level may vary as a function of wind speed. This was confirmed by experiments using a sensor model with input-dependent noise, where the variation in noise is most sensitive to the wind speed input. Both models successfully identified gross outliers with the largest differences between models occurring at low wind speeds. © 2003 Elsevier Science Ltd. All rights reserved.
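The first technique rests on a simple idea: fit a density model to the data and flag points whose probability under the model falls below a threshold. The sketch below illustrates that idea with a fixed-bandwidth Gaussian kernel density estimate standing in for the GTM, purely for brevity; the bandwidth, threshold, and toy data are assumptions, not values from the paper.

```python
# Density-based outlier flagging: low probability under the model => outlier.
import math

def kde_density(x, data, bandwidth=0.5):
    """Mean of isotropic 2-D Gaussian kernels centred on the training data."""
    norm = 1.0 / (2 * math.pi * bandwidth ** 2)
    total = 0.0
    for (dx, dy) in data:
        sq = (x[0] - dx) ** 2 + (x[1] - dy) ** 2
        total += norm * math.exp(-sq / (2 * bandwidth ** 2))
    return total / len(data)

def flag_outliers(data, threshold):
    """Return the points whose estimated density falls below the threshold."""
    dens = [kde_density(p, data) for p in data]
    return [p for p, d in zip(data, dens) if d < threshold]

# toy data: a tight cluster plus one gross outlier
cluster = [(0.1 * i, 0.05 * i) for i in range(20)]
data = cluster + [(10.0, -10.0)]
outliers = flag_outliers(data, threshold=0.1)
```

A GTM would replace the kernel sum with a mixture of Gaussians constrained to a low-dimensional manifold, which is what allows prior knowledge of the manifold's shape to be built in.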

Relevance:

30.00%

Publisher:

Abstract:

We address the question of how to obtain effective fusion of identification information such that it is robust to the quality of this information. As well as technical issues, data fusion is encumbered with a collection of (potentially confusing) practical considerations. These considerations are described during the early chapters, in which a framework for data fusion is developed. Following this process of diversification it becomes clear that the original question is not well posed and requires more precise specification. We use the framework to focus on some of the technical issues relevant to the question being addressed. We show that fusion of hard decisions through use of an adaptive version of the maximum a posteriori decision rule yields acceptable performance. Better performance is possible using probability-level fusion as long as the probabilities are accurate. Of particular interest is the prevalence of overconfidence and the effect it has on fused performance. The production of accurate probabilities from poor-quality data forms the latter part of the thesis. Two approaches are taken. Firstly, the probabilities may be moderated at source (either analytically or numerically). Secondly, the probabilities may be transformed at the fusion centre. In each case an improvement in fused performance is demonstrated. We therefore conclude that in order to obtain robust fusion, care should be taken to model the probabilities accurately, either at the source or centrally.
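The contrast between the two fusion levels, and the role of moderation, can be sketched in a few lines. Below, hard decisions are fused by majority vote and probabilities by a normalised product; the tempering exponent used to moderate each source is an illustrative stand-in for the analytic or numerical moderation developed in the thesis, and the numbers are invented.

```python
# Decision-level vs probability-level fusion, with simple moderation.
from collections import Counter

def fuse_decisions(decisions):
    """Hard-decision fusion: majority vote over per-source labels."""
    return Counter(decisions).most_common(1)[0][0]

def moderate(probs, alpha=0.5):
    """Temper a distribution towards uniform (alpha < 1 softens it)."""
    powered = [p ** alpha for p in probs]
    total = sum(powered)
    return [p / total for p in powered]

def fuse_probs(sources):
    """Probability-level fusion: normalised product of distributions."""
    fused = [1.0] * len(sources[0])
    for dist in sources:
        fused = [f * p for f, p in zip(fused, dist)]
    total = sum(fused)
    return [f / total for f in fused]

# three sources over two classes; the third source is confidently wrong
sources = [[0.6, 0.4], [0.7, 0.3], [0.01, 0.99]]
decisions = [dist.index(max(dist)) for dist in sources]

vote = fuse_decisions(decisions)                       # majority picks class 0
naive = fuse_probs(sources)                            # overconfident source dominates
tempered = fuse_probs([moderate(d) for d in sources])  # moderation reins it in
```

The overconfident third source drags the naive product fusion towards class 1; tempering each source first reduces its influence, which is the effect moderation is designed to achieve.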

Relevance:

30.00%

Publisher:

Abstract:

Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on the relative accuracy of said approaches. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions and, secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regards to the first objective, the thesis provides a discussion on the issues specific to macro-level productivity measurement and on the strengths and weaknesses of the three main types of approaches available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which provided some interesting findings. Probably the most important findings were that deterministic approaches are quite accurate even when the data is moderately noisy, that no approach was accurate when noise was more extensive, that functional form misspecification has a severe negative effect on the accuracy of the parametric approaches and, finally, that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact result in different productivity change estimates, at least for some of the countries assessed.
To assist researchers in selecting between conflicting estimates, a new, three step selection framework is proposed, based on findings of simulation analyses and established diagnostics/indicators. An application of this framework is also provided, based on the EU KLEMS dataset.
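The index-number approach mentioned above (Growth Accounting) reduces to a simple calculation: TFP growth is output growth minus cost-share-weighted input growth. The sketch below uses the Törnqvist form, which averages each input's share across the two periods; the figures are invented for illustration.

```python
# Tornqvist growth accounting: TFP growth = dln(Y) - sum_i s_i * dln(X_i),
# with each share s_i averaged over the two periods.
import math

def tfp_growth(y0, y1, inputs0, inputs1, shares0, shares1):
    """Return log TFP growth between period 0 and period 1."""
    dln_y = math.log(y1 / y0)
    dln_x = sum(0.5 * (s0 + s1) * math.log(x1 / x0)
                for x0, x1, s0, s1 in zip(inputs0, inputs1, shares0, shares1))
    return dln_y - dln_x

# output grows 4%; capital and labour (shares 0.3 and 0.7) grow 2% and 1%
g = tfp_growth(100.0, 104.0,
               [50.0, 200.0], [51.0, 202.0],
               [0.3, 0.7], [0.3, 0.7])
# residual growth not explained by inputs is attributed to TFP (about 2.6%)
```

The deterministic character of this approach, noted in the simulation findings above, is visible here: any noise in the measured output or inputs passes straight through into the TFP residual.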

Relevance:

30.00%

Publisher:

Abstract:

Lack of discrimination power and poor weight dispersion remain major issues in Data Envelopment Analysis (DEA). Since the initial multiple criteria DEA (MCDEA) model developed in the late 1990s, only goal programming approaches, namely GPDEA-CCR and GPDEA-BCC, have been introduced for solving the said problems in a multi-objective framework. We found the GPDEA models to be invalid and demonstrate that our proposed bi-objective multiple criteria DEA (BiO-MCDEA) outperforms the GPDEA models in the aspects of discrimination power and weight dispersion, as well as requiring less computational code. An application of energy dependency among 25 European Union member countries is further used to describe the efficacy of our approach. © 2013 Elsevier B.V. All rights reserved.
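For context, the weight-dispersion problem originates in the classical CCR multiplier model (standard textbook form, shown below; this is not the paper's BiO-MCDEA formulation). Each evaluated unit \(o\) chooses its own output weights \(u_r\) and input weights \(v_i\) freely, so it can load all weight on a few favourable factors, which is what drives many units to an efficiency score of one and motivates the multi-criteria extensions.

```latex
\[
\begin{aligned}
\max_{u,v}\quad & \sum_{r=1}^{s} u_r\, y_{ro} \\
\text{s.t.}\quad & \sum_{i=1}^{m} v_i\, x_{io} = 1, \\
& \sum_{r=1}^{s} u_r\, y_{rj} - \sum_{i=1}^{m} v_i\, x_{ij} \le 0,
  \qquad j = 1,\dots,n, \\
& u_r \ge 0,\quad v_i \ge 0 .
\end{aligned}
\]
```

Here \(x_{ij}\) and \(y_{rj}\) are the inputs and outputs of unit \(j\), and the model is solved once per unit \(o\).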

Relevance:

30.00%

Publisher:

Abstract:

To solve multi-objective problems, multiple reward signals are often scalarized into a single value and further processed using established single-objective problem solving techniques. While the field of multi-objective optimization has made many advances in applying scalarization techniques to obtain good solution trade-offs, the utility of applying these techniques in the multi-objective multi-agent learning domain has not yet been thoroughly investigated. Agents learn the value of their decisions by linearly scalarizing their reward signals at the local level, and acceptable system-wide behaviour results. However, the non-linear relationship between the weighting parameters of the scalarization function and the learned policy makes the discovery of system-wide trade-offs time consuming. Our first contribution is a thorough analysis of well-known scalarization schemes within the multi-objective multi-agent reinforcement learning setup. The analysed approaches intelligently explore the weight-space in order to find a wider range of system trade-offs. In our second contribution, we propose a novel adaptive weight algorithm which interacts with the underlying local multi-objective solvers and allows for a better coverage of the Pareto front. Our third contribution is the experimental validation of our approach by learning bi-objective policies in self-organising smart camera networks. We note that our algorithm (i) explores the objective space faster on many problem instances, (ii) obtains solutions that exhibit a larger hypervolume, and (iii) achieves a greater spread in the objective space.
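The limitation of linear scalarization that motivates the adaptive schemes above can be shown in a few lines: sweeping the weight of a weighted sum and recording which candidate solution each weight selects. The candidate points are invented; note that the point lying in the concave region of the front is never recovered by any weight, which is one reason naive weight sweeps give poor coverage of the Pareto front.

```python
# Linear scalarization of a bi-objective (maximise both) problem:
# sweep the weight w and record which candidate each weight selects.

def scalarize(point, w):
    """Weighted-sum scalarization of a two-objective point."""
    return w * point[0] + (1 - w) * point[1]

# the middle point is Pareto-optimal but sits in a concave (non-convex)
# region of the front, so no weighted sum ever prefers it
candidates = [(0.0, 1.0), (0.45, 0.45), (1.0, 0.0)]

selected = set()
for k in range(101):                  # sweep w = 0.00, 0.01, ..., 1.00
    w = k / 100
    best = max(candidates, key=lambda p: scalarize(p, w))
    selected.add(best)
```

Only the two extreme points are ever chosen; recovering the concave portion of the front requires either non-linear scalarization (e.g. Chebyshev) or an adaptive weight scheme of the kind the paper proposes.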

Relevance:

30.00%

Publisher:

Abstract:

Excepting the Peripheral and Central Nervous Systems, the Immune System is the most complex of somatic systems in higher animals. This complexity manifests itself at many levels, from the molecular to that of the whole organism. Much insight into this confounding complexity can be gained through computational simulation. Such simulations range in application from epitope prediction through to the modelling of vaccination strategies. In this review, we selectively evaluate various key applications relevant to computational vaccinology; these include techniques that operate at different scales, from the molecular, through the organism, and even to the population level.