254 results for Multiplier
Abstract:
In an attempt to understand why the Greek economy is collapsing, this Commentary points out two key aspects that are often overlooked – the country’s large multiplier and its poor export performance. When combined with the need for a large fiscal adjustment, these factors help explain how fiscal consolidation in Greece has been associated with such a large drop in GDP.
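To make the arithmetic behind this argument concrete, the back-of-the-envelope relation below uses purely hypothetical numbers (they are not taken from the Commentary): with a fiscal multiplier k, a consolidation of size ΔG lowers output by roughly k·ΔG, a shortfall that a weak export performance cannot offset.

```latex
% Hypothetical illustration only; k and \Delta G are not the Commentary's figures.
\Delta Y \;\approx\; -\,k\,\Delta G,
\qquad \text{e.g. } k = 1.5,\ \Delta G = 10\%\ \text{of GDP}
\;\Rightarrow\; \Delta Y \approx -15\%\ \text{of GDP}.
```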
Abstract:
In this paper, a fuzzy Markov random field (FMRF) model is used to segment land objects into tree, grass, building, and road regions by fusing remotely sensed LIDAR data with co-registered color bands, i.e. a scanned aerial color (RGB) photo and a near-infrared (NIR) photo. An FMRF model is defined as a Markov random field (MRF) model in a fuzzy domain. Three optimization algorithms for the FMRF model, i.e. Lagrange multiplier (LM), iterated conditional modes (ICM), and simulated annealing (SA), are compared with respect to computational cost and segmentation accuracy. The results show that the FMRF model-based ICM algorithm best balances computational cost and segmentation accuracy in land-cover segmentation from LIDAR data and co-registered bands.
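As a rough illustration of the ICM option compared above, the sketch below runs iterated conditional modes on a fused feature stack with Gaussian class likelihoods and a simple Potts smoothness prior; the feature layout, class statistics and the beta weight are placeholder assumptions, not the paper's actual FMRF parameterization.

```python
import numpy as np

def icm_segment(features, means, covs, beta=1.5, n_iter=10):
    """Iterated conditional modes (ICM) for MRF land-cover segmentation.

    features : (H, W, D) stack of fused LIDAR height + colour/NIR bands
    means, covs : per-class Gaussian statistics for K classes
    beta : Potts smoothness weight (illustrative value, not from the paper)
    """
    H, W, _ = features.shape
    K = len(means)

    # Data term: per-pixel negative Gaussian log-likelihood for each class.
    unary = np.empty((H, W, K))
    for k in range(K):
        diff = features - means[k]
        inv = np.linalg.inv(covs[k])
        maha = np.einsum('hwd,de,hwe->hw', diff, inv, diff)
        unary[..., k] = 0.5 * (maha + np.log(np.linalg.det(covs[k])))

    labels = unary.argmin(axis=-1)  # non-contextual initial labelling

    def neighbour_agreement(lbl, k):
        # Number of 4-connected neighbours currently assigned class k.
        same = (lbl == k).astype(float)
        agree = np.zeros_like(same)
        agree[1:, :] += same[:-1, :]
        agree[:-1, :] += same[1:, :]
        agree[:, 1:] += same[:, :-1]
        agree[:, :-1] += same[:, 1:]
        return agree

    for _ in range(n_iter):
        energy = unary.copy()
        for k in range(K):
            # Potts prior: lower energy where neighbours agree with class k.
            energy[..., k] -= beta * neighbour_agreement(labels, k)
        labels = energy.argmin(axis=-1)

    return labels
```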
Abstract:
Airborne LIght Detection And Ranging (LIDAR) provides accurate height information for objects on the earth, which has made LIDAR increasingly popular in terrain and land surveying. In particular, LIDAR data offer vital features for land-cover classification, an important task in many application domains. In this paper, an unsupervised approach based on an improved fuzzy Markov random field (FMRF) model is developed, by which the LIDAR data, its co-registered images acquired by optical sensors, i.e. an aerial color image and a near-infrared image, and other derived features are fused effectively to improve the ability of the LIDAR system to perform accurate land-cover classification. In the proposed FMRF model-based approach, spatial contextual information is exploited by modeling the image as a Markov random field (MRF), while fuzzy logic is introduced to reduce the errors caused by hard classification. Moreover, a Lagrange multiplier (LM) algorithm is employed to calculate a maximum a posteriori (MAP) estimate for the classification. The experimental results show that fusing the height data with the optical images is particularly well suited to land-cover classification. The proposed approach works well for classification from airborne LIDAR data fused with its co-registered optical images, improving the average accuracy to 88.9%.
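One way to see where a Lagrange multiplier enters a fuzzy formulation is the sum-to-one constraint on class memberships: minimizing a weighted objective subject to that constraint yields a closed-form membership update. The sketch below is that generic update (in the style of fuzzy clustering), not the paper's exact MAP estimator; the fuzzifier m and the distance inputs are illustrative assumptions.

```python
import numpy as np

def fuzzy_memberships(distances, m=2.0):
    """Closed-form fuzzy membership update obtained by minimising
    sum_k u_k**m * d_k  subject to  sum_k u_k = 1  (Lagrange multiplier).

    distances : (N, K) squared distances of N pixels to K class prototypes
    m : fuzzifier > 1 controlling how soft the assignment is
    Returns (N, K) memberships that sum to one over the classes.
    """
    d = np.maximum(distances, 1e-12)            # avoid division by zero
    power = 1.0 / (m - 1.0)
    ratio = d[:, :, None] / d[:, None, :]       # d_k / d_j for all class pairs
    u = 1.0 / np.sum(ratio ** power, axis=2)    # standard fuzzy-clustering update
    return u

# Usage sketch: soft land-cover memberships from per-class distances,
# hardened only at the very end to reduce hard-classification errors.
if __name__ == "__main__":
    dists = np.random.rand(5, 4)                # 5 pixels, 4 classes (toy data)
    u = fuzzy_memberships(dists)
    print(u.sum(axis=1))                        # each row sums to 1
```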
Abstract:
Military doctrine is one of the conceptual components of war. Its raison d’être is that of a force multiplier: it enables a smaller force to take on and defeat a larger force in battle. This article’s departure point is the aphorism of Sir Julian Corbett, who described doctrine as ‘the soul of warfare’. The second dimension to creating a force multiplier effect is forging doctrine with an appropriate command philosophy. The challenge for commanders is how, in unique circumstances, to formulate, disseminate and apply an appropriate doctrine and combine it with a relevant command philosophy. This can only be achieved by policy-makers and senior commanders successfully answering the Clausewitzian question: what kind of conflict are they involved in? Once an answer has been provided, a synthesis of these two factors can be developed and applied. Doctrine has implications for all three levels of war. Tactically, doctrine does two things: first, it helps to create a tempo of operations; second, it develops a transitory quality that will produce operational effect and ultimately facilitate the pursuit of strategic objectives. At the tactical level its function is to provide both training and instruction; at the operational level, instruction and understanding are the critical functions; at the strategic level, it provides understanding and direction. Using John Gooch’s six components of doctrine, it will be argued that there is a lacuna in the theory of doctrine, as these components can manifest themselves in very different ways at the three levels of war. They can in turn affect the transitory quality of tactical operations. Doctrine is pivotal to success in war. Without doctrine and the appropriate command philosophy, military operations cannot be successfully concluded against an active and determined foe.
Abstract:
Both the (5,3) counter and (2,2,3) counter multiplication techniques are investigated for their operating speed and the viability of the architectures when implemented in a fast bipolar ECL technology. Implementations of the counters in series-gated ECL and in threshold logic are contrasted for speed, noise immunity and complexity, and are critically compared with the fastest practical design of a full adder. A novel circuit technique is presented that overcomes the need for high fan-in input weights in threshold circuits through the use of negatively weighted inputs. The authors conclude that a (2,2,3)-counter-based array multiplier implemented in series-gated ECL should enable a significant increase in speed over conventional full-adder-based array multipliers.
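For readers less used to (p,q) counter notation, the sketch below models the two counters behaviourally in Python, reading (2,2,3) as two weight-2 inputs plus two weight-1 inputs producing a 3-bit result; this is a functional illustration only, not a series-gated ECL or threshold-logic circuit description.

```python
from itertools import product

def counter_5_3(bits):
    """(5,3) counter: five equally weighted input bits -> 3-bit binary count."""
    assert len(bits) == 5
    total = sum(bits)                                   # 0..5, fits in three bits
    return ((total >> 2) & 1, (total >> 1) & 1, total & 1)

def counter_2_2_3(weight2_bits, weight1_bits):
    """(2,2,3) counter: two weight-2 bits plus two weight-1 bits -> 3-bit sum."""
    assert len(weight2_bits) == 2 and len(weight1_bits) == 2
    total = 2 * sum(weight2_bits) + sum(weight1_bits)   # 0..6, fits in three bits
    return ((total >> 2) & 1, (total >> 1) & 1, total & 1)

# Exhaustive check that the (5,3) counter reproduces the population count.
for b in product((0, 1), repeat=5):
    msb, mid, lsb = counter_5_3(b)
    assert 4 * msb + 2 * mid + lsb == sum(b)
```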
Abstract:
The authors compare various array multiplier architectures based on (p,q) counter circuits. The tradeoff in multiplier design is always between adding complexity and increasing speed. It is shown that by using a (2,2,3) counter cell it is possible to gain a significant increase in speed over a conventional full-adder, carry-save array based approach. The increase in complexity should be easily accommodated using modern emitter-coupled-logic processes.
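The conventional baseline referred to above, a full-adder carry-save reduction, can likewise be sketched behaviourally; the snippet reduces a set of partial-product rows with 3:2 carry-save adders and counts the reduction stages before the final carry-propagate add (the 8x8 operands are arbitrary toy values).

```python
def csa_3_2(a, b, c):
    """3:2 carry-save adder over machine integers: a + b + c == s + carry."""
    s = a ^ b ^ c
    carry = ((a & b) | (a & c) | (b & c)) << 1
    return s, carry

def carry_save_reduce(rows):
    """Reduce partial-product rows to two operands using full adders (3:2 counters)."""
    stages = 0
    while len(rows) > 2:
        nxt = []
        for i in range(0, len(rows) - 2, 3):
            s, c = csa_3_2(rows[i], rows[i + 1], rows[i + 2])
            nxt += [s, c]
        nxt += rows[len(rows) - len(rows) % 3:]   # rows left over in this stage
        rows = nxt
        stages += 1
    return rows, stages

# Toy usage: the 8 partial products of a hypothetical 8x8 multiply.
rows = [(0xB7 * ((0x5A >> i) & 1)) << i for i in range(8)]
final, stages = carry_save_reduce(rows)
assert sum(final) == 0xB7 * 0x5A                  # sum is preserved at every stage
print(f"{stages} carry-save stages before the final carry-propagate add")
```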
Abstract:
One reason for the recent asset price bubbles in many developed countries could be regulatory capital arbitrage. Regulatory and legal changes can help traditional banks to move their assets off their balance sheets into the lightly regulated shadows and thus enable regulatory arbitrage through the securitized sector. This paper adopts a global vector autoregression (GVAR) methodology to assess the effects of regulatory capital arbitrage on equity prices, house prices and economic activity across 11 OECD countries/regions. A counterfactual experiment disentangles the effects of regulatory arbitrage following a change in the net capital rule for investment banks in April 2004 and the adoption of the Basel II Accord in June 2004. The results provide evidence for the existence of an international finance multiplier, with about half of the countries overshooting U.S. impulse responses. The counterfactual shows that regulatory arbitrage via the U.S. securitized sector may enhance the cross-country reallocation of capital from housing markets towards equity markets.
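As background on the VAR machinery underlying a GVAR, the sketch below computes impulse responses for a plain reduced-form VAR(1) in numpy; the country-specific blocks, trade-weighted foreign variables and the counterfactual design used in the paper are not reproduced, and the coefficient matrix is an arbitrary stable example.

```python
import numpy as np

def var1_irf(A, shock, horizons=20):
    """Impulse responses of a reduced-form VAR(1):  y_t = A @ y_{t-1} + e_t.

    A : (n, n) coefficient matrix (a stand-in for one country block of a GVAR)
    shock : (n,) impulse vector, e.g. a one-unit shock to the first variable
    Returns an (horizons + 1, n) array of responses.
    """
    n = len(shock)
    responses = np.empty((horizons + 1, n))
    responses[0] = shock
    for h in range(1, horizons + 1):
        responses[h] = A @ responses[h - 1]     # propagate the shock forward
    return responses

# Toy example: two variables, say house prices and equity prices.
A = np.array([[0.8, 0.1],
              [0.2, 0.7]])                      # eigenvalues 0.9 and 0.6: stable
irf = var1_irf(A, shock=np.array([1.0, 0.0]))
print(irf[:5])                                  # the shock dies out over time
```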
Abstract:
This research examines the role of retailing in urban regeneration nationally and locally in the UK. The research uses data at a national level and local shopping centre case studies to examine the employment and property impacts of retailing. Focusing on schemes built during the first part of the 1990s, it shows that retail can bring employment and economic benefits to town centres, but that the impact on the inner city should not be overlooked. Valuable lessons can be learned from the experience of centres built during this period of recession, and new challenges such as eCommerce now face these centres and others being developed today. The research examines the multiplier effect of retail regeneration schemes nationally using National Accounts data, and the local property and employment impacts of shopping centre schemes in the case study towns of Aberdeen, Bristol, Norwich, Bromley, Worcester and Leicester. The report includes valuable statistical sources and a full literature and policy review, and will be of interest to those involved in property investment, regeneration and planning. The research was funded by the Office of Science & Technology and the Harold Samuel Educational Trust.
Abstract:
This independent research was commissioned by the British Property Federation. The report examines the local and national economic impact of two major mixed use schemes in terms of tax revenue, household income, business rates, council tax and job creation. A regeneration balance sheet for each scheme is presented in the context of government policy and other related research. The report provides a comprehensive review of government policy and the role of retail and other land uses in regeneration. Highlighting the importance of national and local multiplier effects with detailed statistics drawn from a variety of sources, this fully illustrated colour research report builds up a detailed picture of the economic impact of the mixed use regeneration schemes in the local economies of Birmingham and Portsmouth. The report will be of interest to property people, planners and all involved in regeneration and local economies.
Abstract:
Upscaling ecological information to larger scales in space and downscaling remote sensing observations or model simulations to finer scales remain grand challenges in Earth system science. Downscaling often involves inferring subgrid information from coarse-scale data, and such ill-posed problems are classically addressed using regularization. Here, we apply two-dimensional Tikhonov Regularization (2DTR) to simulate subgrid surface patterns for ecological applications. Specifically, we test the ability of 2DTR to simulate the spatial statistics of high-resolution (4 m) remote sensing observations of the normalized difference vegetation index (NDVI) in a tundra landscape. We find that the 2DTR approach as applied here can capture the major mode of spatial variability of the high-resolution information, but not multiple modes of spatial variability, and that the Lagrange multiplier (γ) used to impose the condition of smoothness across space is related to the range of the experimental semivariogram. We used observed and 2DTR-simulated maps of NDVI to estimate landscape-level leaf area index (LAI) and gross primary productivity (GPP). NDVI maps simulated using a γ value that approximates the range of observed NDVI result in a landscape-level GPP estimate that differs by ca. 2% from the estimate obtained using observed NDVI. Following findings that GPP per unit LAI is lower near vegetation patch edges, we simulated vegetation patch edges using multiple approaches and found that simulated GPP declined by up to 12% as a result. 2DTR can generate random landscapes rapidly and can be applied to disaggregate ecological information and to compare spatial observations against simulated landscapes.
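A minimal numerical sketch of the Tikhonov idea, recovering a smooth fine-grid field from coarse observations by penalising spatial roughness, is given below; the block-averaging observation operator, the grid sizes and the gamma value are illustrative assumptions rather than the configuration used in the study.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def laplacian_2d(ny, nx):
    """Sparse 2-D Laplacian used as the Tikhonov smoothness operator."""
    def lap_1d(n):
        return sp.diags([1, -2, 1], [-1, 0, 1], shape=(n, n))
    return sp.kron(sp.identity(ny), lap_1d(nx)) + sp.kron(lap_1d(ny), sp.identity(nx))

def tikhonov_downscale(coarse, factor, gamma):
    """Solve  min_x ||A x - b||^2 + gamma ||L x||^2  for a fine-grid field x,
    where A block-averages the fine grid onto the coarse grid (illustrative choice).
    """
    cy, cx = coarse.shape
    ny, nx = cy * factor, cx * factor
    rows, cols, vals = [], [], []
    for i in range(cy):
        for j in range(cx):
            for di in range(factor):
                for dj in range(factor):
                    rows.append(i * cx + j)
                    cols.append((i * factor + di) * nx + (j * factor + dj))
                    vals.append(1.0 / factor**2)
    A = sp.csr_matrix((vals, (rows, cols)), shape=(cy * cx, ny * nx))
    L = laplacian_2d(ny, nx)
    x = spsolve((A.T @ A + gamma * L.T @ L).tocsc(), A.T @ coarse.ravel())
    return x.reshape(ny, nx)

# Toy usage: downscale an 8x8 "NDVI" field to 32x32 with a smoothness penalty.
coarse = np.random.rand(8, 8)
fine = tikhonov_downscale(coarse, factor=4, gamma=0.1)
```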
Abstract:
In this series of papers, we study issues related to the synchronization of two coupled chaotic discrete systems arising from secured communication. The first part deals with uniform dissipativeness with respect to parameter variation via the Liapunov direct method. We obtain uniform estimates of the global attractor for a general discrete nonautonomous system, which yields a uniform invariance principle in the autonomous case. The Liapunov function is allowed to have positive derivative along solutions of the system inside a bounded set, and this reduces substantially the difficulty of constructing a Liapunov function for a given system. In particular, we develop an approach that incorporates the classical Lagrange multiplier into the Liapunov function method to naturally extend those Liapunov functions from continuous dynamical systems to their discretizations, so that the corresponding uniform dissipativeness results are valid when the step size of the discretization is small. Applications to the discretized Lorenz system and the discretization of a time-periodic chaotic system are given to illustrate the general results. We also show how to obtain uniform estimates of attractors for parametrized linear stable systems with nonlinear perturbation.
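As a concrete companion to the discretized-Lorenz application, the sketch below iterates an explicit Euler discretization of the Lorenz system and evaluates a classical quadratic Lyapunov-type function along the orbit; the step size and the particular choice of V are illustrative, not the estimates derived in the paper.

```python
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0   # classical Lorenz parameters

def lorenz_step(state, h=0.005):
    """One explicit-Euler step of the Lorenz system (step size is illustrative)."""
    x, y, z = state
    dx = SIGMA * (y - x)
    dy = x * (RHO - z) - y
    dz = x * y - BETA * z
    return state + h * np.array([dx, dy, dz])

def V(state):
    """Classical quadratic function used to bound the Lorenz attractor."""
    x, y, z = state
    return x**2 + y**2 + (z - SIGMA - RHO)**2

state = np.array([1.0, 1.0, 1.0])
values = []
for _ in range(20000):
    state = lorenz_step(state)
    values.append(V(state))

# Along the discretized orbit V stays bounded, consistent with a uniform
# estimate of the global attractor surviving small-step discretization.
print(max(values))
```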
Abstract:
This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to next focus our theoretical and computational efforts.
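A toy version of the basic structure described above, for equality constraints only and with scipy's nonlinear conjugate gradient as the inner solver, is sketched below; the relative-error stopping rule, the double regularized proximal kernels and the CUTE problems themselves are outside the scope of this illustration.

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, grad_f, c, jac_c, x0, n_outer=20, rho=10.0, tol=1e-8):
    """Toy augmented Lagrangian method for  min f(x)  s.t.  c(x) = 0.

    Outer loop: first-order multiplier updates.
    Inner loop: unconstrained minimisation with scipy's conjugate gradient.
    """
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(len(np.atleast_1d(c(x))))

    for _ in range(n_outer):
        def L_aug(xk):
            cx = np.atleast_1d(c(xk))
            return f(xk) + lam @ cx + 0.5 * rho * cx @ cx

        def grad_L_aug(xk):
            cx = np.atleast_1d(c(xk))
            J = np.atleast_2d(jac_c(xk))
            return grad_f(xk) + J.T @ (lam + rho * cx)

        x = minimize(L_aug, x, jac=grad_L_aug, method='CG').x   # inner CG solve
        cx = np.atleast_1d(c(x))
        if np.linalg.norm(cx) < tol:
            break
        lam = lam + rho * cx          # first-order multiplier update
    return x, lam

# Toy convex problem:  min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0  (solution (0.5, 0.5)).
f = lambda x: x @ x
grad_f = lambda x: 2 * x
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac_c = lambda x: np.array([[1.0, 1.0]])
x_star, lam_star = augmented_lagrangian(f, grad_f, c, jac_c, x0=[0.0, 0.0])
print(x_star)   # approximately [0.5, 0.5]
```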
Abstract:
This paper studies a special class of vector smooth-transition autoregressive (VSTAR) models that contains common nonlinear features (CNFs), for which we propose a triangular representation and develop a procedure for testing CNFs in a VSTAR model. We first test a unit root against a stable STAR process for each individual time series, and then examine whether CNFs exist in the system by a Lagrange Multiplier (LM) test if the unit root is rejected in the first step. The LM test has a standard chi-squared asymptotic distribution. The critical values of our unit root tests and the small-sample properties of the F form of our LM test are studied by Monte Carlo simulations. We illustrate how to test and model CNFs using monthly consumption and income growth data for the United States (1985:1 to 2011:11).
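For readers unfamiliar with the mechanics, a generic Lagrange Multiplier test of this score-based type can be computed from an auxiliary regression as LM = n·R², compared against a chi-squared critical value. The sketch below implements that generic recipe on simulated data; it is not the specific CNF test statistic derived in the paper.

```python
import numpy as np
from scipy import stats

def lm_test(residuals, aux_regressors):
    """Generic LM test: regress restricted-model residuals on the auxiliary
    regressors that capture the departure being tested; LM = n * R^2 is
    asymptotically chi-squared with df = number of auxiliary regressors.
    """
    n = len(residuals)
    X = np.column_stack([np.ones(n), aux_regressors])
    beta, *_ = np.linalg.lstsq(X, residuals, rcond=None)
    fitted = X @ beta
    ss_res = np.sum((residuals - fitted) ** 2)
    ss_tot = np.sum((residuals - residuals.mean()) ** 2)
    lm = n * (1.0 - ss_res / ss_tot)
    df = aux_regressors.shape[1]
    p_value = stats.chi2.sf(lm, df)
    return lm, p_value

# Toy usage with simulated data.
rng = np.random.default_rng(0)
resid = rng.standard_normal(300)
aux = rng.standard_normal((300, 2))     # e.g. nonlinear terms under the alternative
print(lm_test(resid, aux))
```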
Abstract:
Granting economic development incentives (or “EDIs”) has become commonplace throughout the United States, but confidence in the efficiency of these mechanisms is generally unwarranted. Both the politicians granting, and the companies seeking, EDIs have incentives to overestimate the EDIs’ benefits. For politicians, ribbon-cutting ceremonies can be a highly desirable opportunity to please political allies and financiers while at the same time demonstrating to the population that they are successful in promoting economic growth – even when the population would be better off otherwise. In turn, businesses are naturally prone to seek governmental aid. This explains in part why EDIs often “fail” (i.e. do not pay off). To increase transparency and mitigate the risk of EDI failure, local and state governments across the country have created a number of accountability mechanisms. The common trait of these accountability mechanisms is that they apply controls to some of the sub-risks that underlie the risk of EDI failure. These sub-risks include the companies receiving EDIs not generating the expected number of jobs, not investing enough in their local facilities, not attracting the expected additional business investment to the jurisdiction, etc. The problem with such schemes is that they tackle the problem of EDI failure very loosely. They are too narrow and leave multiplier effects uncontrolled. I propose a novel contractual framework for implementing accountability mechanisms. My suggestion is to establish controls on the risk of EDI failure itself, leaving its underlying sub-risks uncontrolled. I call this mechanism “Contingent EDIs”, because the EDIs are made contingent on the government achieving a preset target that benchmarks the risk of EDI failure. If the target is met, the EDIs kick in ex post; if not, they never kick in.
Abstract:
The employees' participation in the cultural transition process that occurred at Companhia Siderúrgica de Tubarão (CST) was helpful in identifying the set of measures that came to be managed in order to be established in the course of the assumption of control. The company, in order to acquire characteristics of its own, had to change prior values, behaviors and identities through strategies shared by all the organization's members, thus creating a new culture. CST was seen as a company controlling and coordinating a group of people. It relied on vertical hierarchical levels, departments and authority relations. It was never considered that the company could have its own personality, like each person who worked there. Before going through this cultural transition process, which profoundly changed its values, the company had a dominating culture with a centralized administration. In this way, it influenced the conduct of all its members in a controlling environment. When the company realized the need to invest in cultural change programs, in order to eliminate the pathologies and dysfunctions that were damaging its business structure, its productivity and the quality of its results, it concentrated its energies in one direction, engaging all its leaders as implementing and multiplier elements to orient and facilitate the achievement of its goals. In order to better understand the influences of the changes brought by globalization and privatization, some theoretical and operational concepts of culture and identity are developed in this study, mainly in the first chapters. Over the course of the research, several aspects of the complex anthropological and sociological concept of culture are addressed, such as affectivity, cognitive processes, valuation processes and everything that relates to, or gives intelligibility to, this concept and the phenomenon in which it consists.