935 results for Minimum norm
Abstract:
Adapting to blurred images makes in-focus images look too sharp, and vice versa (Webster et al., 2002 Nature Neuroscience 5 839 - 840). We asked how such blur adaptation is related to contrast adaptation. Georgeson (1985 Spatial Vision 1 103 - 112) found that grating contrast adaptation followed a subtractive rule: the perceived (matched) contrast of a grating was fairly well predicted by subtracting some fraction k (~0.3) of the adapting contrast from the test contrast. Here we apply that rule to the responses of a set of spatial filters at different scales and orientations. Blur is encoded by the pattern of filter response magnitudes over scale. We tested two versions - the 'norm model' and the 'fatigue model' - against blur-matching data obtained after adaptation to sharpened, in-focus or blurred images. In the fatigue model, filter responses are simply reduced by exposure to the adapter. In the norm model, (a) the visual system is pre-adapted to a focused world and (b) discrepancy between observed and expected responses to the experimental adapter leads to additional reduction (or enhancement) of filter responses during experimental adaptation. The two models are closely related, but only the norm model gave a satisfactory account of results across the four experiments analysed, with one free parameter k. This model implies that the visual system is pre-adapted to focused images, that adapting to in-focus or blank images produces no change in adaptation, and that adapting to sharpened or blurred images changes the state of adaptation, leading to changes in perceived blur or sharpness.
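The subtractive rule can be sketched in a few lines (a minimal reading of the rule as stated above; the function name, the clipping at zero, and the example contrast values are illustrative assumptions, not from the paper):

```python
def matched_contrast(test, adapt, k=0.3):
    """Subtractive contrast-adaptation rule (after Georgeson, 1985):
    perceived contrast of a test grating after adapting at contrast `adapt`.
    k (~0.3) is the single free parameter; contrasts are assumed in [0, 1],
    and the result is clipped at zero (an assumption of this sketch)."""
    return max(test - k * adapt, 0.0)

# Adapting at the same contrast as the test reduces perceived contrast:
print(matched_contrast(0.5, 0.5))   # 0.5 - 0.3 * 0.5
# A weak test after a strong adapter can be driven to the floor:
print(matched_contrast(0.1, 0.8))
```

In the norm model described above, the same subtraction would be driven not by the adapter's responses directly but by their discrepancy from the responses expected for a focused world.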
Abstract:
Conventional feedforward neural networks have used the sum-of-squares cost function for training. A new cost function is presented here with a description length interpretation based on Rissanen's Minimum Description Length principle. It is a heuristic that has a rough interpretation as the number of data points fit by the model. Rather than seeking optimal descriptions, the cost function forms minimum descriptions in a naive way for computational convenience. The cost function is called the Naive Description Length cost function. Finding minimum description models will be shown to be closely related to the identification of clusters in the data. As a consequence, the minimum of this cost function approximates the most probable mode of the data, whereas the sum-of-squares cost function approximates the mean. The new cost function is shown to provide information about the structure of the data. This is done by inspecting the dependence of the error on the amount of regularisation. This structure provides a method of selecting regularisation parameters as an alternative or supplement to Bayesian methods. The new cost function is tested on a number of multi-valued problems, such as a simple inverse kinematics problem. It is also tested on a number of classification and regression problems. The mode-seeking property of this cost function is shown to improve prediction in time series problems. Description length principles are used in a similar fashion to derive a regulariser to control network complexity.
Abstract:
The sectoral and occupational structure of Britain and West Germany has changed steadily over the last fifty years, from a manual, manufacturing-based one to a non-manual, service-sector-based one. There has been a trend towards more managerial and fewer menial occupations. Compared with West Germany, Britain employs a higher proportion of its population in the service sector than in manufacturing, except in retailing, where West Germany employs twice as many people as Britain. Retailing is a stable sector of the economy in terms of employment, but the requirements of the workforce have changed in line with changes in the industry in both countries. School leavers in the two countries, faced with the same options (FE, training schemes or employment), have opted for these options in different proportions: young Germans are staying longer in education before embarking on training, and young Britons are now less likely to go straight into employment than ten years ago. Training is becoming more accepted as the normal route into employment, with government policy leading the way but public opinion still slow to respond. This study investigates how vocational training has adapted to the changing requirements of industry, often determined by technological advancements. In some areas, e.g. manufacturing industry, the changes have been radical; in others, such as retailing, they have not, but skill requirements, not necessarily influenced by technology, have changed. Social-communicative skills, frequently not even considered skills and therefore not included in training, are coming to the forefront. Vocational training has adapted differently in the two countries: in West Germany on the basis of an established, over-defined system, and in Britain on the basis of an outdated, ill-defined and almost non-existent system. In retailing, German school leavers opt for two- or three-year apprenticeships, whereas British school leavers are offered employment with or without formalised training.
The publicly held view of the occupation of sales assistant is one of low-level skill, low intellectual demands and a job anyone can do. The traditional skills - product knowledge, selling and social-communicative skills - have steadily been eroded. In the last five years, retailers have recognised that a return to customer service, utilising the traditional skills, would be needed from their staff if they were to remain competitive. This requires training. The German retail training system responded by adapting its training regulations in a long consultative process, whereas the British experimented with YTS, a formalised nationwide training scheme being a new departure. The thesis evaluates the changes in these regulations. The case studies in four retail outlets demonstrate that it is indeed product knowledge, selling and social-communicative skills which are fundamental to being a successful and contented sales assistant in either country. When these skills are recognised and taught well and systematically, the foundations for career development in retailing are laid in a labour market which is continually looking for better qualified workers. Training, when planned and conducted professionally, is appreciated by staff and customers and is of benefit to the company. In British retailing, not enough systematic training to recognisable standards is carried out, whereas in West Germany a training system is in place on which to build and is better prepared to show innovative potential. In Britain, the reputation of the individual company plays a greater role, which does not ensure national provision of good training in retailing.
Abstract:
Using a well-established analytic model of the nonlinear signal-to-noise ratio, we show that there are very simple, fibre-independent amplifier gains which minimize the total energy requirement of amplified systems. Power savings of over 50% are shown to be possible by choosing the appropriate amplifier gain and output power.
Abstract:
A vision system is applied to full-field displacement and deformation measurements in solid mechanics. A speckle-like pattern is first formed on the surface under investigation. To determine the displacement field of one speckle image with respect to a reference speckle image, sub-images, referred to as Zones Of Interest (ZOI), are considered. The field is obtained by matching a ZOI in the reference image with the corresponding ZOI in the moved image. Two image processing techniques are used to implement the matching procedure: the cross-correlation function and the minimum mean square error (MMSE) of the ZOI intensity distribution. The two algorithms are compared and the influence of the ZOI size on the accuracy of measurements is studied.
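The MMSE matching step can be sketched as an exhaustive search over integer pixel displacements (a simplified sketch only; the actual method also uses cross-correlation and sub-pixel refinement, and all names, the search radius and the toy images are illustrative assumptions):

```python
def ssd(zoi, img, r0, c0):
    """Sum of squared intensity differences between a ZOI and the image
    patch at (r0, c0); proportional to the MSE, so the argmin is the same."""
    return sum((zoi[i][j] - img[r0 + i][c0 + j]) ** 2
               for i in range(len(zoi)) for j in range(len(zoi[0])))

def match_zoi(ref, moved, r, c, size, search=2):
    """Displacement (dy, dx) of the size-by-size ZOI at (r, c) in `ref`
    that minimizes the squared error inside `moved`, searched over a
    +/- `search` pixel window."""
    zoi = [row[c:c + size] for row in ref[r:r + size]]
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            r0, c0 = r + dy, c + dx
            # Skip candidate positions that fall outside the moved image.
            if 0 <= r0 and r0 + size <= len(moved) and 0 <= c0 and c0 + size <= len(moved[0]):
                err = ssd(zoi, moved, r0, c0)
                if best is None or err < best[0]:
                    best = (err, dy, dx)
    return best[1], best[2]

# Toy example: a 6x6 image with distinct intensities, shifted by (1, 1).
ref = [[i * 6 + j for j in range(6)] for i in range(6)]
moved = [[0] * 6 for _ in range(6)]
for i in range(1, 6):
    for j in range(1, 6):
        moved[i][j] = ref[i - 1][j - 1]
print(match_zoi(ref, moved, 1, 1, 3))  # recovers the (1, 1) shift
```

A real implementation would repeat this for a grid of ZOIs to build the full displacement field, which is where the ZOI size studied in the abstract trades spatial resolution against matching robustness.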
Abstract:
This paper presents a novel approach to the computation of primitive geometrical structures where no prior knowledge about the visual scene is available and a high level of noise is expected. We base our work on the grouping principles of proximity and similarity, applied to points and preliminary models. The former is realized using Minimum Spanning Trees (MST), to which we apply stable alignment and goodness-of-fit criteria. For the latter, we use spectral clustering of preliminary models. The algorithm generalizes to various model fitting settings without tuning of run parameters. Experiments demonstrate a significant improvement in the localization accuracy of models in plane, homography and motion segmentation examples. Unlike most algorithms in the field, its efficiency does not depend on fine-tuning of run parameters.
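Grouping points by proximity via an MST can be sketched as follows (a generic MST-plus-edge-cut sketch, not the paper's stable alignment and goodness-of-fit criteria; the fixed cut threshold is an illustrative parameter the paper explicitly avoids):

```python
import math

def mst_edges(points):
    """Edges of the Euclidean minimum spanning tree via Prim's algorithm
    (O(n^3) here for clarity; fine for a sketch)."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        u, v = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: dist(*e))
        in_tree.add(v)
        edges.append((dist(u, v), u, v))
    return edges

def proximity_clusters(points, cut):
    """Keep MST edges shorter than `cut`; the resulting connected
    components are the proximity groups (union-find over kept edges)."""
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for d, u, v in mst_edges(points):
        if d < cut:
            parent[find(u)] = find(v)
    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Two well-separated point groups fall out as two clusters.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
print(proximity_clusters(pts, cut=3.0))
```

The paper's contribution is precisely to replace the fixed `cut` with stability-based criteria on the tree, so that no run parameter needs tuning.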
Abstract:
The purpose of this article is to investigate in what ways multi-level actor cooperation advances national and local implementation processes of human rights norms in weak-state contexts. Examining the cases of women’s rights in Bosnia and Herzegovina and children’s rights in Bangladesh, we comparatively point to some advantages and disadvantages that cooperative relations between international organisations, national governments and local NGOs can entail. Whereas these multi-level actor constellations (MACs) usually initiate norm implementation processes reliably and compensate for governmental deficits, they are not always sustainable in the long run. If international organisations withdraw support from temporary missions or policy projects, local NGOs are unable to perpetuate implementation activities where state capacities have not been strengthened by MACs. Our aim is to highlight the functions of local agency within multi-level cooperation and to critically raise sustainability issues in human rights implementation, so as to supplement norm research in International Relations.
Abstract:
Analysis of risk measures associated with price series data movements and their prediction is of strategic importance in the financial markets, as well as to policy makers, in particular for short- and long-term planning in setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument for measuring risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers and satisfying the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow distributions with fat tails and the error terms may not possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
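The difference between the three norms is easiest to see in the simplest regression model, a constant fit: minimizing the Lp norm of the residuals gives the mean for L2, the median for L1, and the midrange for L∞ (a hedged illustration of the norms' behaviour; the paper fits full linear models to oil-price data):

```python
def l2_fit(xs):
    """Least-squares constant fit: minimizes sum((x - c)^2) -> the mean."""
    return sum(xs) / len(xs)

def l1_fit(xs):
    """Least-absolute-deviations fit: minimizes sum(|x - c|) -> the median."""
    s, n = sorted(xs), len(xs)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def linf_fit(xs):
    """Chebyshev fit: minimizes max(|x - c|) -> the midrange."""
    return (min(xs) + max(xs)) / 2

data = [1, 2, 3, 4, 100]  # one fat-tail observation
print(l2_fit(data), l1_fit(data), linf_fit(data))
```

The single outlier drags the L2 estimate far from the bulk of the data and dominates the L∞ estimate entirely, while the L1 estimate stays with the majority, which is why L1-type criteria are attractive when tails are fat.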
Abstract:
This article shows the social importance of the subsistence minimum in Georgia and presents the methodology for its calculation. We propose ways of improving the calculation of the subsistence minimum in Georgia and of extending it to other developing countries. The weights of food and non-food expenditures in the subsistence minimum baskets are essential in these calculations. The daily consumption value of the minimum food basket has also been calculated. The shares of food and other expenditures in average consumer spending are considered in dynamics. Our methodology of subsistence minimum calculation is applied to the case of Georgia; however, it can be used for similar purposes with data from other developing countries where social stability has been achieved and social inequalities are to be addressed. ACM Computing Classification System (1998): H.5.3, J.1, J.4, G.3.
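One common way to scale a minimum food basket into a full subsistence minimum is to divide its cost by the food-expenditure weight (an Engel-coefficient-style sketch; the article's exact methodology, the function names and all numbers here are assumptions for illustration):

```python
def subsistence_minimum(daily_food_cost, food_share, days=30):
    """Monthly subsistence minimum from the daily cost of the minimum
    food basket and the weight of food in total consumer expenditures.
    Non-food spending is imputed implicitly via the food share."""
    if not 0 < food_share <= 1:
        raise ValueError("food_share must lie in (0, 1]")
    monthly_food = daily_food_cost * days
    return monthly_food / food_share

# Hypothetical numbers: a 5.0-per-day food basket, food taking 70%
# of low-income household spending.
print(round(subsistence_minimum(5.0, 0.7), 2))
```

Under this scheme the choice of `food_share` is exactly the weighting the abstract flags as essential: a lower food share (more non-food spending) raises the computed minimum.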
Abstract:
2010 Mathematics Subject Classification: 94A17, 62B10, 62F03.
Abstract:
2000 Mathematics Subject Classification: 53C42, 53C55.
Abstract:
2000 Mathematics Subject Classification: 47A10, 47A12, 47A30, 47B10, 47B20, 47B37, 47B47, 47D50.
Abstract:
In this paper, we first present a simple but effective L1-norm-based two-dimensional principal component analysis (2DPCA). The traditional L2-norm-based least squares criterion is sensitive to outliers, while the newly proposed L1-norm 2DPCA is robust. Experimental results demonstrate its advantages. © 2006 IEEE.
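The flavour of an L1-norm criterion can be conveyed with the fixed-point sign iteration commonly used for L1-PCA of vectors (a generic sketch of the vector case, not the paper's 2D matrix-based variant; all names and the toy data are illustrative):

```python
def pca_l1_component(X, iters=50):
    """First L1-norm principal direction of centred samples X (lists of
    floats), via the iteration w <- normalize(sum_i sign(w . x_i) * x_i).
    Maximizes sum_i |w . x_i|, the L1 counterpart of the variance."""
    d = len(X[0])
    w = [1.0] * d  # arbitrary non-zero start; only the direction matters
    for _ in range(iters):
        acc = [0.0] * d
        for x in X:
            s = 1.0 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1.0
            for j in range(d):
                acc[j] += s * x[j]
        n = sum(a * a for a in acc) ** 0.5
        if n == 0:
            break
        w = [a / n for a in acc]
    return w

# Toy data whose variation lies almost entirely along the first axis.
X = [[5, 0.1], [-5, -0.1], [4, 0.0], [-4, 0.0], [0.0, 1.0]]
print(pca_l1_component(X))  # close to the x-axis direction
```

Because each sample enters the update only through its sign, a single outlier shifts the direction by a bounded amount, whereas in the L2 criterion its influence grows with its squared magnitude; the 2DPCA variant applies the same idea to image matrices row-wise.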
Abstract:
Tensor analysis plays an important role in modern image and vision computing problems. Most existing tensor analysis approaches are based on the Frobenius norm, which makes them sensitive to outliers. In this paper, we propose an L1-norm-based tensor analysis (TPCA-L1), which is robust to outliers. Experimental results on face and other datasets demonstrate the advantages of the proposed approach. © 2006 IEEE.
Abstract:
To the extent that minimum-wage regulation is effective in fighting excessive earnings handicaps of those in the lower tail of the earnings distribution, it may have the side effect of worsening their employment prospects. A demand-and-supply interpretation of data on the relative employment rate and earnings position of the least educated in the EU27 suggests that the resulting dilemma may be particularly relevant for minimum-wage policies in post-socialist countries.