997 results for imperfect quality


Relevance:

100.00%

Publisher:

Abstract:

MSC 2010: 26A33, 33E12, 33C60, 44A20

Relevance:

40.00%

Publisher:

Abstract:

The accuracy of a map depends on the reference dataset used in its construction. Classification analyses used in thematic mapping can, for example, be sensitive to a range of sampling and data quality concerns. With particular focus on the latter, the effects of reference data quality on land cover classifications from airborne thematic mapper data are explored. Variations in sampling intensity and effort are highlighted in a dataset that is widely used in mapping and modelling studies; these may need to be accounted for in analyses. The quality of the labelling in the reference dataset was also a key variable influencing mapping accuracy. Accuracy varied with the amount and nature of mislabelled training cases, and the effects differed between classifiers. The largest impacts on accuracy occurred when mislabelling involved confusion between similar classes. Accuracy was also typically negatively related to the magnitude of mislabelling, and the support vector machine (SVM), which has been claimed to be relatively insensitive to training data error, was the most sensitive of the classifiers investigated: overall classification accuracy declined by 8% (significant at the 95% level of confidence) with a training set containing 20% mislabelled cases.
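As a rough sketch of the kind of sensitivity experiment described above (synthetic data and illustrative parameters only, not the paper's airborne thematic mapper dataset or protocol), one can flip a fraction of training labels and watch SVM accuracy fall:

    # Illustrative only: inject label noise into an SVM training set and
    # measure the drop in overall classification accuracy.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                               n_classes=4, n_clusters_per_class=1, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    rng = np.random.default_rng(0)
    for noise_rate in (0.0, 0.1, 0.2):
        y_noisy = y_tr.copy()
        flip = rng.random(len(y_noisy)) < noise_rate
        # Relabel the selected training cases with a different (random) class.
        y_noisy[flip] = (y_noisy[flip] + rng.integers(1, 4, flip.sum())) % 4
        acc = accuracy_score(y_te, SVC(kernel="rbf").fit(X_tr, y_noisy).predict(X_te))
        print(f"mislabelled fraction {noise_rate:.0%}: overall accuracy {acc:.3f}")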

Relevance:

30.00%

Publisher:

Abstract:

A survey of the quality of salt-cured fish in Kanyakumari District, Madras State, was carried out during 1963 and 1964 to obtain the basic information needed to formulate quality standards for these products, which are gaining importance in the export trade. In all, 155 trade samples of sun-dried, dry-salted, wet-cured and pit-cured fishery products were examined for their chemical quality and organoleptic characteristics. 26.5% of the sun-dried products, 25% of the wet-cured fish, 55.21% of the dry-salted products and none of the pit-cured samples were found to be of good quality. The sun-dried products were generally found to contain a heavy admixture of sand and to be inadequately dried. The chief defects in the salt-cured fish products were the use of spoiled fish, imperfect cleaning and washing, use of impure salt, inadequate salting, curing and drying, and unhygienic conditions at all stages. Quality standards must be formulated for each variety of salt-cured fish product, and adequate measures taken to rectify the defects and enforce the quality standards.

Relevance:

30.00%

Publisher:

Abstract:

A quality survey was conducted at fish curing yards on the northwest and southern coasts of Sri Lanka. A total of 40 samples of different varieties of fish were collected from markets and jaadi curing yards, and all were evaluated for quality and for fungal and insect infestation. Samples were analysed for proximate composition and for chemical, microbiological and sensory quality. Thirty percent of the analysed fish samples were found to be unfit for consumption. Samples collected from Negombo were found to be infested with maggots. Only 42% of the samples had a dry matter content above 50%. All samples showed a protein content above 20%; the highest, 27.92%, was recorded in hurulla. Over 90% of the samples had TVN values within acceptable quality limits (below 40). The total bacterial count (TBC) for 33% of the samples was in the range 10⁴-10⁵/g, while 48% were in the range 10⁷-10⁸/g owing to contamination with maggots and fungi. The survey showed that jaadi has a high protein content, but defects in the curing process, such as imperfect cleaning and inadequate salting, resulted in low chemical and microbiological quality of the product.

Relevance:

30.00%

Publisher:

Abstract:

Results of chemical, bacteriological and organoleptic quality studies of cured fishery products of commerce collected from six major fish curing centres on the west coast of India are presented. 77.12% of the samples had moisture above 35%, 97.18% showed salt content below 25% and all samples had acid insoluble ash above 1.5%. 42.32% gave standard plate counts above 10,000 and 45.77% were contaminated with 'Red' halophiles. The major defects in curing were imperfect cleaning, inadequate salting and unhygienic conditions of processing.

Relevance:

30.00%

Publisher:

Abstract:

The influence of imperfect boundaries on the mode quality factor is investigated for equilateral-triangle-resonator (ETR) semiconductor microlasers by the finite-difference time-domain (FDTD) technique and the Padé approximation with Baker's algorithm. For a 2-D ETR with a refractive index of 3.2 and a side length of 5 μm, the confined modes can still have a quality factor of about 1000 when small triangles with a side length of 1 μm are cut from the vertices of the ETR. For a deformed 5 μm ETR with rounded vertices and curved sides, the simulated mode quality factors are comparable to the measured results.
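For context, the mode quality factor extracted in such FDTD/Padé analyses is conventionally related to the resonance frequency and linewidth of a mode peak (a standard relation, not a formula quoted from this paper):

    Q = \frac{f_0}{\Delta f_{\mathrm{FWHM}}} = \frac{\omega_r}{2\,\lvert \omega_i \rvert},

where f_0 and \Delta f_{\mathrm{FWHM}} are the resonance frequency and full width at half maximum of the peak, and \omega = \omega_r + i\omega_i is the corresponding complex mode frequency.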

Relevance:

30.00%

Publisher:

Abstract:

Distributed massive multiple-input multiple-output (MIMO) combines the array gain of coherent MIMO processing with the proximity gains of distributed antenna setups. In this paper, we analyze how transceiver hardware impairments affect the downlink with maximum ratio transmission. We derive closed-form spectral efficiency expressions and study their asymptotic behavior as the number of antennas increases. We prove a scaling law on the hardware quality, which reveals that massive MIMO is resilient to additive distortions, while multiplicative phase noise is a limiting factor. It is also better to have a separate oscillator at each antenna than one per base station (BS).
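For reference, a generic downlink signal model of the kind used in this hardware-impairment literature (an illustrative formulation, not an equation taken from the paper) combines a multiplicative phase-drift term with additive distortion noise:

    y(t) = e^{j\phi(t)}\,\mathbf{h}^{H}\mathbf{w}\,s(t) + \eta(t) + n(t),

where \phi(t) is oscillator phase noise, \eta(t) is additive transceiver distortion whose variance grows with the signal power, and n(t) is thermal noise. The scaling law in the abstract concerns how the effect of \eta(t) washes out as the number of antennas grows, while the phase drift \phi(t) does not average away in the same manner.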

Relevance:

30.00%

Publisher:

Abstract:

Sectoral policies make explicit and implicit assumptions about the behaviour and capabilities of the agents (such as dynamic responses to market signals, demand-led assistance, collaborative efforts, participation in financing), which we consider to be rather unrealistic. Because of this lack of realism, policies that aim to be neutral often turn out to be highly exclusive. They fail to give sufficient importance to the special features of the sector (with its high climatic, biological and commercial risks and its slow adaptation) or to the fact that those who take decisions in agriculture are now mostly in an inferior position because of their incomes below the poverty line, their inadequate training, their traditions based on centuries of living in precarious conditions, and their geographical location in marginal areas, far from infrastructure and with only a minimum of services and sources of information. These people have only scanty and imperfect access to the markets which, according to the prevailing model, should govern decisions and the (re)distribution of the factors of production. In our opinion, this explains the patchy and lower-than-expected growth registered by the sector after the reforms to promote the liberalization of markets and external openness in the region. In view of the results of the application of the new model, it may be wondered whether Latin America can afford a form of development which excludes over half of its agricultural producers; what the alternatives are; and what costs and benefits each of them offers in terms of production and monetary, social, spatial and other aspects. The article outlines the changes in policies and their results at the aggregate level, summarizes the arguments usually put forward to explain agricultural performance in the region, and proposes a second set of explanations based on a description of the agents and the responses that may be expected from them, contrasting the latter with the supposedly neutral nature of the policies.

Relevance:

30.00%

Publisher:

Abstract:

Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are optimized jointly by minimizing the average cost per item. Adjustments are based on where the measurement of a quality characteristic of interest falls among three decision zones. Numerical examples illustrate the proposal. (c) 2012 Elsevier B.V. All rights reserved.
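A minimal sketch of a three-zone decision rule of the kind described above (zone limits and actions are hypothetical, not the paper's cost-optimized parameters):

    # Illustrative three-zone rule for an on-line measured quality characteristic.
    def control_action(x, target=0.0, adjust_limit=1.0, repair_limit=2.0):
        """Classify a measurement x of the quality characteristic into one of three zones."""
        deviation = abs(x - target)
        if deviation <= adjust_limit:
            return "no intervention"        # central zone: leave the process alone
        if deviation <= repair_limit:
            return "adjust process"         # intermediate zone: on-line adjustment
        return "corrective maintenance"     # outer zone: stop and repair the equipment

    for x in (0.3, 1.4, 2.7):
        print(x, "->", control_action(x))

In the integrated model, the zone limits themselves would be chosen to minimize the average cost per item across both the adjustment and the maintenance dimensions.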

Relevance:

30.00%

Publisher:

Abstract:

This paper argues (following Gould, 2003) that the disappearance of the .400 hitter in major league baseball is due, not to a decrease in ability at the top end of the talent distribution, but to better methods of screening out players at the low end of the distribution. The argument is related to the economic literature on minimum quality standards in markets with imperfect information.

Relevance:

30.00%

Publisher:

Abstract:

Traditional Text-To-Speech (TTS) systems have been developed using specially designed, non-expressive scripted recordings. In order to develop a new generation of expressive TTS systems in the Simple4All project, real recordings from the media should be used for training new voices with a whole new range of speaking styles. However, to process this more spontaneous material, the new systems must be able to deal with imperfect data (multi-speaker recordings, background and foreground music, and noise), filtering out low-quality audio segments and creating mono-speaker clusters. In this paper we compare several architectures for combining speaker diarization with music and noise detection, which improve the precision and overall quality of the segmentation.
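A highly simplified sketch of the selection step described above (class names, fields and thresholds are hypothetical, not the Simple4All implementation): keep only the segments that the diarizer assigns to a speaker and that the music/noise detector scores as clean speech.

    # Illustrative filtering of diarized segments by a music/noise score.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        start: float          # seconds
        end: float
        speaker: str          # diarization cluster label, e.g. "spk1"
        noise_score: float    # 0.0 = clean speech, 1.0 = music/noise

    def select_training_segments(segments, max_noise=0.2, min_dur=1.0):
        """Reduce multi-speaker media recordings to usable mono-speaker audio."""
        return [s for s in segments
                if s.noise_score <= max_noise and (s.end - s.start) >= min_dur]

    segs = [Segment(0.0, 3.2, "spk1", 0.05),
            Segment(3.2, 4.0, "spk2", 0.70),   # background music: rejected
            Segment(4.0, 9.5, "spk1", 0.10)]
    print([s.speaker for s in select_training_segments(segs)])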

Relevance:

30.00%

Publisher:

Abstract:

Most recently developed context-aware software applications make unrealistic assumptions about the quality of the available context information, which can lead to inappropriate actions by the application and frustration on the part of the user. In this paper, we explore the problem of imperfect context information and some of its causes, and propose a novel approach for modelling incomplete and inaccurate information. Additionally, we present a discussion of our experiences in developing a context-aware communication application, highlighting design issues that are pertinent when developing applications that rely on imperfect context information.
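One minimal way to make such imperfection explicit (an illustrative sketch; field names and thresholds are hypothetical, not the paper's model) is to attach a confidence and a timestamp to every context value and let applications decide whether it is reliable enough to act on:

    # Illustrative model of an imperfect context value.
    import time
    from dataclasses import dataclass

    @dataclass
    class ContextValue:
        value: object
        confidence: float   # 0.0 (unknown/inaccurate) .. 1.0 (certain)
        timestamp: float    # when the value was sensed

        def usable(self, min_confidence=0.7, max_age_s=300.0):
            """Treat stale or low-confidence context as unknown rather than acting on it."""
            fresh = (time.time() - self.timestamp) <= max_age_s
            return fresh and self.confidence >= min_confidence

    location = ContextValue("meeting room 2.17", confidence=0.55, timestamp=time.time())
    print(location.usable())   # False: too uncertain to, say, route a call on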

Relevance:

30.00%

Publisher:

Abstract:

Ideas about the evolution of imperfect mimicry are reviewed. Their relevance to the colour patterns of hoverflies (Diptera, Syrphidae) is discussed in detail. Most if not all of the hoverflies labelled as mimetic actually are mimics. The apparently poor nature of their resemblance does not prevent them from obtaining at least some protection from suitably experienced birds. Mimicry is a dominant theme of this very large family of Diptera, with at least a quarter of all species in Europe being mimetic. Hoverfly mimics fall into three major groups according to their models, involving bumblebees, honeybees and social wasps. There are striking differences in the general levels of mimetic fidelity and relative abundances of the three groups, with accurate mimicry, low abundance and polymorphism characterizing the bumblebee mimics: more than half of all the species of bumblebee mimics are polymorphic. Mimics of social wasps tend to be poor mimics and to have high relative abundance, and polymorphism is completely absent among them. Bumblebee models fall into a small number of Muellerian mimicry rings which are very different between the Palaearctic and Nearctic regions. Social wasps and associated models form one large Muellerian complex. Together with honeybees, these complexes probably form real clusters of forms as perceived by many birds. All three groups of syrphid mimics contain both good and poor mimics; some mimics are remarkably accurate, and have close morphological and behavioural resemblance. At least some apparently 'poor' mimetic resemblances may be much closer in birds' perception than we imagine, and more work needs to be done on this. Bumblebees are the least noxious and wasps the most noxious of the three main model groups. The basis of noxiousness is different, with bumblebees being classified as non-food, whereas honeybees and wasps are nasty-tasting and (rarely) stinging. The distribution of mimicry is exactly what would be expected from this ordering, with polymorphic and accurate forms being a key feature of mimics of the least noxious models, while highly noxious models have poor-quality mimicry. Even if the high abundance of many syrphid mimics relative to their models is a recent artefact of man-made environmental change, this does not preclude these species from being mimics. It seems unlikely that bird predation actually controls the populations of adult syrphids. Being rare relative to a model may have promoted or accelerated the evolution of perfect mimicry: theoretically this might account for the pattern of rare good mimics and abundant poor ones, but the idea is intrinsically unlikely. Many mimics seem to have hour-to-hour abundances related to those of their models, presumably as a result of behavioural convergence. We need to know much more about the psychology of birds as predators. There are at least four processes that need elucidating: (a) learning about the noxiousness of models; (b) the erasing of that learning through contact with mimics (extinction, or learned forgetting); (c) forgetting; (d) deliberate risk-taking and the physiological states that promote it. Johnston's (2002) model of the stabilization of imperfect mimicry by kin selection is unlikely to account for the colour patterns of hoverflies. Sherratt's (2002) model of the influence of multiple models potentially accounts for all the patterns of hoverfly mimicry, and is the most promising avenue for testing.

Relevance:

30.00%

Publisher:

Abstract:

Understanding how imperfect information affects firms' investment decisions helps answer important questions in economics, such as how we may better measure economic uncertainty; how firms' forecasts affect their decision-making when their beliefs are not backed by economic fundamentals; and how important the business cycle impacts of changes in firms' productivity uncertainty are in an environment of incomplete information. This dissertation provides a synthetic answer to all these questions, both empirically and theoretically. The first chapter provides empirical evidence to demonstrate that survey-based forecast dispersion identifies a distinctive type of second moment shock different from the canonical volatility shocks to productivity, i.e. uncertainty shocks. Such forecast disagreement disturbances can affect the distribution of firm-level beliefs regardless of whether or not belief changes are backed by changes in economic fundamentals. At the aggregate level, innovations that increase the dispersion of firms' forecasts lead to persistent declines in aggregate investment and output, which are followed by a slow recovery. By contrast, a larger dispersion of future firm-specific productivity innovations, the standard way to measure economic uncertainty, delivers the "wait and see" effect, such that aggregate investment experiences a sharp decline, followed by a quick rebound, and then overshoots. At the firm level, the data show that more productive firms increase investment when the dispersion of future productivity rises, whereas investment drops when firms disagree more about their future business conditions. These findings challenge the view that the dispersion of firms' heterogeneous beliefs captures the concept of economic uncertainty as defined by a model of uncertainty shocks. The second chapter presents a general equilibrium model of heterogeneous firms subject to real productivity uncertainty shocks and informational disagreement shocks. As firms cannot perfectly disentangle aggregate from idiosyncratic productivity because of imperfect information, information quality drives the wedge between the unobserved productivity fundamentals and firms' beliefs about how productive they are. The distribution of firms' beliefs is no longer perfectly aligned with the distribution of firm-level productivity across firms. This model not only explains why, at the macro and micro level, disagreement shocks differ from uncertainty shocks, as documented in Chapter 1, but also helps resolve a key challenge faced by the standard framework for studying economic uncertainty: a trade-off between sizable business cycle effects due to changes in uncertainty and the right amount of pro-cyclicality of firm-level investment rate dispersion, as measured by its correlation with the output cycle.
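As a textbook illustration of the signal-extraction problem underlying the second chapter (a standard Gaussian example, not the dissertation's full model), suppose firm i observes only the sum of aggregate and idiosyncratic productivity:

    z_{it} = a_t + e_{it}, \qquad a_t \sim N(0,\sigma_a^2), \quad e_{it} \sim N(0,\sigma_e^2),
    \qquad \mathbb{E}[a_t \mid z_{it}] = \frac{\sigma_a^2}{\sigma_a^2 + \sigma_e^2}\, z_{it}.

The weight \sigma_a^2/(\sigma_a^2+\sigma_e^2) plays the role of information quality: the noisier the idiosyncratic component, the further the distribution of firms' beliefs can drift from the distribution of productivity fundamentals, which is the wedge described above.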