969 results for Quantization Errors
Abstract:
In this paper we propose a method for computing JPEG quantization matrices that achieve a given mean square error (MSE) or PSNR. We then employ the method to compute definition scripts for the JPEG standard's progressive operation mode using a quantization-based approach, so a trial-and-error procedure is no longer necessary to obtain a desired PSNR and/or definition script, reducing cost. First, we establish a relationship between a Laplacian source and its uniform quantization error, and we apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed under the JPEG standard subject to a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Second, we study the JPEG progressive operation mode from a quantization-based approach. A relationship is found between the measured image quality at a given stage of the coding process and a quantization matrix, so the definition script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The PSNR estimation error is usually smaller than 1 dB, and it decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image quality improvement during decoding.
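The relationship this abstract builds on, between a Laplacian-distributed DCT coefficient and its uniform quantization error, can be sketched numerically. The scale, step, and 8-bit peak value below are illustrative assumptions, not figures from the paper:

```python
import math
import random

def laplacian_quant_mse(scale, step, n=200_000, seed=1):
    """Monte-Carlo estimate of the MSE incurred by uniformly quantizing
    a zero-mean Laplacian source (a stand-in for AC DCT coefficients)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Laplacian(0, scale) sample: random sign times an exponential
        x = rng.expovariate(1.0 / scale) * rng.choice((-1.0, 1.0))
        q = step * round(x / step)      # mid-tread uniform quantizer
        total += (x - q) ** 2
    return total / n

def psnr(mse, peak=255.0):
    """PSNR in dB, assuming 8-bit pixel values (peak = 255)."""
    return 10.0 * math.log10(peak * peak / mse)
```

For a step that is small relative to the Laplacian scale, the estimate approaches the classical high-resolution value step²/12, which is the kind of model-based link between quantizer step and MSE (hence PSNR) that removes the need for trial and error.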
Abstract:
This article reports on a lossless data hiding scheme for digital images in which the data hiding capacity is determined either by the minimum acceptable subjective quality or by the demanded capacity. In the proposed method, data are hidden within the image prediction errors; the best-known prediction algorithms, such as the median edge detector (MED), gradient adjacent prediction (GAP) and Jiang prediction, are tested for this purpose. First, the histogram of the image prediction errors is computed; then, based on the required capacity or the desired image quality, the prediction error values with frequencies larger than this capacity are shifted. The empty space created by such a shift is used for embedding the data. Experimental results show the distinct superiority of the image prediction error histogram over the conventional image histogram itself, owing to the much narrower spectrum of the former. We have also devised an adaptive method for hiding data, in which subjective quality is traded for data hiding capacity: the positive and negative error values are chosen such that the sum of their frequencies in the histogram is just above the given capacity or above a certain quality.
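The shift-and-embed step described above can be sketched on a plain list of integer prediction errors: shifting every error above the most frequent (peak) value by one opens an empty bin, and one bit is hidden per peak-valued error. This is a minimal reversible sketch of histogram shifting, not the paper's full MED/GAP pipeline:

```python
def embed(errors, bits):
    """Hide bits in a list of integer prediction errors by histogram shifting."""
    peak = max(set(errors), key=errors.count)  # peak bin (ties broken arbitrarily)
    out, bit_iter = [], iter(bits)
    for e in errors:
        if e > peak:
            out.append(e + 1)                  # shift to empty the bin peak + 1
        elif e == peak:
            b = next(bit_iter, None)
            out.append(e if b in (None, 0) else e + 1)  # embed one bit
        else:
            out.append(e)
    return out, peak

def extract(marked, peak):
    """Recover the hidden bits and restore the original errors exactly."""
    bits, restored = [], []
    for e in marked:
        if e == peak:
            bits.append(0); restored.append(peak)
        elif e == peak + 1:
            bits.append(1); restored.append(peak)
        elif e > peak + 1:
            restored.append(e - 1)             # undo the shift
        else:
            restored.append(e)
    return bits, restored
```

Because prediction errors cluster sharply around zero, the peak bin is tall, which is exactly why the prediction error histogram yields more capacity than the raw image histogram.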
Abstract:
In a system where tens of thousands of words are made up of a limited number of phonemes, many words are bound to sound alike. This similarity among words in the lexicon, characterized by phonological neighbourhood density (PhND), has been shown to affect the speed and accuracy of word comprehension and production. Whereas there is a consensus about the interfering nature of neighbourhood effects in comprehension, the language production literature offers a more contradictory picture, with mainly facilitatory but also interfering effects reported on word production. Here we report both types of effect in the same study. Mixed-model multiple regression analyses were conducted on PhND effects on errors produced in a naming task by a group of 21 participants with aphasia. These participants produced more formal errors (an interfering effect) for words in dense phonological neighbourhoods, but produced fewer nonwords and semantic errors (a facilitatory effect) with increasing density. To investigate the nature of these opposite effects of PhND, we further analysed a subset of formal errors and nonword errors, distinguishing errors that differ from the target by a single phoneme (corresponding to the definition of phonological neighbours) from those differing by two or more phonemes. This analysis confirmed that only formal errors that were phonological neighbours of the target increased in dense neighbourhoods, while all other errors decreased. Based on additional observations favouring a lexical origin of these formal errors (they exceeded the probability of producing a real-word error by chance, were of higher frequency, and preserved the grammatical category of the targets), we suggest that the interfering effect of PhND is due to competition between lexical neighbours and target words in dense neighbourhoods.
Abstract:
The purpose of this bachelor's thesis was to chart scientific research articles in order to present factors contributing to medication errors made by nurses in a hospital setting, and to introduce methods of preventing medication errors. Additionally, international and Finnish research was combined, and the findings were reflected against the Finnish health care system. A literature review of 23 scientific articles was conducted. Data were searched systematically in the CINAHL, MEDIC and MEDLINE databases, as well as manually. The literature was analysed and the findings combined using inductive content analysis. The findings revealed that both organisational and individual factors contributed to medication errors. High workload, communication breakdowns, an unsuitable working environment, distractions and interruptions, and similar medication products were identified as organisational factors. Individual factors included nurses' failure to follow protocol, inadequate knowledge of medications, and personal qualities of the nurse. Developing and improving the physical environment, error reporting, and medication management protocols were emphasised as methods of preventing medication errors. Investing in staff competence and well-being was also identified as a prevention method. The number of Finnish articles was small, so the applicability of the findings to Finland is difficult to assess; however, the findings seem to fit the Finnish health care system relatively well. Further research is needed to identify the factors that contribute to medication errors in Finland, a necessity for developing prevention methods that fit the Finnish health care system.
Abstract:
This study is an empirical analysis of the impact of direct tax revenue budgeting errors on fiscal deficits. Using panel data from 26 Swiss cantons between 1980 and 2002, we estimate a single-equation model of the fiscal balance, as well as a simultaneous-equation model of revenue and expenditure. We use new data on budgeted and actual tax revenue to show that underestimating tax revenue significantly reduces fiscal deficits. Furthermore, we show that this effect is channeled through decreased expenditure. The effects of over- and underestimation turn out to be symmetric.
Abstract:
The marketplace of the twenty-first century will demand that manufacturing assume a crucial role in a new competitive field. Two potential resources in the area of manufacturing are advanced manufacturing technology (AMT) and empowered employees. Surveys in Finland have shown the need to invest in new AMT in the Finnish sheet metal industry in the 1990s. In this drive the focus has been on hard technology, with less attention paid to the utilization of human resources. In many manufacturing companies an appreciable portion of the attainable profit is wasted due to poor quality of planning and workmanship. This thesis examines the distribution of production errors in the production flow of sheet metal part based constructions. The objective of the thesis is to analyze the origins of production errors in the production flow of such constructions. Employee empowerment is also investigated in theory, and its role in reducing the overall number of production errors is discussed. This study is most relevant to the sheet metal part fabricating industry, which produces sheet metal part based constructions for the electronics and telecommunication industries. The study concentrates on the manufacturing function of a company and is based on a field study carried out in five Finnish case factories. In each case factory studied, the work phases most prone to production errors were identified. It can be assumed that most production errors arise in manually operated work phases and in mass production work phases. However, no common pattern of production error distribution in the production flow could be found in the collected data. The most important finding was nevertheless that most of the production errors in each case factory belong to the category of human-activity-based errors. This result indicates that most of the problems in the production flow are related to employees or work organization.
Development activities must therefore focus on developing employee skills or the work organization. Employee empowerment provides the right tools and methods to achieve this.
Abstract:
The questions studied in this thesis are centered around the moment operators of a quantum observable, the latter being represented by a normalized positive operator measure. The moment operators of an observable are physically relevant, in the sense that these operators give, as averages, the moments of the outcome statistics for the measurement of the observable. The main questions under consideration in this work arise from the fact that, unlike a projection valued observable of the von Neumann formulation, a general positive operator measure cannot be characterized by its first moment operator. The possibility of characterizing certain observables by also involving higher moment operators is investigated and utilized in three different cases: a characterization of projection valued measures among all the observables is given, a quantization scheme for unbounded classical variables using translation covariant phase space operator measures is presented, and, finally, a mathematically rigorous description is obtained for the measurements of rotated quadratures and phase space observables via the high amplitude limit in the balanced homodyne and eight-port homodyne detectors, respectively. In addition, the structure of the covariant phase space operator measures, which is essential for the above quantization, is analyzed in detail in the context of a (not necessarily unimodular) locally compact group as the phase space.
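In the notation usually employed for such observables (a sketch of the standard definitions, not taken verbatim from the thesis), the k-th moment operator of a normalized positive operator measure E on the real line is the weak integral

```latex
E^{(k)} := \int_{\mathbb{R}} x^{k}\, \mathrm{d}E(x),
\qquad
\langle \psi \mid E^{(k)} \psi \rangle
  = \int_{\mathbb{R}} x^{k}\, \mathrm{d}\langle \psi \mid E(x)\, \psi \rangle ,
```

so that the expectation of E^{(k)} in a state gives the k-th moment of the measurement outcome statistics. For a projection valued measure one has E^{(k)} = (E^{(1)})^{k} on a suitable domain, which is precisely the property that fails for a general positive operator measure and motivates characterizing observables through higher moment operators.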
Abstract:
Clinical simulation as a training and learning method allows people to experience the representation of a real event with the aim of acquiring knowledge, skills and attitudes. Filming the staged scenario is a useful tool for reviewing the decisions taken and the actions performed, in order to highlight strengths, weaknesses and areas for improvement. The article describes a study carried out by a group of second-year nursing students that evaluates whether being filmed during clinical simulation has an influence, i.e. whether it leads to more errors or not.
Abstract:
Inborn errors of metabolism (IEM) are due to mutations in genes coding for enzymes of intermediary metabolism and are classified into three broad categories: 1) intoxication; 2) energy defect; and 3) defects in the synthesis or catabolism of complex molecules. Therapeutic progress over the last 20 years has improved the prognosis of children with IEM. These children grow up and must be managed in adolescence and adulthood by specialized teams. Adult metabolic medicine is a relatively new discipline with currently only limited knowledge of the adult population. Pediatric guidelines are extrapolated to the management of adults while taking into account the different stages of adult life (social independence, pregnancy, aging and potential long-term complications).
Abstract:
In humans, action errors and perceptual novelty elicit activity in a shared frontostriatal brain network, allowing them to adapt their ongoing behavior to such unexpected action outcomes. Healthy and pathologic aging reduces the integrity of white matter pathways that connect individual hubs of such networks and can impair the associated cognitive functions. Here, we investigated whether structural disconnection within this network because of small-vessel disease impairs the neural processes that subserve motor slowing after errors and novelty (post-error slowing, PES; post-novel slowing, PNS). Participants with intact frontostriatal circuitry showed increased right-lateralized beta-band (12-24 Hz) synchrony between frontocentral and frontolateral electrode sites in the electroencephalogram after errors and novelty, indexing increased neural communication. Importantly, this synchrony correlated with PES and PNS across participants. Furthermore, such synchrony was reduced in participants with frontostriatal white matter damage, in line with their reduced PES and PNS. The results demonstrate that behavioral change after errors and novelty results from coordinated neural activity across a frontostriatal brain network and that such cognitive control is impaired by reduced white matter integrity.
Abstract:
Abstract Objective: To evaluate three-dimensional translational setup errors and residual errors in image-guided radiosurgery, comparing frameless and frame-based techniques, using an anthropomorphic phantom. Materials and Methods: We initially used specific phantoms for the calibration and quality control of the image-guided system. For the hidden target test, we used an Alderson Radiation Therapy (ART)-210 anthropomorphic head phantom, into which we inserted four 5mm metal balls to simulate target treatment volumes. Computed tomography images were the taken with the head phantom properly positioned for frameless and frame-based radiosurgery. Results: For the frameless technique, the mean error magnitude was 0.22 ± 0.04 mm for setup errors and 0.14 ± 0.02 mm for residual errors, the combined uncertainty being 0.28 mm and 0.16 mm, respectively. For the frame-based technique, the mean error magnitude was 0.73 ± 0.14 mm for setup errors and 0.31 ± 0.04 mm for residual errors, the combined uncertainty being 1.15 mm and 0.63 mm, respectively. Conclusion: The mean values, standard deviations, and combined uncertainties showed no evidence of a significant differences between the two techniques when the head phantom ART-210 was used.
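The "mean error magnitude" reported in such hidden-target tests is the Euclidean length of each three-dimensional translational shift, averaged over measurements. A minimal sketch (the shift values below are invented for illustration, not data from the study):

```python
import math
import statistics

def error_magnitudes(displacements):
    """Magnitude of each 3-D translational setup error (dx, dy, dz), in mm."""
    return [math.sqrt(dx * dx + dy * dy + dz * dz) for dx, dy, dz in displacements]

# hypothetical shifts measured for four hidden targets (mm)
shifts = [(0.10, 0.15, 0.10), (0.20, 0.05, 0.10),
          (0.15, 0.10, 0.05), (0.10, 0.10, 0.15)]
mags = error_magnitudes(shifts)
mean, sd = statistics.mean(mags), statistics.stdev(mags)
```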
Abstract:
Different methods of determining total fat (TF) and fatty acids (FA), including trans fatty acids (TFA), in diverse foodstuffs were evaluated, encompassing gravimetric methods and gas chromatography with flame ionization detection (GC/FID) in accordance with a modified AOAC 996.06 method. Concentrations of TF and FA obtained through these different procedures diverged (p < 0.05), and TFA concentrations varied by more than 20% from the reference values. The modified AOAC 996.06 method satisfied both accuracy and precision requirements, was fast, and employed small amounts of low-toxicity solvents. The results therefore show that this methodology is viable for adoption in Brazil for nutritional labeling purposes.
Abstract:
Analytical curves are normally obtained from discrete data by least squares regression. Least squares regression of data involving significant error in both x and y values should not be implemented by ordinary least squares (OLS). In this work, the use of orthogonal distance regression (ODR) is discussed as an alternative approach that takes the error in the x variable into account. Four examples are presented to illustrate the deviation between the results of the two regression methods. The examples show that in some situations ODR coefficients must replace those of OLS, while in other situations the difference is not significant.
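For the common straight-line calibration case, orthogonal regression with equal error variances in x and y (total least squares) has a closed form that is easy to compare against OLS. A minimal sketch in pure Python; in practice one would use a library such as scipy.odr, and the helper name and data here are illustrative assumptions:

```python
import math

def fit_lines(x, y):
    """Return (slope, intercept) pairs for OLS and for orthogonal
    (total least squares) regression of y on x. Requires sxy != 0."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    # OLS minimizes vertical residuals only
    b_ols = sxy / sxx
    # TLS minimizes perpendicular distances (closed form for a line)
    b_tls = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return (b_ols, ybar - b_ols * xbar), (b_tls, ybar - b_tls * xbar)
```

When x is error-free the two fits agree; as the error in x grows, the OLS slope is attenuated toward zero while the orthogonal fit is not, which is the deviation the four examples illustrate.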