8 results for false
in Dalarna University College Electronic Archive
Abstract:
When using digital halftone proofing systems, a closer print match can be achieved than was earlier possible with analogue proofing systems. However, these systems' ability to produce an accurate print match can also lead to bad print matches, since several print-related parameters can be adjusted manually in the system by the user. Therefore, more advanced knowledge of graphic arts technology is required of the system's user. The prepress company Colorcraft AB wishes to verify that their colour proofs always have the right quality. This project was started with the purpose of finding a quality control method for Colorcraft's digital halftone proofing system (Kodak Approval XP4). Using software that supports spectral measurement, combined with a spectrophotometer and a control bar, a quality control system was assembled. This system detects variations that lie outside the proofing system's natural deviation. The prerequisite for this quality control system is that the tolerances are defined with the proofing system's natural deviations taken into account. Otherwise the quality control system will generate unnecessary false alarms and therefore not be reliable.
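The tolerance check described above can be sketched in a few lines. This is an illustrative simplification, not the thesis's actual implementation: the patch names, Lab values, and the 3.0 ΔE tolerance below are invented for the example (the thesis derives its tolerances from the system's own measured deviation), and the colour difference shown is the simple CIE76 formula.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def check_proof(measured, reference, tolerance):
    """Return the control-bar patches whose deviation exceeds the tolerance."""
    return [
        (name, delta_e_76(m, r))
        for (name, m), (_, r) in zip(measured, reference)
        if delta_e_76(m, r) > tolerance
    ]

# Hypothetical spectrophotometer readings for two control-bar patches.
reference = [("cyan", (55.0, -37.0, -50.0)), ("magenta", (48.0, 74.0, -3.0))]
measured  = [("cyan", (54.2, -36.1, -49.5)), ("magenta", (44.0, 70.0, -1.0))]

failures = check_proof(measured, reference, tolerance=3.0)
# Only the magenta patch (dE = 6.0) lies outside the tolerance.
```

Setting the tolerance tighter than the system's natural deviation would, as the abstract notes, flag normal run-to-run variation as a failure.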
Abstract:
The aim of this study is to find similarities and differences between male and female fiction-writing. The data has been collected from pupils at an upper secondary school in Central Sweden. They were given an extract from a novel by Bernard MacLaverty and from that they were supposed to continue the story. Theories that have evolved during the last centuries claim that the language use of men and women differs in many aspects. Women, it is said, use a more emotional language than men do, while men use more expletives than women. Likewise, the language is said to differ in the use of adverbs, verbs and adjectives. It has also been claimed that men and women have different topic developments and that women write longer sentences than men. The results of the current study show that most of these claims are false, or at least not true in this specific context. In most cases there is little or no difference between the male writing and the female writing. There are also cases where the opposite is true – for example, the female participants write shorter sentences than the male participants. A general conclusion of the study is that the writing of the two groups is quite similar – or at least that similarities are present to a larger extent than differences.
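One of the measures the study compares, mean sentence length per group, can be illustrated as follows. This is a minimal sketch, not the study's actual method, and the two sample texts are invented:

```python
import re
from statistics import mean

def mean_sentence_length(text):
    """Average number of words per sentence, splitting on . ! ?"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return mean(len(s.split()) for s in sentences)

# Invented samples standing in for the pupils' story continuations.
male_sample = "He ran. The rain fell hard on the road."
female_sample = "She left quickly. It was cold."

m = mean_sentence_length(male_sample)    # (2 + 7) / 2 = 4.5 words
f = mean_sentence_length(female_sample)  # (3 + 3) / 2 = 3.0 words
```

On real data, a difference in such averages would still need a significance test before supporting either claim about sentence length.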
Abstract:
In this thesis, a new algorithm has been proposed to segment the foreground of the fingerprint from the image under consideration. The algorithm uses three features: mean, variance and coherence. Based on these features, a rule system is built to help the algorithm segment the image efficiently. In addition, the proposed algorithm combines split-and-merge with a modified Otsu method. Enhancement techniques such as Gaussian filtering and histogram equalization are applied to improve the quality of the image. Finally, a post-processing technique is implemented to counter undesirable effects in the segmented image. The fingerprint recognition system is one of the oldest recognition systems in biometrics. Everyone has a unique and unchangeable fingerprint. Based on this uniqueness and distinctness, fingerprint identification has been used in many applications for a long period. A fingerprint image is a pattern consisting of two regions, foreground and background. The foreground contains all the important information needed in automatic fingerprint recognition systems, whereas the background is a noisy region that contributes to the extraction of false minutiae. To avoid the extraction of false minutiae, several steps should be followed, such as preprocessing and enhancement. One of these steps is the transformation of the fingerprint image from a gray-scale image to a black-and-white image. This transformation is called segmentation or binarization. The aim of fingerprint segmentation is to separate the foreground from the background. Due to the nature of fingerprint images, segmentation is an important and challenging task. The proposed algorithm is applied to the FVC2000 database. Manual examination by human experts shows that the proposed algorithm provides efficient segmentation results. These improved results are demonstrated in diverse experiments.
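The block-wise feature idea behind such segmentation can be sketched briefly. The rule below (foreground where grey-level variance is high) is a hypothetical simplification using only one of the three features; the thesis additionally uses mean and coherence, a rule system, split-and-merge, and a modified Otsu method:

```python
from statistics import pvariance

def segment_blocks(image, block_size, var_threshold):
    """Label each block foreground (1) or background (0) by its variance.

    `image` is a list of rows of grey values. Ridge regions of a
    fingerprint alternate between dark and light, so their variance is
    high, while the flat background has low variance.
    """
    h, w = len(image), len(image[0])
    labels = []
    for by in range(0, h, block_size):
        row = []
        for bx in range(0, w, block_size):
            pixels = [image[y][x]
                      for y in range(by, min(by + block_size, h))
                      for x in range(bx, min(bx + block_size, w))]
            row.append(1 if pvariance(pixels) > var_threshold else 0)
        labels.append(row)
    return labels

# Toy 4x4 image: left half alternating ridge/valley, right half flat.
img = [[0, 255, 200, 200],
       [255, 0, 200, 200],
       [0, 255, 200, 200],
       [255, 0, 200, 200]]
labels = segment_blocks(img, block_size=2, var_threshold=1000)
# → [[1, 0], [1, 0]]: the left blocks are classified as foreground.
```

A real pipeline would follow this with the post-processing step the abstract mentions, e.g. removing isolated foreground blocks.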
Abstract:
This work aims at combining the postulates of Chaos theory with the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while qualitative tools based on Chaos theory provide further observations and analysis of predictability in cases where the measurements give negative answers. Phase space reconstruction is achieved by time delay embedding, resulting in multiple embedded vectors. The cognitive approach suggested is inspired by the ability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbor, Differential Entropy or another specific method; rather, this work considers all embedding dimensions and separations, regarded as different ways in which different chartists look at a time series, based on their expectations. Prior to the prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation Neural Network is trained to predict the last element of each vector, with all previous elements of the vector used as features.
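Time delay embedding, as used above, is simple to state concretely: given an embedding dimension m and a separation (delay) tau, each embedded vector is (x[t], x[t+tau], ..., x[t+(m-1)*tau]). A minimal sketch, with the feature/target split the abstract describes (variable names are illustrative):

```python
def delay_embed(series, m, tau):
    """Return all length-m delay vectors of `series` with delay `tau`."""
    n = len(series) - (m - 1) * tau
    return [tuple(series[t + k * tau] for k in range(m)) for t in range(n)]

x = [1, 2, 3, 4, 5, 6]
vecs = delay_embed(x, m=3, tau=2)
# → [(1, 3, 5), (2, 4, 6)]

# Per the abstract: the last element of each vector is the prediction
# target; all previous elements are the features.
features = [v[:-1] for v in vecs]  # [(1, 3), (2, 4)]
targets = [v[-1] for v in vecs]    # [5, 6]
```

The work's distinctive choice is to enumerate many (m, tau) pairs rather than fixing one via False Nearest Neighbor or Differential Entropy.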
Abstract:
Defects are often present in rolled products, such as wire rod, and the market's demand for wire rod without any defects has increased. In the final wire rod products, defects originating from steel making, casting, pre-rolling of billets and the wire rod rolling itself can appear. In this work, artificial V-shaped longitudinal surface cracks have been analysed experimentally and by means of FEM. The results indicate that the experiments and the FEM calculations show the same tendency, except in two cases where instability due to fairly “round” false round bars disturbed the experiment. FE studies in combination with practical experiments are necessary in order to understand the behaviour of the material flow in the groove and to explain whether the crack will open up as a V-shape or close as an I-shape.
Abstract:
Negotiating experience in the court: How do judges assess witness credibility, and how do they proceed to reach sustainable conclusions in a criminal court? This article is based on discussions in four focus groups with lay judges in Swedish district courts. In criminal court trials, a version of an event is generally reinforced if it is confirmed by witnesses. However, if their narratives are too similar, none of them is found trustworthy. The focus group participants agreed that if witnesses were suspected of having discussed their individual experiences of an event and accommodated them into a common story, their testimonies were not considered credible. While testimonies should ideally be untainted by other people’s impressions and opinions, other rules govern the truth of the court. The lay judges appreciated their deliberations, including negotiations on impressions and memories of the trial, and they sometimes adjusted their perceptions in the light of information provided by other members of the court. However, if the lay judges are viewed as witnesses of what takes place in the trial, this gives rise to a paradox: While witness negotiations on experiences are regarded as a means to construct a false or biased story, the same kind of interaction between the judges is considered necessary to establish a consensual truth of what actually happened.
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
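One classical option from the families surveyed above is the Brown-Forsythe test: transform each observation to its absolute deviation from the group median, then apply a one-way ANOVA F statistic to the transformed data. The sketch below is a hedged illustration of that single method, not the survey's recommendation, and the genotype-group data is invented:

```python
from statistics import mean, median

def brown_forsythe_F(groups):
    """One-way ANOVA F statistic on |x - group median| transformed data."""
    z = [[abs(x - median(g)) for x in g] for g in groups]
    all_z = [v for grp in z for v in grp]
    grand = mean(all_z)
    k, n = len(z), len(all_z)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in z)
    ss_within = sum((v - mean(g)) ** 2 for g in z for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy phenotype values: genotype group 1 varies little, group 2 a lot,
# as would be expected near a vQTL.
g1 = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8]
g2 = [10.0, 13.0, 7.0, 10.5, 14.0, 6.0]
F = brown_forsythe_F([g1, g2])  # a large F suggests a variance difference
```

Using the median (rather than the mean) makes the test more robust to skewed traits, which matters given the abstract's warning that ignoring mean-variance relationships inflates false positive rates.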