925 results for "Graphic consistency"
Abstract:
The ultimate goal of profiling is to identify the major behavioral and personality characteristics to narrow the suspect pool. Inferences about offender characteristics can be accomplished deductively, based on the analysis of discrete offender behaviors established within a particular case. They can also be accomplished inductively, involving prediction based on abstract offender averages from group data (these methods, and the logic on which they are based, are detailed extensively in Chapters 2 and 4). As discussed, these two approaches are by no means equal.
Abstract:
Criminal profiling is an investigative tool used around the world to infer the personality and behavioural characteristics of an offender based on their crime. Case linkage, the process of determining discrete connections between crimes of the same offender, is a practice that falls under the general banner of criminal profiling and has been widely criticized. Two theories, behavioural consistency and the homology assumption, are examined and their impact on profiling in general and on case linkage specifically is discussed...
Abstract:
Given that there is increasing recognition of the effect that submillimetre changes in collimator position can have on radiotherapy beam dosimetry, this study aimed to evaluate the potential variability in small field collimation that may exist between otherwise matched linacs. Field sizes and field output factors were measured using radiochromic film and an electron diode, for jaw- and MLC-collimated fields produced by eight dosimetrically matched Varian iX linacs (Varian Medical Systems, Palo Alto, USA). This study used nominal sizes from 0.6×0.6 to 10×10 cm² for jaw-collimated fields, and from 1×1 to 10×10 cm² for MLC-collimated fields, delivered from a zero (head up, beam directed vertically downward) gantry angle. Differences between the field sizes measured for the eight linacs exceeded the uncertainty of the film measurements and the repositioning uncertainty of the jaws and MLCs on one linac. The dimensions of fields defined by MLC leaves were more consistent between linacs, while also differing more from their nominal values than fields defined by orthogonal jaws. The field output factors measured for the different linacs generally increased with increasing measured field size for the nominal 0.6×0.6 and 1×1 cm² fields, and became consistent between linacs for nominal field sizes of 2×2 cm² and larger. The inclusion in radiotherapy treatment planning system beam data of small field output factors acquired in fields collimated by jaws (rather than the more-reproducible MLCs), associated with either the nominal or the measured field sizes, should be viewed with caution. The size and reproducibility of the fields (especially the small fields) used to acquire treatment planning data should be investigated thoroughly as part of the linac or planning system commissioning process. Further investigation of these issues, using different linac models, collimation systems and beam orientations, is recommended.
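Background note, not taken from the study itself: a field output factor is conventionally the ratio of the detector reading (or dose) in the field of interest to that in a reference field, typically the 10×10 cm² field, measured under otherwise identical conditions, i.e. OF(A) = M(A) / M(A_ref). The specific small-field correction factors applied in the study are not restated here.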
Abstract:
Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to \emph{catastrophic fusion} in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
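Illustrative sketch only: the abstract does not give the kernel, hyperparameters or exact decision rule. The code below assumes a squared-exponential kernel with fixed, invented hyperparameters, 1-D range data, and a per-point comparison of the log marginal likelihood before and after adding the candidate point, which is one simple reading of "the model statistically improves as a result of its fusion".

    import numpy as np

    def rbf_kernel(x1, x2, length_scale=0.2, variance=1.0):
        # Squared-exponential covariance between two 1-D input vectors.
        d = x1[:, None] - x2[None, :]
        return variance * np.exp(-0.5 * (d / length_scale) ** 2)

    def log_marginal_likelihood(x, y, noise=1e-2):
        # Standard GP evidence: -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi).
        K = rbf_kernel(x, x) + noise * np.eye(len(x))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return (-0.5 * y @ alpha
                - np.sum(np.log(np.diag(L)))
                - 0.5 * len(x) * np.log(2.0 * np.pi))

    def is_consistent(x, y, x_new, y_new, noise=1e-2):
        # Deem the candidate consistent if the per-point model evidence does not
        # degrade when the point is fused into the data set.
        before = log_marginal_likelihood(x, y, noise) / len(x)
        after = log_marginal_likelihood(np.append(x, x_new), np.append(y, y_new), noise) / (len(x) + 1)
        return after >= before

    # Toy usage: ranges from one sensor, plus one candidate point from a second modality.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 20)
    y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.05, x.size)
    print(is_consistent(x, y, 0.5, 0.0))   # candidate agrees with the underlying surface
    print(is_consistent(x, y, 0.5, 5.0))   # spatially proximal but wildly inconsistent range

Because the comparison is relative (model evidence with versus without the point), no absolute spatial distance threshold is needed, which is the design point the abstract emphasises.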
Abstract:
We report the results of two studies of aspects of the consistency of truncated nonlinear integral equation based theories of freezing: (i) We show that the self-consistent solutions to these nonlinear equations are unfortunately sensitive to the level of truncation. For the hard sphere system, if the Wertheim–Thiele representation of the pair direct correlation function is used, the inclusion of part but not all of the triplet direct correlation function contribution, as has been common, worsens the predictions considerably. We also show that the convergence of the solutions found, with respect to number of reciprocal lattice vectors kept in the Fourier expansion of the crystal singlet density, is slow. These conclusions imply great sensitivity to the quality of the pair direct correlation function employed in the theory. (ii) We show the direct correlation function based and the pair correlation function based theories of freezing can be cast into a form which requires solution of isomorphous nonlinear integral equations. However, in the pair correlation function theory the usual neglect of the influence of inhomogeneity of the density distribution on the pair correlation function is shown to be inconsistent to the lowest order in the change of density on freezing, and to lead to erroneous predictions.
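Background, hedged: the equations below are the generic form of such truncated density-functional theories of freezing (in the Ramakrishnan–Yussouff spirit), not the specific equations of this paper. The excess free energy of the solid is expanded about the uniform liquid of density ρ_ℓ in powers of δρ(r) = ρ(r) − ρ_ℓ, truncated at the pair or triplet direct correlation function, and the crystal singlet density is expanded over reciprocal lattice vectors G:

    \beta\,\Delta F[\rho] \approx \int d\mathbf{r}\,\Big[\rho(\mathbf{r})\ln\tfrac{\rho(\mathbf{r})}{\rho_\ell} - \delta\rho(\mathbf{r})\Big]
      - \tfrac{1}{2}\iint c^{(2)}(|\mathbf{r}-\mathbf{r}'|)\,\delta\rho(\mathbf{r})\,\delta\rho(\mathbf{r}')\,d\mathbf{r}\,d\mathbf{r}'
      - \tfrac{1}{6}\iiint c^{(3)}\,\delta\rho\,\delta\rho\,\delta\rho + \cdots

    \rho(\mathbf{r}) = \rho_\ell\Big[1 + \eta + \sum_{\mathbf{G}\neq 0}\mu_{\mathbf{G}}\,e^{i\mathbf{G}\cdot\mathbf{r}}\Big]

Keeping part, but not all, of the c^(3) term and truncating the sum over reciprocal lattice vectors G are precisely the two truncations whose sensitivity is examined in point (i) above.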
Abstract:
Friction characteristics of journal bearings made from a cast graphitic aluminium particulate composite alloy were determined under mixed lubrication and compared with those of the base alloy (without graphite) and leaded phosphor bronze. All three materials ran without seizure, while the performance of the particulate composite and the leaded phosphor bronze improved with running. The temperature rise in the journal bearing under mixed/boundary lubrication was also measured. It was found that with a clearance of 0.3D/1000 to 1.5D/1000, a low lubrication rate (a typical value for a bearing of diameter 35 mm × length 35 mm is 80 mm³/min) and a PV value of 73 × 10⁶ N m m⁻² min⁻¹, graphitic aluminium alloy journal bearings operate satisfactorily without seizure or excessive temperature rise. In comparison, the bronze bearings, with all the other parameters remaining the same, could not run without excessive temperature rise at clearances below D/1000 and at lubrication rates lower than 200 mm³/min.
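For orientation only: the PV parameter of a journal bearing is conventionally the nominal bearing pressure (load divided by the projected area D × L) multiplied by the sliding speed at the journal surface. The load and speed below are invented purely to show how a value of the order of the quoted 73 × 10⁶ N m m⁻² min⁻¹ arises for a 35 mm × 35 mm bearing; they are not figures from the study.

    import math

    # Hypothetical operating point for a 35 mm x 35 mm journal bearing (values invented).
    D = 0.035    # journal diameter, m
    L = 0.035    # bearing length, m
    W = 540.0    # radial load, N (assumed)
    N = 1500.0   # shaft speed, rev/min (assumed)

    P = W / (D * L)        # nominal bearing pressure, N/m^2
    V = math.pi * D * N    # sliding speed at the journal surface, m/min
    PV = P * V             # units: (N/m^2)*(m/min) = N m m^-2 min^-1

    print(f"P  = {P:.3e} N/m^2")              # ~4.4e5 N/m^2
    print(f"V  = {V:.1f} m/min")              # ~165 m/min
    print(f"PV = {PV:.2e} N m m^-2 min^-1")   # ~7.3e7, comparable in magnitude to the quoted value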
Abstract:
BACKGROUND Control of pests in stored grain and the evolution of resistance to pesticides are serious problems worldwide. A stochastic individual-based two-locus model was used to investigate the impact of two important issues, the consistency of pesticide dosage through the storage facility and the immigration rate of the adult pest, on overall population control and avoidance of evolution of resistance to the fumigant phosphine in an important pest of stored grain, the lesser grain borer. RESULTS A very consistent dosage maintained good control for all immigration rates, while an inconsistent dosage failed to maintain control in all cases. At intermediate dosage consistency, immigration rate became a critical factor in whether control was maintained or resistance emerged. CONCLUSION Achieving a consistent fumigant dosage is a key factor in avoiding evolution of resistance to phosphine and maintaining control of populations of stored-grain pests; when the dosage achieved is very inconsistent, there is likely to be a problem regardless of immigration rate. © 2012 Society of Chemical Industry
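The model itself is not reproduced in the abstract. Purely as an illustration of the kind of stochastic, individual-based two-locus simulation described, the toy sketch below gives each individual two diallelic resistance loci, makes survival of fumigation depend on the number of resistance alleles and on a dose whose between-individual spread represents dosage (in)consistency, and adds susceptible immigrants each generation. All parameter values, the survival function and the single-generation life cycle are invented simplifications, not the authors' model.

    import numpy as np

    rng = np.random.default_rng(1)

    def gamete(genotypes):
        # Each parent transmits one allele per locus with probability (allele count)/2.
        return rng.binomial(1, genotypes / 2.0)

    def simulate(generations=50, n0=500, immigration=20,
                 dose_mean=2.0, dose_sd=0.5, fecundity=5, capacity=2000):
        # Rows are individuals; columns are resistance-allele counts (0..2) at two loci.
        pop = np.zeros((n0, 2), dtype=int)                      # start fully susceptible
        for _ in range(generations):
            pop = np.vstack([pop, np.zeros((immigration, 2), dtype=int)])   # susceptible immigrants
            # Dose experienced varies between individuals; small dose_sd = consistent fumigation.
            dose = rng.normal(dose_mean, dose_sd, size=len(pop)).clip(min=0.0)
            tolerance = 1.0 + 0.4 * pop.sum(axis=1)             # more alleles -> higher tolerance
            pop = pop[rng.random(len(pop)) < np.exp(-dose / tolerance)]
            if len(pop) < 2:
                return 0, 0.0                                   # population controlled
            # Random mating: each offspring draws one gamete from each of two random parents.
            n_offspring = min(fecundity * len(pop), capacity)   # fecundity with a storage-capacity cap
            parents = rng.integers(0, len(pop), size=(n_offspring, 2))
            offspring = gamete(pop[parents[:, 0]]) + gamete(pop[parents[:, 1]])
            # Rare mutation keeps resistance alleles trickling into the population.
            offspring = np.minimum(2, offspring + rng.binomial(1, 1e-3, size=offspring.shape))
            pop = offspring
        return len(pop), pop.mean() / 2.0                       # final size, resistance-allele frequency

    for sd in (0.25, 0.75, 1.5):                                # consistent -> inconsistent dosing
        size, freq = simulate(dose_sd=sd)
        print(f"dose_sd={sd}: final size={size}, resistance allele frequency={freq:.3f}")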
Abstract:
Purpose – This paper aims to go beyond a bookkeeping approach to evolutionary analysis whereby surviving firms are better adapted and extinct firms were less adapted. From discussion of the preliminary findings of research into the Hobart pizza industry, evidence is presented of the need to adopt a more traditional approach to applying evolutionary theories within organizational research. Design/methodology/approach – After a brief review of the relevant literature, the preliminary findings of research into the Hobart pizza industry are presented. Then, several evolutionary concepts that are commonplace in ecological research are introduced to help explain the emergent findings. The paper concludes with consideration given to advancing a more consistent approach to employing evolutionary theories within organizational research. Findings – The paper finds that the process of selection cannot be assumed to occur evenly across time and/or space. Within geographically small markets, different forms of selection operate in different ways and to different degrees, requiring the use of more traditional evolutionary theories to highlight the causal process associated with population change. Research limitations/implications – The paper concludes by highlighting Geoffrey Hodgson’s Principle of Consistency. It is demonstrated that a failure to truly understand how and why theory is used in one domain will likely result in its misuse in another domain, and that, at present, too few evolutionary concepts are employed in organizational research to ensure an appreciation of any underlying causal processes through which social change occurs. Originality/value – The concepts introduced throughout this paper, whilst not new, provide new entry points for organizational researchers intent on employing an evolutionary approach to understand the process of social change.
Abstract:
A ternary thermodynamic function has been developed based on statistico-thermodynamic considerations, with a particular emphasis on the higher-order terms indicating the effects of truncation at the various stages of the treatment. Although the truncation of a series involved in the equation introduces inconsistency, the latter may be removed by imposing various thermodynamic boundary conditions. These conditions are discussed in the paper. The present equation with higher-order terms shows that the α function of a component reduces to a quadratic function of composition at constant compositional paths involving the other two components in the system. The form of the function has been found to be representative of various experimental observations.
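Background note, not the paper's own derivation: the α function of a component i is commonly defined from its activity coefficient as α_i = ln γ_i / (1 − x_i)², as in Darken's quadratic formalism. The result quoted above, that the α function reduces to a quadratic function of composition along paths holding the proportions of the other two components fixed, is therefore a statement about how ln γ_i varies along such pseudo-binary sections; the higher-order terms and the thermodynamic boundary conditions that restore consistency after truncation are specific to the paper.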
Abstract:
After Gödel's incompleteness theorems and the collapse of Hilbert's programme, Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines Gentzen's consistency proofs for arithmetic from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof, in standard natural deduction, has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction reduces the first component of the vector, and this component can be interpreted as an ordinal less than ε₀, thus ordering the derivations by complexity and proving termination of the process.
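A standard fact for context, not specific to this thesis: ε₀ is the least ordinal ε satisfying ω^ε = ε, i.e. ε₀ = sup{ω, ω^ω, ω^(ω^ω), …}. Assigning to every derivation of falsity an ordinal below ε₀ that strictly decreases at each reduction step forces the process to terminate, because the ordinals below ε₀ are well ordered; the strength of such a consistency proof lies exactly in this appeal to transfinite induction up to ε₀.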
Abstract:
Self-similarity, a concept taken from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity often refers to the multi-scalar repetition of features within a set of relationships, and it is commonly valued as an indication of musical coherence and consistency. This investigation provides a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another cognitive domain (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. This research analyzes the modalities of such correlations, exploring their general and particular traits, and their operational bounds. Looking forward in this direction, the notion of analogy is used as a rich concept through its two definitions quoted in the Classical literature: proportion and paradigm, enormously valuable in establishing measurement, likeness and affinity criteria. Using quantitative and qualitative methods, evidence is presented to justify a parallel study of different modalities of musical self-similarity. For this purpose, original arguments by Benoît B. Mandelbrot are revised, alongside a systematic critique of the literature on the subject. Furthermore, connecting Charles S. Peirce's synechism with Mandelbrot's fractality is one of the main developments of the present study. This study provides elements for explaining Bolognesi's (1983) conjecture, which states that the most primitive, intuitive and basic musical device is self-reference, extending its functions and operations to self-similar surfaces. In this sense, this research suggests that, with various modalities of self-similarity, synecdochic intersemiosis acts as a system of systems in coordination with greater or lesser development of structural consistency, and with a greater or lesser contextual dependence.
Abstract:
The apparent contradiction between the exact nature of the interaction parameter formalism as presented by Lupis and Elliott and the inconsistencies discussed recently by Pelton and Bale arises from the truncation of the Maclaurin series in the latter treatment. The truncation removes the exactness of the expression for the logarithm of the activity coefficient of a solute in a multi-component system. The integrals are therefore path dependent. Formulae for integration along paths of constant X_i, or of constant X_i/X_j, are presented. The expression for ln γ_solvent given by Pelton and Bale is valid only in the limit that the mole fraction of solvent tends to one. The truncation also destroys the general relations between interaction parameters derived by Lupis and Elliott. For each specific choice of parameters, special relationships are obtained between interaction parameters.
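Background, hedged: the formalism in question is the Wagner-type Maclaurin expansion of a dilute solute's activity coefficient about the infinitely dilute solution, which Lupis and Elliott extended to second order. A generic form (not the exact notation of either paper) is

    \ln\gamma_i = \ln\gamma_i^{\circ} + \sum_{j}\varepsilon_i^{\,j} x_j + \sum_{j}\rho_i^{\,j} x_j^{2} + \sum_{j\neq k}\rho_i^{\,j,k} x_j x_k + \cdots

Truncating this series, for example after the first-order ε terms, is what removes the exactness referred to above: the truncated expressions no longer satisfy the cross-differentiation (Gibbs–Duhem) relations exactly, so integrals for quantities such as ln γ_solvent depend on the composition path, and path-specific formulae (constant X_i or constant X_i/X_j) become necessary.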
Abstract:
Three different types of consistencies, viz., semiweak, weak, and strong, of a read-only transaction in a schedule s of a set T of transactions are defined, and these are compared with the existing notions of consistency of a read-only transaction in a schedule. We present a technique that enables a user to control the consistency of a read-only transaction in heterogeneous locking protocols. Since the weak consistency of a read-only transaction improves concurrency in heterogeneous locking protocols, users can help to improve concurrency in heterogeneous locking protocols by supplying the consistency requirements of read-only transactions. A heterogeneous locking protocol P' derived from a locking protocol P that uses exclusive mode locks only and ensures serializability need not be deadlock-free. We present a sufficient condition that ensures the deadlock-freeness of P', when P is deadlock-free and all the read-only transactions in P' are two-phase.
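The paper's protocols are not reproduced in the abstract. As a generic illustration only, the sketch below shows the shape of a two-phase read-only transaction: all shared (read) locks are acquired in a growing phase, the reads are performed, and the locks are released in a shrinking phase, with no lock requested after the first release. The lock manager, the derived protocol P', and the semiweak/weak/strong consistency levels defined in the paper are not modelled here.

    import threading

    class LockManager:
        """Toy lock table supporting shared (S) and exclusive (X) locks on named items."""
        def __init__(self):
            self._cond = threading.Condition()
            self._table = {}                               # item -> ("S", reader count) or ("X", 1)

        def lock_shared(self, item):
            with self._cond:
                while self._table.get(item, ("S", 0))[0] == "X":
                    self._cond.wait()                      # block while an exclusive lock is held
                _, count = self._table.get(item, ("S", 0))
                self._table[item] = ("S", count + 1)

        def unlock(self, item):
            with self._cond:
                mode, count = self._table[item]
                if mode == "S" and count > 1:
                    self._table[item] = ("S", count - 1)
                else:
                    del self._table[item]
                self._cond.notify_all()

    def read_only_transaction(lock_manager, db, items):
        """Two-phase read-only transaction: growing phase acquires every shared lock,
        the reads are performed, then a shrinking phase releases the locks."""
        held = []
        for item in sorted(items):                         # fixed acquisition order also helps avoid deadlock
            lock_manager.lock_shared(item)
            held.append(item)
        values = {item: db[item] for item in held}         # reads performed while all locks are held
        for item in held:
            lock_manager.unlock(item)
        return values

    db = {"x": 1, "y": 2}
    print(read_only_transaction(LockManager(), db, ["y", "x"]))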