972 results for cache consistency


Relevance: 20.00%

Publisher:

Abstract:

Until quite recently, most Australian jurisdictions gave statutory force to the principle of imprisonment as a sanction of last resort, reflecting its status as the most punitive sentencing option open to the court. That principle gave the sentencing court, which received all of the information relevant to the offence, the offender and any victim(s), primary discretion as to whether incarceration was the most appropriate means of achieving the purpose of a sentence. The disestablishment of this principle is symptomatic of an increasing erosion of judicial discretion with respect to sentencing, which appears to be resulting in some extremely punitive consequences.

Relevance: 20.00%

Publisher:

Abstract:

The ultimate goal of profiling is to identify the major behavioral and personality characteristics of an offender in order to narrow the suspect pool. Inferences about offender characteristics can be accomplished deductively, based on the analysis of discrete offender behaviors established within a particular case. They can also be accomplished inductively, involving prediction based on abstract offender averages from group data (these methods, and the logic on which they are based, are detailed extensively in Chapters 2 and 4). As discussed, these two approaches are by no means equal.

Relevance: 20.00%

Publisher:

Abstract:

Criminal profiling is an investigative tool used around the world to infer the personality and behavioural characteristics of an offender based on their crime. Case linkage, the process of determining discrete connections between crimes of the same offender, is a practice that falls under the general banner of criminal profiling and has been widely criticized. Two theories, behavioural consistency and the homology assumption, are examined and their impact on profiling in general and case linkage specifically is discussed...

Relevance: 20.00%

Publisher:

Abstract:

Given that there is increasing recognition of the effect that submillimetre changes in collimator position can have on radiotherapy beam dosimetry, this study aimed to evaluate the potential variability in small field collimation that may exist between otherwise matched linacs. Field sizes and field output factors were measured using radiochromic film and an electron diode, for jaw- and MLC-collimated fields produced by eight dosimetrically matched Varian iX linacs (Varian Medical Systems, Palo Alto, USA). This study used nominal sizes from 0.6×0.6 to 10×10 cm² for jaw-collimated fields, and from 1×1 to 10×10 cm² for MLC-collimated fields, delivered from a zero (head up, beam directed vertically downward) gantry angle. Differences between the field sizes measured for the eight linacs exceeded the uncertainty of the film measurements and the repositioning uncertainty of the jaws and MLCs on one linac. The dimensions of fields defined by MLC leaves were more consistent between linacs, while also differing more from their nominal values, than fields defined by orthogonal jaws. The field output factors measured for the different linacs generally increased with increasing measured field size for the nominal 0.6×0.6 and 1×1 cm² fields, and became consistent between linacs for nominal field sizes of 2×2 cm² and larger. The inclusion in radiotherapy treatment planning system beam data of small field output factors acquired in fields collimated by jaws (rather than the more-reproducible MLCs), associated with either the nominal or the measured field sizes, should be viewed with caution. The size and reproducibility of the fields (especially the small fields) used to acquire treatment planning data should be investigated thoroughly as part of the linac or planning system commissioning process. Further investigation of these issues, using different linac models, collimation systems and beam orientations, is recommended.

Relevance: 20.00%

Publisher:

Abstract:

Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to "catastrophic fusion" in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
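The core idea of the consistency test can be sketched in a few lines: a candidate point is fused only if the Gaussian process model's log marginal likelihood improves. This is a minimal illustration, not the authors' implementation; the RBF kernel, its length scale, the noise level, and the per-point normalisation of the likelihood are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2):
    # Squared-exponential covariance between two 1-D input arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def log_marginal_likelihood(x, y, noise=0.1):
    # Log p(y | x) for a zero-mean GP with an RBF kernel and i.i.d. noise.
    K = rbf_kernel(x, x) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

def is_consistent(x, y, x_new, y_new, noise=0.1):
    # Fuse the candidate point only if the per-point log marginal
    # likelihood does not degrade, i.e. the model statistically
    # improves as a result of the fusion.
    before = log_marginal_likelihood(x, y, noise) / len(x)
    after = log_marginal_likelihood(np.append(x, x_new),
                                    np.append(y, y_new),
                                    noise) / (len(x) + 1)
    return after >= before

# A smooth surface observed by one range sensor:
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
print(is_consistent(x, y, 0.55, np.sin(2 * np.pi * 0.55)))  # point on the surface
print(is_consistent(x, y, 0.55, 5.0))                       # grossly off-surface point
```

Note that the decision is relative, exactly as the abstract describes: no absolute distance threshold appears anywhere, only a comparison of model evidence with and without the candidate point.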

Relevance: 20.00%

Publisher:

Abstract:

We report the results of two studies of aspects of the consistency of truncated nonlinear integral equation based theories of freezing: (i) We show that the self-consistent solutions to these nonlinear equations are unfortunately sensitive to the level of truncation. For the hard sphere system, if the Wertheim–Thiele representation of the pair direct correlation function is used, the inclusion of part but not all of the triplet direct correlation function contribution, as has been common, worsens the predictions considerably. We also show that the convergence of the solutions found, with respect to number of reciprocal lattice vectors kept in the Fourier expansion of the crystal singlet density, is slow. These conclusions imply great sensitivity to the quality of the pair direct correlation function employed in the theory. (ii) We show the direct correlation function based and the pair correlation function based theories of freezing can be cast into a form which requires solution of isomorphous nonlinear integral equations. However, in the pair correlation function theory the usual neglect of the influence of inhomogeneity of the density distribution on the pair correlation function is shown to be inconsistent to the lowest order in the change of density on freezing, and to lead to erroneous predictions. The Journal of Chemical Physics is copyrighted by The American Institute of Physics.


Relevance: 20.00%

Publisher:

Abstract:

Packet forwarding is a memory-intensive application requiring multiple accesses through a trie structure. With the requirement to process packets at line rates, high-performance routers need to forward millions of packets every second, with each packet needing up to seven memory accesses. Earlier work shows that a single cache for the nodes of a trie can reduce the number of external memory accesses. It is observed that the locality characteristics of the level-one nodes of a trie are significantly different from those of lower level nodes. Hence, we propose a heterogeneously segmented cache architecture (HSCA) which uses separate caches for level-one and lower level nodes, each with carefully chosen sizes. Besides reducing misses, segmenting the cache allows us to focus on optimizing the more frequently accessed level-one node segment. We find that, due to the nonuniform distribution of nodes among cache sets, the level-one nodes cache is susceptible to high conflict misses. We reduce conflict misses by introducing a novel two-level mapping-based cache placement framework. We also propose an elegant way to fit the modified placement function into the cache organization with minimal increase in access time. Further, we propose an attribute-preserving trace generation methodology which emulates real traces and can generate traces with varying locality. Performance results reveal that our HSCA scheme results in a 32 percent speedup in average memory access time over a unified nodes cache. Also, HSCA outperforms IHARC, a cache for lookup results, with as high as a 10-fold speedup in average memory access time. Two-level mapping further enhances the performance of the base HSCA by up to 13 percent, leading to an overall improvement of up to 40 percent over the unified scheme.
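The benefit of segmenting the cache can be illustrated with a toy direct-mapped simulator. Everything below is invented for illustration (the trace shape, the 60/40 split between a small hot level-one node set and a large cold lower-level set, and the set counts); it is not the HSCA implementation, but it reproduces the qualitative effect the abstract describes: isolating the hot level-one nodes in their own segment removes the conflict misses they suffer in a unified cache.

```python
import random

class DirectMappedCache:
    # One entry per set; a miss simply overwrites the resident key.
    def __init__(self, num_sets):
        self.num_sets = num_sets
        self.sets = [None] * num_sets
        self.misses = 0

    def access(self, key):
        idx = key % self.num_sets
        if self.sets[idx] != key:
            self.misses += 1
            self.sets[idx] = key

random.seed(1)
# Synthetic trace mimicking the skewed locality of trie nodes:
# a small, hot set of level-one nodes vs. a large, cold set of lower nodes.
trace = []
for _ in range(20000):
    if random.random() < 0.6:
        trace.append(('L1', random.randrange(256)))      # hot level-one nodes
    else:
        trace.append(('LO', random.randrange(65536)))    # cold lower-level nodes

# Unified cache: one 1024-set cache holds both kinds of nodes.
unified = DirectMappedCache(1024)
for level, node in trace:
    # Offset lower-level nodes so the two node spaces use distinct keys.
    unified.access(node if level == 'L1' else 1_000_000 + node)

# Segmented (HSCA-style): 256 sets dedicated to level-one nodes, 768 to the rest.
seg_l1, seg_lo = DirectMappedCache(256), DirectMappedCache(768)
for level, node in trace:
    (seg_l1 if level == 'L1' else seg_lo).access(node)

print('unified miss rate  :', unified.misses / len(trace))
print('segmented miss rate:', (seg_l1.misses + seg_lo.misses) / len(trace))
```

In the unified cache, cold lower-level accesses keep evicting hot level-one nodes from shared sets; in the segmented configuration the level-one segment suffers only compulsory misses.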

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND Control of pests in stored grain and the evolution of resistance to pesticides are serious problems worldwide. A stochastic individual-based two-locus model was used to investigate the impact of two important issues, the consistency of pesticide dosage through the storage facility and the immigration rate of the adult pest, on overall population control and avoidance of evolution of resistance to the fumigant phosphine in an important pest of stored grain, the lesser grain borer. RESULTS A very consistent dosage maintained good control for all immigration rates, while an inconsistent dosage failed to maintain control in all cases. At intermediate dosage consistency, immigration rate became a critical factor in whether control was maintained or resistance emerged. CONCLUSION Achieving a consistent fumigant dosage is a key factor in avoiding evolution of resistance to phosphine and maintaining control of populations of stored-grain pests; when the dosage achieved is very inconsistent, there is likely to be a problem regardless of immigration rate. © 2012 Society of Chemical Industry
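The qualitative result can be sketched with a far simpler model than the paper's stochastic individual-based two-locus simulation: a deterministic, one-locus, expected-value recursion. All survival rates, the fecundity of 5 offspring per survivor, and the carrying capacity below are illustrative numbers, not the paper's parameters.

```python
def simulate(dose_consistency, immigration, generations=60):
    """Expected-value sketch of phosphine-resistance evolution.

    dose_consistency is the fraction of the storage facility receiving
    the full fumigant dose; the remainder is a sub-lethal pocket in
    which susceptible insects can survive. Immigrants are susceptible.
    """
    sus, res = 200.0, 2.0   # susceptible and resistant insect counts
    # Genotype-specific survival, averaged over full-dose and sub-lethal regions.
    s_sus = dose_consistency * 0.001 + (1 - dose_consistency) * 0.3
    s_res = dose_consistency * 0.02 + (1 - dose_consistency) * 0.8
    for _ in range(generations):
        sus = sus * s_sus * 5 + immigration   # 5 offspring per survivor
        res = res * s_res * 5
        total = sus + res
        if total > 10_000:                    # facility carrying capacity
            sus *= 10_000 / total
            res *= 10_000 / total
    return sus, res

# Very consistent dosage: control maintained, resistance never takes off.
print(simulate(1.0, immigration=50))
# Inconsistent dosage: the resistant genotype escapes and dominates.
print(simulate(0.5, immigration=50))
```

Even this crude sketch shows the paper's headline finding: under a fully consistent dose the population is pinned near the immigration rate, while under an inconsistent dose the resistant genotype's per-generation growth factor exceeds one and it fills the facility.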

Relevance: 20.00%

Publisher:

Abstract:

Purpose – This paper aims to go beyond a bookkeeping approach to evolutionary analysis, whereby surviving firms are deemed better adapted and extinct firms less adapted. Drawing on the preliminary findings of research into the Hobart pizza industry, evidence is presented of the need to adopt a more traditional approach to applying evolutionary theories within organizational research. Design/methodology/approach – After a brief review of the relevant literature, the preliminary findings of research into the Hobart pizza industry are presented. Several evolutionary concepts that are commonplace in ecological research are then introduced to help explain the emergent findings. The paper concludes with consideration given to advancing a more consistent approach to employing evolutionary theories within organizational research. Findings – The paper finds that the process of selection cannot be assumed to occur evenly across time and/or space. Within geographically small markets, different forms of selection operate in different ways and to different degrees, requiring the use of more traditional evolutionary theories to highlight the causal processes associated with population change. Research limitations/implications – The paper concludes by highlighting Geoffrey Hodgson's Principle of Consistency: a failure to truly understand how and why a theory is used in one domain will likely result in its misuse in another. At present, too few evolutionary concepts are employed in organizational research to ensure an appreciation of the underlying causal processes through which social change occurs. Originality/value – The concepts introduced throughout this paper, whilst not new, provide new entry points for organizational researchers intent on employing an evolutionary approach to understand the process of social change.

Relevance: 20.00%

Publisher:

Abstract:

Mobile applications are being deployed on a massive scale in various mobile sensor grid database systems. Given the limited resources of mobile devices, how to process the huge number of queries from mobile users against distributed sensor grid databases becomes a critical problem for such systems. While the fundamental semantic cache technique has been investigated for query optimization in sensor grid database systems, the problem remains difficult because more realistic multi-dimensional constraints have not been considered in existing methods. To address this, a new semantic cache scheme is presented in this paper for location-dependent data queries in distributed sensor grid database systems. It considers multi-dimensional constraints or factors in a unified cost model architecture, determines the parameters of the cost model using the concept of Nash equilibrium from game theory, and makes semantic cache decisions from the established cost model. Scenarios involving the three factors of semantics, time and location are investigated as special cases, which improve on existing methods. Experiments are conducted to demonstrate the effectiveness of the proposed semantic cache scheme for distributed sensor grid database systems.
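The essence of semantic caching for location-dependent queries can be sketched in one dimension: the cache stores previously answered query regions together with their rows, and a new range query is split into a "probe" answered locally and a "remainder" shipped to the sensor grid database. This is a minimal illustration only; the paper's multi-dimensional cost model and Nash-equilibrium parameter selection are out of scope here, so insert() simply caches unconditionally.

```python
from dataclasses import dataclass

@dataclass
class CachedSegment:
    lo: float
    hi: float
    rows: list          # cached readings whose location lies in [lo, hi)

class SemanticCache:
    """Minimal 1-D semantic cache for location-dependent range queries."""

    def __init__(self):
        self.segments = []

    def insert(self, lo, hi, rows):
        # Cache the answer to a query over the region [lo, hi).
        self.segments.append(CachedSegment(lo, hi, rows))

    def lookup(self, lo, hi):
        # Split the query [lo, hi) into locally answerable hits and
        # uncovered remainder intervals for the remote database.
        hits, remainder = [], []
        cursor = lo
        for seg in sorted(self.segments, key=lambda s: s.lo):
            if seg.hi <= cursor or seg.lo >= hi:
                continue                            # no overlap with the query
            if seg.lo > cursor:
                remainder.append((cursor, seg.lo))  # uncovered gap
            hits.extend(r for r in seg.rows
                        if max(cursor, seg.lo) <= r < min(seg.hi, hi))
            cursor = min(seg.hi, hi)
        if cursor < hi:
            remainder.append((cursor, hi))
        return hits, remainder

cache = SemanticCache()
cache.insert(0.0, 10.0, [1.0, 4.0, 7.5])   # answer of an earlier query
hits, rem = cache.lookup(5.0, 15.0)
print(hits, rem)   # cached readings in [5, 10); [10, 15) goes to the database
```

Trimming a query against cached regions in this way is what lets the mobile client answer part of each location-dependent query without touching the distributed database; the paper's contribution is deciding, via its cost model, which regions are worth keeping.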