823 results for (standard) interval arithmetic
Abstract:
The scalable video coding (SVC) extension of the H.264/AVC standard enables adaptive and flexible video delivery to multiple devices under various network conditions. Only a few works have addressed the influence of the different scalability parameters (frame rate, spatial resolution, and SNR) on user-perceived quality (UPQ), and only within a limited scope. In this paper, we conduct a subjective quality assessment experiment on video sequences encoded with H.264/SVC to gain a better understanding of the correlation between video content and UPQ at all scalable layers, and of the impact of the rate-distortion method and different scalabilities on bitrate and UPQ. Findings from this experiment will contribute to a user-centered design for the adaptive delivery of scalable video streams.
Abstract:
We aimed to investigate the naturally occurring horizontal-plane movements of a head stabilized in a standard ophthalmic headrest and to analyze their magnitude, velocity, spectral characteristics, and correlation with the cardiopulmonary system. Two custom-made, air-coupled, highly accurate (±2 μm) ultrasound transducers were used to measure the displacements of the head in different horizontal directions at a sampling frequency of 100 Hz. Synchronously with the head movements, an electrocardiogram (ECG) signal was recorded. Three healthy subjects participated in the study. Frequency analysis of the recorded head movements and their velocities was carried out, and coherence functions between the two displacements and the ECG signal were calculated. The frequencies of respiration and the heartbeat were clearly visible in all recorded head movements. The amplitude of head displacements was typically in the range of ±100 μm. The first harmonic of the heartbeat (in the range of 2–3 Hz), rather than its principal frequency, was found to be the dominant frequency of both the head movements and their velocities. Coherence analysis showed high interdependence between the considered signals for frequencies of up to 20 Hz. These findings may contribute to the design of better ophthalmic headrests and should help other studies in deciding whether to use a heavy headrest or a bite bar.
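A minimal sketch of the kind of coherence analysis described above, using synthetic signals in place of the recorded data (the 100 Hz sampling rate matches the abstract; the signal shapes, frequencies, and all other parameter values are illustrative assumptions):

```python
import numpy as np
from scipy import signal

fs = 100.0                      # Hz, sampling rate used in the study
t = np.arange(0, 60, 1 / fs)    # 60 s of synthetic data

# Illustrative stand-ins for the recorded signals (amplitudes in micrometres).
# Respiration ~0.25 Hz; heartbeat ~1.2 Hz with a strong first harmonic at 2.4 Hz.
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.8 * np.sin(2 * np.pi * 2.4 * t)
head = (40 * np.sin(2 * np.pi * 0.25 * t)          # respiratory component
        + 25 * np.sin(2 * np.pi * 2.4 * t + 0.3)   # cardiac first harmonic
        + 5 * np.random.randn(t.size))             # measurement noise

# Power spectrum of the head displacement (Welch periodogram).
f_psd, psd = signal.welch(head, fs=fs, nperseg=1024)

# Magnitude-squared coherence between head displacement and ECG.
f_coh, coh = signal.coherence(head, ecg, fs=fs, nperseg=1024)

# Dominant cardiac-band frequency (above 1 Hz, to skip the respiratory peak).
dominant = f_psd[np.argmax(psd[f_psd > 1]) + np.sum(f_psd <= 1)]
print(f"dominant frequency above 1 Hz: {dominant:.2f} Hz")
print(f"peak coherence below 20 Hz: {coh[f_coh < 20].max():.2f}")
```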
Abstract:
It is the purpose of this article to examine the means currently available to judges to achieve a workable balance between providing appropriate consumer protection to signatories of standard form contracts and retaining adequate respect for the sanctity of contract, and, based on this analysis, to determine whether a significantly greater scope of contract (re)construction is likely to become the norm in most common law jurisdictions in the coming decades.
Abstract:
Purpose: The component modules in the standard BEAMnrc distribution may appear to be insufficient to model micro-multileaf collimators that have tri-faceted leaf ends and complex leaf profiles. This note indicates, however, that accurate Monte Carlo simulations of radiotherapy beams defined by a complex collimation device can be completed using BEAMnrc's standard VARMLC component module. Methods: That this simple collimator model can produce spatially and dosimetrically accurate micro-collimated fields is illustrated using comparisons with ion chamber and film measurements of the dose deposited by square and irregular fields incident on planar, homogeneous water phantoms. Results: Monte Carlo dose calculations for on- and off-axis fields are shown to produce good agreement with experimental values, even upon close examination of the penumbrae. Conclusions: The use of a VARMLC model of the micro-multileaf collimator, along with a commissioned model of the associated linear accelerator, is therefore recommended as an alternative to the development or use of in-house or third-party component modules for simulating stereotactic radiotherapy and radiosurgery treatments. Simulation parameters for the VARMLC model are provided which should allow other researchers to adapt and use this model to study clinical stereotactic radiotherapy treatments.
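One common way to quantify the kind of calculated-versus-measured agreement reported above is a gamma analysis of dose profiles (the abstract does not state that this is the comparison method used; it is offered here as a standard illustration). The sketch below implements a basic 1D global gamma index; the 3%/1 mm criterion and the profile data are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.special import erf

def gamma_1d(x_meas, d_meas, x_calc, d_calc, dose_tol=0.03, dist_tol=1.0):
    """Basic 1D global gamma index.

    dose_tol: dose-difference criterion as a fraction of the maximum
    measured dose (3% here); dist_tol: distance-to-agreement criterion
    in mm (1 mm here, since micro-collimated fields demand tight
    spatial tolerances).
    """
    d_ref = dose_tol * d_meas.max()
    gamma = np.empty_like(d_meas)
    for i, (xm, dm) in enumerate(zip(x_meas, d_meas)):
        dose_term = (d_calc - dm) / d_ref
        dist_term = (x_calc - xm) / dist_tol
        gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gamma

# Illustrative profiles: an error-function penumbra, with the "calculated"
# profile shifted by 0.2 mm to mimic a small simulation offset.
x = np.linspace(-10, 10, 401)                 # position in mm
measured = 50 * (1 + erf(-(x - 5) / 1.5))     # dose falloff at field edge
calculated = 50 * (1 + erf(-(x - 5.2) / 1.5))

g = gamma_1d(x, measured, x, calculated)
print(f"gamma pass rate (gamma <= 1): {100 * np.mean(g <= 1):.1f}%")
```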
Abstract:
This research explores the empirical association between takeover bid premium and acquired (purchased) goodwill, and tests whether the strength of the association changed after the passage of approved accounting standard AASB 1013 in Australia in 1988. AASB 1013 mandated capitalization and amortization of acquired goodwill to the income statement over a maximum period of 20 years. We use regressions to assess how the association between bid premium and acquired goodwill varies between the pre- and post-AASB 1013 periods after controlling for confounding factors. Our results show that reducing the variety of accounting policy options available to bidder management after an acquisition results in a systematic reduction in the strength of the association between premium and goodwill.
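A hedged sketch of the kind of regression design described above, with an interaction term capturing the change in the premium-goodwill association across periods (the variable names, the control variable, and all data are illustrative assumptions, not the paper's dataset or model):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Hypothetical acquisition-level data: acquired goodwill, an indicator for
# deals completed after AASB 1013 took effect, and one illustrative control.
df = pd.DataFrame({
    "goodwill": rng.gamma(2.0, 10.0, n),     # acquired goodwill ($m)
    "post_aasb": rng.integers(0, 2, n),      # 1 = post-AASB 1013 deal
    "target_size": rng.gamma(3.0, 20.0, n),  # illustrative control variable
})
# Simulate a weaker premium-goodwill association in the post-AASB period.
df["premium"] = (0.8 * df["goodwill"]
                 - 0.4 * df["goodwill"] * df["post_aasb"]
                 + 0.1 * df["target_size"]
                 + rng.normal(0, 5, n))

# The goodwill:post_aasb coefficient measures the change in the association.
model = smf.ols("premium ~ goodwill * post_aasb + target_size", data=df).fit()
print(model.summary().tables[1])
```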
Abstract:
Since the High Court decision of Cook v Cook (1986) 162 CLR 376, a person who voluntarily undertakes to instruct a learner driver of a motor vehicle has been owed a lower standard of care than that owed to other road users. The standard of care was still expressed to be objective; however, it took into account the inexperience of the learner driver. Therefore, a person instructing a learner driver was owed a duty of care, the standard being that of a reasonable learner driver. This ‘special relationship’ was said to exist because of the passenger’s knowledge of the driver’s inexperience and lack of skill. On 28 August 2008 the High Court handed down its decision in Imbree v McNeilly [2008] HCA 40, overruling Cook v Cook.
Abstract:
The most costly operations encountered in pairing computations are those that take place in the full extension field $\mathbb{F}_{p^k}$. At high levels of security, the complexity of operations in $\mathbb{F}_{p^k}$ dominates the complexity of the operations that occur in the lower-degree subfields. Consequently, full extension field operations have the greatest effect on the runtime of Miller’s algorithm. Many recent optimizations in the literature have focussed on improving the overall operation count by presenting new explicit formulas that reduce the number of subfield operations encountered throughout an iteration of Miller’s algorithm. Unfortunately, almost all of these improvements tend to suffer for larger embedding degrees, where the expensive extension field operations far outweigh the operations in the smaller subfields. In this paper, we propose a new way of carrying out Miller’s algorithm, involving new explicit formulas that reduce the number of full extension field operations occurring in an iteration of the Miller loop, resulting in significant speed-ups in most practical situations of between 5 and 30 percent.
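For orientation, the sketch below runs the textbook double-and-add Miller loop, whose per-iteration cost is exactly what such optimizations target. Everything here is an illustrative simplification: a toy curve $y^2 = x^3 + 1$ over $\mathbb{F}_{11}$, a point of order 3, no extension field, and no final exponentiation.

```python
# Textbook Miller loop computing f_{r,P}(Q) on a toy curve E: y^2 = x^3 + 1
# over F_11. Minimal sketch only: real pairings evaluate these lines over
# the extension field F_{p^k} and apply a final exponentiation.
p = 11
INF = None  # point at infinity

def add(P, Q):
    """Affine point addition on y^2 = x^3 + 1 over F_p."""
    if P is INF:
        return Q
    if Q is INF:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF
    if P == Q:
        lam = (3 * x1 * x1) * pow(2 * y1, -1, p) % p  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p     # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def line(T, P, Q):
    """Evaluate at Q the line through T and P (tangent if T == P)."""
    (x1, y1), (xq, yq) = T, Q
    if (T != P and T[0] == P[0]) or (T == P and y1 == 0):
        return (xq - x1) % p                          # vertical line
    lam = ((3 * x1 * x1) * pow(2 * y1, -1, p) if T == P
           else (P[1] - y1) * pow(P[0] - x1, -1, p)) % p
    return (yq - y1 - lam * (xq - x1)) % p

def miller(P, Q, r):
    """Double-and-add Miller loop computing f_{r,P}(Q)."""
    f, T = 1, P
    for bit in bin(r)[3:]:              # bits of r after the leading one
        twoT = add(T, T)                # doubling step
        vert = 1 if twoT is INF else (Q[0] - twoT[0]) % p
        f = f * f * line(T, T, Q) * pow(vert, -1, p) % p
        T = twoT
        if bit == "1":                  # addition step
            TP = add(T, P)
            vert = 1 if TP is INF else (Q[0] - TP[0]) % p
            f = f * line(T, P, Q) * pow(vert, -1, p) % p
            T = TP
    return f

P = (0, 1)   # point of order r = 3 on this curve
Q = (2, 3)   # another point on the curve, outside <P>
print(miller(P, Q, 3))
```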
Abstract:
Minimizing the complexity of group key exchange (GKE) protocols is an important milestone towards their practical deployment. An interesting approach to this goal is to simplify the design of GKE protocols by using generic building blocks. In this paper we investigate the possibility of founding GKE protocols on a primitive called a multi-key encapsulation mechanism (mKEM) and describe the advantages and limitations of this approach. In particular, we show how to design a one-round GKE protocol which satisfies the classical requirement of authenticated key exchange (AKE) security, yet without forward secrecy. As a result, we obtain the first one-round GKE protocol secure in the standard model. We also conduct our analysis using recent formal models that take into account both outsider and insider attacks as well as the notion of key compromise impersonation resilience (KCIR). In contrast to previous models, we show how to model both outsider and insider KCIR within the definition of mutual authentication. Our analysis additionally implies that the insider security compiler by Katz and Shin from ACM CCS 2005 can be used to achieve more than what is shown in the original work, namely both outsider and insider KCIR.
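As a rough illustration of the mKEM primitive itself (not the paper's construction), the sketch below shows a toy interface in which one encapsulation delivers the same session key to every group member by wrapping it under each recipient's RSA public key. All names are assumptions for illustration, the per-recipient wrapping is the trivial baseline that real mKEMs improve on, and no authentication is provided:

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def mkem_encaps(public_keys):
    """Toy mKEM: encapsulate one session key to a list of recipients.

    Picks a random key K and wraps it under every recipient's public
    key. A GKE protocol built on top still needs authentication, which
    this sketch deliberately omits.
    """
    K = os.urandom(32)
    ciphertexts = [pk.encrypt(K, OAEP) for pk in public_keys]
    return K, ciphertexts

def mkem_decaps(private_key, ciphertext):
    """Recover the session key from this recipient's ciphertext."""
    return private_key.decrypt(ciphertext, OAEP)

# One-round flavour: a single broadcast of `cts` lets every member
# recover the same session key K.
members = [rsa.generate_private_key(public_exponent=65537, key_size=2048)
           for _ in range(3)]
K, cts = mkem_encaps([m.public_key() for m in members])
assert all(mkem_decaps(m, c) == K for m, c in zip(members, cts))
print("all members derived the same session key")
```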
Abstract:
We give a direct construction of a certificateless key encapsulation mechanism (KEM) in the standard model that is more efficient than the generic constructions previously proposed by Huang and Wong \cite{DBLP:conf/acisp/HuangW07}. Our construction builds directly on Kiltz and Galindo's KEM scheme \cite{DBLP:conf/acisp/KiltzG06} to obtain a certificateless KEM in the standard model, and is roughly twice as efficient as the generic construction. We also address the security flaw discovered by Selvi et al. \cite{cryptoeprint:2009:462}.
Abstract:
We show how to construct a certificateless key agreement protocol from the certificateless key encapsulation mechanism introduced by \cite{lippold-ICISC_2009} in ICISC 2009, using the \cite{DBLP:conf/acisp/BoydCNP08} protocol from ACISP 2008. We introduce the Canetti-Krawczyk (CK) model for certificateless cryptography, give security notions for Type I and Type II adversaries in the CK model, and highlight the differences from the existing e$^2$CK model discussed by \cite{DBLP:conf/pairing/LippoldBN09}. The resulting CK model is more relaxed than the original CK model, thus giving more power to the adversary.
Abstract:
This paper examines the role of powerful entities and coalitions in shaping international accounting standards. Specifically, the focus is on the process by which the International Accounting Standards Board (IASB) developed IFRS 6, Exploration for and Evaluation of Mineral Resources. In its Issues Paper, the IASB recommended that the successful efforts method be mandated for pre-production costs, eliminating the choice previously available between full cost and successful efforts methods. In spite of the endorsement of this view by a majority of the constituents who responded to the Issues Paper, the final outcome changed nothing, with choice being retained. A compelling explanation of this disparity between the visible inputs and outputs of the standard setting process is the existence of a “black box”, in which powerful extractive industries entities and coalitions covertly influenced the IASB to secure their own ends and ensure that the status quo was maintained.
Abstract:
Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, only a small number of studies have used controlled experiments to systematically assess them. Using experimentally derived simulated data, which are argued to be superior to empirical data for this purpose, three hotspot identification methods observed in practice are evaluated: simple ranking, confidence interval, and Empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data, where high risk sites are not known for certain. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors is manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effects of crash history duration on the three HSID approaches are assessed. The results illustrate that the Empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and negatives are inversely related. Three years of crash history appears, in general, to provide an appropriate crash history duration.
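A minimal sketch of the simulation-based comparison described above, contrasting simple ranking with Empirical Bayes on sites whose true risk is known a priori (the gamma-Poisson setup, the hypothetical safety performance function, and all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites, k = 1000, 2.0            # negative binomial dispersion parameter k

# Site-specific expected crash frequency from a hypothetical safety
# performance function driven by traffic volume, plus gamma heterogeneity,
# so the truly high-risk sites are known a priori.
aadt = rng.uniform(1000, 20000, n_sites)       # annual average daily traffic
mu = 0.0004 * aadt                             # SPF prediction, crashes/year
true_rate = rng.gamma(shape=k, scale=mu / k)   # each site's true risk
observed = rng.poisson(true_rate)              # one year of crash history

# Empirical Bayes: shrink each observed count toward its SPF prediction
# with weight w_i = k / (k + mu_i) from the gamma-Poisson model.
w = k / (k + mu)
eb = w * mu + (1 - w) * observed

def hotspot_errors(score, q=0.05):
    """False positives/negatives of a ranking vs the known true risks."""
    cut = int(q * n_sites)
    flagged = set(np.argsort(score)[-cut:])
    truth = set(np.argsort(true_rate)[-cut:])
    return len(flagged - truth), len(truth - flagged)  # (FP, FN)

print("simple ranking  (FP, FN):", hotspot_errors(observed))
print("Empirical Bayes (FP, FN):", hotspot_errors(eb))
```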
Abstract:
In this chapter, a rationale is developed for incorporating philosophy into teacher training programs as a means both of preparing quality teachers for the 21st century and of meeting the expectations detailed in the professional standards established by the Queensland College of Teachers, the statutory authority that regulates the profession in Queensland. Furthermore, in-service teachers from Buranda State School, a Brisbane primary school that has been successfully teaching philosophy to its students for over 10 years, share their experiences of teaching philosophy and describe how it has enhanced student learning and the quality of teaching and professionalism of the teachers. Finally, the implications of embedding philosophy into teacher training programs are explored in terms of developing the personal integrity of beginning teachers.