894 results for Unified Model Reference


Relevance: 30.00%

Abstract:

This thesis provides an overview of the Sri Lankan civil war with a view to identifying some of the factors that contributed to the dispute between the Sri Lankan government and the Liberation Tigers of Tamil Eelam. It adopts a multi-causal explanation of the conflict by reference to the theories of social power developed by Michael Mann. The conflict has been variously described as an ethnic or political conflict, or has been characterised as being determined by a number of interacting factors (including colonialism, ethnicity, religion, economy, politics and globalisation). Mann's four-dimensional model of social power is deployed to analyse the causal relationships, together with their inter-connections, which clarify the origins of the dispute. It argues that Mann's theoretical framework helps to highlight some of the interconnected elements that contributed to the conflict.

Relevance: 30.00%

Abstract:

In this paper we present a unified sequential Monte Carlo (SMC) framework for performing sequential experimental design for discriminating between a set of models. The model discrimination utility that we advocate is fully Bayesian and based upon the mutual information, and SMC provides a convenient way to estimate it. Our experience suggests that the approach works well on sets of either discrete or continuous models and outperforms other model discrimination approaches.
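
For orientation only, here is a minimal Monte Carlo sketch of the mutual-information design utility the abstract refers to, I(M; y | d) between the model indicator and the data at design d. This is plain Monte Carlo rather than the paper's SMC estimator, and the model names, priors and likelihoods are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical competing models: y ~ Normal(mean(d), 1) with different mean functions.
models = {
    "linear":    lambda d: 1.0 * d,
    "quadratic": lambda d: 0.5 * d**2,
}
prior = {"linear": 0.5, "quadratic": 0.5}

def log_lik(y, d, name):
    mu = models[name](d)
    return -0.5 * np.log(2 * np.pi) - 0.5 * (y - mu) ** 2

def mutual_information(d, n_sim=2000):
    """Monte Carlo estimate of I(M; y | d) = E[log p(y|m,d) - log p(y|d)]."""
    names = list(models)
    total = 0.0
    for _ in range(n_sim):
        m = rng.choice(names, p=[prior[k] for k in names])   # draw a model from the prior
        y = models[m](d) + rng.standard_normal()             # simulate data under it
        log_joint = np.array([np.log(prior[k]) + log_lik(y, d, k) for k in names])
        log_evidence = np.logaddexp.reduce(log_joint)        # log p(y | d)
        total += log_lik(y, d, m) - log_evidence
    return total / n_sim

# Pick the design with the largest expected information about the model indicator.
designs = np.linspace(0.0, 3.0, 7)
best = max(designs, key=mutual_information)
print("most discriminating design:", best)
```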

Relevance: 30.00%

Abstract:

An encryption scheme is non-malleable if giving an encryption of a message to an adversary does not increase its chances of producing an encryption of a related message (under a given public key). Fischlin introduced a stronger notion, known as complete non-malleability, which requires the attacker's advantage to remain negligible even if it is allowed to transform the public key under which the related message is encrypted. Ventre and Visconti later proposed a comparison-based definition of this security notion, which is more in line with the well-studied definitions proposed by Bellare et al. They also provided additional feasibility results in the form of two constructions of completely non-malleable schemes: one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Consequently, the only previously known completely non-malleable (and non-interactive) scheme in the standard model is quite inefficient, as it relies on a generic NIZK approach. They left the existence of efficient schemes in the common reference string model as an open problem. Recently, two efficient public-key encryption schemes have been proposed by Libert and Yung, and by Barbosa and Farshim, both based on pairing-based identity-based encryption. At ACISP 2011, Sepahi et al. proposed a method to achieve completely non-malleable encryption in the public-key setting using lattices, but no security proof was given for the proposed scheme. In this paper we review that scheme and provide its security proof in the standard model. Our study shows that Sepahi's scheme will remain secure even in a post-quantum world, since there are currently no known quantum algorithms for solving lattice problems that perform significantly better than the best known classical (i.e., non-quantum) algorithms.

Relevance: 30.00%

Abstract:

This paper presents ongoing work toward constructing an efficient completely non-malleable public-key encryption scheme based on lattices in the standard (common reference string) model. An encryption scheme is completely non-malleable if attackers have only negligible advantage, even when they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti proposed two inefficient constructions of completely non-malleable schemes, one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Recently, two efficient public-key encryption schemes have been proposed, both of them based on pairing-based identity-based encryption.

Relevance: 30.00%

Abstract:

Validation is an important issue in the development and application of Bayesian Belief Network (BBN) models, especially when the outcome of the model cannot be directly observed. Despite this, few frameworks for validating BBNs have been proposed, and fewer still have been applied to substantive real-world problems. In this paper we adopt the approach of Pitchforth and Mengersen (2013), which comprises nine validation tests, each focusing on the structure, discretisation, parameterisation and behaviour of the BBNs included in the case study. We describe the process and results of implementing this validation framework on a model of a real airport terminal system, with particular reference to its effectiveness in producing a valid model that can be used and understood by operational decision makers. In applying the proposed validation framework, we demonstrate the overall validity of the Inbound Passenger Facilitation Model as well as the effectiveness of the validation framework itself.

Relevance: 30.00%

Abstract:

While formal definitions and security proofs are well established in some fields like cryptography and steganography, they are not as evident in digital watermarking research. A systematic development of watermarking schemes is desirable, but at present their development is usually informal, ad hoc, and omits the complete realization of application scenarios. This practice not only hinders the choice and use of a suitable scheme for a watermarking application, but also leads to debate about the state of the art for different watermarking applications. With a view to the systematic development of watermarking schemes, we present a formal generic model for digital image watermarking. Considering possible inputs, outputs, and component functions, the initial construction of a basic watermarking model is developed further to incorporate the use of keys. On the basis of our proposed model, fundamental watermarking properties are defined and their importance exemplified for different image applications. We also define a set of possible attacks using our model, showing different winning scenarios depending on the adversary's capabilities. It is envisaged that, with a proper consideration of watermarking properties and adversary actions in different image applications, use of the proposed model would allow a unified treatment of all practically meaningful variants of watermarking schemes.
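
As an illustration only (the interface below is hypothetical and is not the paper's formal notation), a keyed image-watermarking scheme of the kind such a model formalises can be described by a small set of component functions with explicit inputs and outputs:

```python
from typing import Optional, Protocol, Tuple
import numpy as np

class WatermarkingScheme(Protocol):
    """Generic keyed image-watermarking interface: inputs, outputs and
    component functions only; the concrete embedding algorithm is abstract."""

    def generate_key(self, security_parameter: int) -> bytes:
        """Produce the secret key used for embedding and detection."""

    def embed(self, cover: np.ndarray, message: bytes, key: bytes) -> np.ndarray:
        """Return a watermarked copy of `cover` carrying `message` under `key`."""

    def detect(self, image: np.ndarray, key: bytes) -> Tuple[bool, Optional[bytes]]:
        """Decide whether a watermark embedded under `key` is present and,
        if so, recover the embedded message."""
```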

Relevance: 30.00%

Abstract:

Curriculum scholars and teachers working for social justice and equity have been caught up in acrimonious and polarizing debates over content, ideology and disciplinary knowledge. At the forefront in cutting through these debates and addressing the practical questions involved, this book is distinctive in looking to the technical form of the curriculum rather than its content for solutions. The editors and contributors, all leading international scholars, advance a unified, principled approach to the design of syllabus documents that aims for high quality/high equity educational outcomes and enhances teacher professionalism.

Relevance: 30.00%

Abstract:

An increasing concern over the sustainability credentials of food and fiber crops requires that farmers and their supply chain partners have access to appropriate and industry-friendly tools to measure and improve outcomes. This article focuses on one of the sustainability indicators, greenhouse gas (GHG) emissions: nine internationally accredited carbon footprint calculators were identified and compared on an outcomes basis against the same cropping data from a case-study cotton farm. The purpose of this article is to identify the most "appropriate" methodology to be applied by cotton suppliers in this regard. From the analysis of the results, we subsequently propose a new integrated model as the basis for an internationally accredited carbon footprint tool for cotton and show how the model can be applied to evaluate the emission outcomes of different farming practices.
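
For illustration only (the activity categories and emission-factor values below are hypothetical placeholders, not the accredited factors or the integrated model proposed in the article), a farm-level footprint of this kind is essentially a sum of activity data multiplied by emission factors, expressed in CO2-equivalents:

```python
# Hypothetical per-hectare activity data for a cotton crop and placeholder
# emission factors (kg CO2-e per unit of activity); real accredited tools
# use published, region-specific factors.
activity = {             # per hectare
    "diesel_L": 120.0,
    "urea_N_kg": 180.0,
    "electricity_kWh": 350.0,
}
emission_factor = {       # kg CO2-e per unit (placeholders)
    "diesel_L": 2.7,
    "urea_N_kg": 5.5,      # includes N2O from applied nitrogen
    "electricity_kWh": 0.9,
}

footprint = sum(activity[k] * emission_factor[k] for k in activity)
print(f"estimated footprint: {footprint:.0f} kg CO2-e per hectare")
```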

Relevance: 30.00%

Abstract:

The nature of construction projects and their delivery exposes participants to accidents and hazards. Safety climate serves as a frame of reference for employees to make sense of safety measures in the workplace and adapt their behaviors. Though safety climate research abounds, less effort has been made to investigate the formation of a safety climate. Exploring the formation of psychological safety climate, an operationalization of safety climate at the individual level, is an appropriate starting point. Taking the view that projects are social processes, this paper develops a conceptual framework for the formation of psychological safety climate and provides a preliminary validation. The model suggests that management can create the desired psychological safety climate through efforts from structural, perceptual, interactive, and cultural perspectives. Future empirical research can be built on the model to provide a more comprehensive and coherent picture of the determinants of safety climate.

Relevance: 30.00%

Abstract:

Unified communications as a service (UCaaS) can be regarded as a cost-effective model for on-demand delivery of unified communications services in the cloud. However, addressing security concerns has been seen as the biggest challenge to the adoption of IT services in the cloud. This study set up a cloud system via the VMware suite to emulate, in a laboratory environment, the hosting of unified communications (UC) services (the integration of two or more real-time communication systems) in the cloud. An Internet Protocol Security (IPSec) gateway was also set up to provide network-level security for UCaaS against possible security exposures. The study was aimed at analysing an implementation of UCaaS over IPSec and evaluating the latency of encrypted UC traffic while that traffic is being protected. Our test results show no measurable additional latency when IPSec is implemented with the G.711 audio codec; however, the G.722 audio codec with an IPSec implementation affects the overall performance of the UC server. These results give technical advice and guidance to those involved in security controls for UC, both on premises and in the cloud.
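
As a back-of-the-envelope illustration (typical header sizes, assuming 20 ms packetisation and ESP tunnel mode with AES-CBC and HMAC-SHA1-96; the exact overhead depends on the negotiated transforms), the per-packet cost that IPSec adds to a G.711 or G.722 voice stream can be estimated as follows:

```python
import math

def esp_tunnel_size(payload_bytes: int) -> int:
    """Approximate on-the-wire packet size once wrapped in ESP tunnel mode
    (AES-CBC + HMAC-SHA1-96); real sizes depend on the negotiated transforms."""
    rtp, udp, inner_ip = 12, 8, 20
    inner_packet = payload_bytes + rtp + udp + inner_ip
    block = 16                                    # AES block size
    to_encrypt = inner_packet + 2                 # + pad-length and next-header bytes
    padded = math.ceil(to_encrypt / block) * block
    outer_ip, esp_header, iv, icv = 20, 8, 16, 12
    return outer_ip + esp_header + iv + padded + icv

# G.711 and G.722 both run at 64 kbit/s: 160 payload bytes per 20 ms frame.
payload = 160
plain = payload + 12 + 8 + 20
secured = esp_tunnel_size(payload)
print(f"plain RTP/UDP/IP: {plain} B, over IPSec: {secured} B "
      f"(+{100 * (secured - plain) / plain:.0f}% per packet, both codecs)")
```

Both codecs run at the same bit rate, so the IPSec framing overhead per packet is identical; this sketch is only a sizing aid and says nothing about a codec's processing cost on the UC server.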

Relevance: 30.00%

Abstract:

This thesis provides two main contributions. The first is BP-TRBAC, a unified authorisation model that can support legacy systems as well as business process systems. BP-TRBAC supports specific features that are required by business process environments. It is designed to be used as an independent, enterprise-wide authorisation model, rather than as a component of the workflow system, and to serve as the main authorisation model for an organisation. The second contribution is BP-XACML, an authorisation policy language designed to represent BPM authorisation policies for business processes; the contribution also includes a policy model for BP-XACML. Using BP-TRBAC as an authorisation model together with BP-XACML as an authorisation policy language allows an organisation to manage and control authorisation requests from workflow systems and other legacy systems.
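
Purely as an illustration of the kind of request a task- and role-aware authorisation model must evaluate (the class names and rules below are hypothetical and are not the BP-TRBAC model or the BP-XACML language), consider:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user: str
    role: str
    task: str          # the business-process task being performed
    resource: str
    action: str

# Hypothetical policy state: role permissions plus a task-level constraint,
# here a simple separation-of-duty rule between two tasks.
role_permissions = {
    "clerk":   {("invoice", "create")},
    "manager": {("invoice", "approve")},
}
completed_tasks = {"alice": {"create_invoice"}}
separation_of_duty = {("create_invoice", "approve_invoice")}

def authorise(req: Request) -> bool:
    """Grant only if the role permits the action and no separation-of-duty
    constraint between tasks is violated for this user."""
    if (req.resource, req.action) not in role_permissions.get(req.role, set()):
        return False
    for earlier, later in separation_of_duty:
        if later == req.task and earlier in completed_tasks.get(req.user, set()):
            return False
    return True

print(authorise(Request("alice", "manager", "approve_invoice", "invoice", "approve")))  # False
print(authorise(Request("bob",   "manager", "approve_invoice", "invoice", "approve")))  # True
```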

Relevance: 30.00%

Abstract:

Using the dimensional reduction regularization scheme, we show that radiative corrections to the anomaly of the axial current, which is coupled to the gauge field, are absent in a supersymmetric U(1) gauge model for both 't Hooft-Veltman and Bardeen prescriptions for γ5. We also discuss the results with reference to conventional dimensional regularization. This result has significant implications with respect to the renormalizability of supersymmetric models.
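
For orientation only (sign and normalisation conventions vary between texts, and this equation is standard background rather than a result of the paper), the one-loop axial anomaly for a single Dirac fermion of charge e coupled to an abelian gauge field reads

```latex
\partial_\mu j^\mu_5 \;=\; \frac{e^2}{8\pi^2}\, F_{\mu\nu}\widetilde{F}^{\mu\nu},
\qquad
\widetilde{F}^{\mu\nu} \;=\; \tfrac{1}{2}\,\varepsilon^{\mu\nu\rho\sigma} F_{\rho\sigma}.
```

The abstract's statement is that, in the supersymmetric U(1) model and within dimensional reduction, this lowest-order coefficient receives no radiative corrections for either the 't Hooft-Veltman or the Bardeen prescription for γ5.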

Relevance: 30.00%

Abstract:

The utility of near infrared spectroscopy as a non-invasive technique for the assessment of internal eating quality parameters of mandarin fruit (Citrus reticulata cv. Imperial) was assessed. The calibration procedure for the attributes of TSS (total soluble solids) and DM (dry matter) was optimised with respect to the reference sampling technique, scan averaging, spectral window, data pre-treatment (in terms of derivative treatment and scatter correction routine) and regression procedure. The recommended procedure involved sampling of an equatorial position on the fruit with 1 scan per spectrum, and modified partial least squares model development on a 720–950 nm window, pre-treated as first-derivative absorbance data (gap size of 4 data points) with standard normal variate and detrend scatter correction. Calibration model performance for the attributes of TSS and DM content was encouraging (typical Rc² of >0.75 and 0.90, respectively; typical root mean square standard error of calibration of <0.4 and 0.6%, respectively), whereas that for juiciness and total acidity was unacceptable. The robustness of the TSS and DM calibrations across new populations of fruit is documented in a companion study.
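
A minimal sketch of the pre-treatment and regression pipeline described above (array shapes and settings are hypothetical; a simple gap first derivative and SNV are shown, detrend is omitted, and modified PLS is approximated by ordinary PLS from scikit-learn):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def gap_first_derivative(spectra: np.ndarray, gap: int = 4) -> np.ndarray:
    """First derivative of absorbance with a gap of `gap` data points."""
    return (spectra[:, gap:] - spectra[:, :-gap]) / gap

def snv(spectra: np.ndarray) -> np.ndarray:
    """Standard normal variate: centre and scale each spectrum individually."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

# Hypothetical data: rows are fruit spectra restricted to the 720-950 nm window,
# y holds laboratory reference values (e.g. TSS in %Brix).
X = np.random.default_rng(1).random((60, 230))    # placeholder absorbance spectra
y = np.random.default_rng(2).random(60) * 5 + 8   # placeholder TSS values

X_pre = snv(gap_first_derivative(X, gap=4))
model = PLSRegression(n_components=6).fit(X_pre, y)
rmsec = float(np.sqrt(np.mean((model.predict(X_pre).ravel() - y) ** 2)))
print(f"RMSEC on the calibration set: {rmsec:.2f}")
```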

Relevance: 30.00%

Abstract:

We compared daily net radiation (Rn) estimates from 19 methods with the ASCE-EWRI Rn estimates in two climates: Clay Center, Nebraska (sub-humid) and Davis, California (semi-arid) for the calendar year. The performances of all 20 methods, including the ASCE-EWRI Rn method, were then evaluated against Rn data measured over a non-stressed maize canopy during two growing seasons in 2005 and 2006 at Clay Center. Methods differ in terms of inputs, structure, and equation intricacy. Most methods differ in estimating the cloudiness factor, emissivity (e), and calculating net longwave radiation (Rnl). All methods use albedo (a) of 0.23 for a reference grass/alfalfa surface. When comparing the performance of all 20 Rn methods with measured Rn, we hypothesized that the a values for grass/alfalfa and non-stressed maize canopy were similar enough to only cause minor differences in Rn and grass- and alfalfa-reference evapotranspiration (ETo and ETr) estimates. The measured seasonal average a for the maize canopy was 0.19 in both years. Using a = 0.19 instead of a = 0.23 resulted in 6% overestimation of Rn. Using a = 0.19 instead of a = 0.23 for ETo and ETr estimations, the 6% difference in Rn translated to only 4% and 3% differences in ETo and ETr, respectively, supporting the validity of our hypothesis. Most methods had good correlations with the ASCE-EWRI Rn (r² > 0.95). The root mean square difference (RMSD) was less than 2 MJ m⁻² d⁻¹ between 12 methods and the ASCE-EWRI Rn at Clay Center and between 14 methods and the ASCE-EWRI Rn at Davis. The performance of some methods showed variations between the two climates. In general, r² values were higher for the semi-arid climate than for the sub-humid climate. Methods that use dynamic e as a function of mean air temperature performed better in both climates than those that calculate e using actual vapor pressure. The ASCE-EWRI-estimated Rn values had one of the best agreements with the measured Rn (r² = 0.93, RMSD = 1.44 MJ m⁻² d⁻¹), and estimates were within 7% of the measured Rn. The Rn estimates from six methods, including the ASCE-EWRI, were not significantly different from measured Rn. Most methods underestimated measured Rn by 6% to 23%. Some of the differences between measured and estimated Rn were attributed to the poor estimation of Rnl. We conducted sensitivity analyses to evaluate the effect of Rnl on Rn, ETo, and ETr. The Rnl effect on Rn was linear and strong, but its effect on ETo and ETr was subsidiary. Results suggest that the Rn data measured over green vegetation (e.g., irrigated maize canopy) can be an alternative Rn data source for ET estimations when measured Rn data over the reference surface are not available. In the absence of measured Rn, another alternative would be using one of the Rn models that we analyzed when all the input variables are not available to solve the ASCE-EWRI Rn equation. Our results can be used to provide practical information on which method to select based on data availability for reliable estimates of daily Rn in climates similar to Clay Center and Davis.
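
A minimal sketch of the daily net radiation calculation in the widely published FAO-56 / ASCE standardized form, with hypothetical input values; the albedo comparison at the end mirrors the sensitivity discussed above (changing a from 0.23 to 0.19 raises Rn by roughly 6%):

```python
import numpy as np

SIGMA = 4.903e-9  # Stefan-Boltzmann constant, MJ K^-4 m^-2 d^-1

def net_radiation(Rs, Rso, Tmax_C, Tmin_C, ea, albedo=0.23):
    """Daily net radiation (MJ m^-2 d^-1) as net shortwave minus net longwave."""
    Rns = (1.0 - albedo) * Rs                        # net shortwave
    tmaxK4 = (Tmax_C + 273.16) ** 4
    tminK4 = (Tmin_C + 273.16) ** 4
    Rnl = (SIGMA * (tmaxK4 + tminK4) / 2.0           # net longwave
           * (0.34 - 0.14 * np.sqrt(ea))
           * (1.35 * Rs / Rso - 0.35))
    return Rns - Rnl

# Illustrative mid-summer day (hypothetical values): Rs and Rso in MJ m^-2 d^-1,
# temperatures in deg C, actual vapor pressure ea in kPa.
Rs, Rso, Tmax, Tmin, ea = 25.0, 30.0, 32.0, 18.0, 2.0
rn_grass = net_radiation(Rs, Rso, Tmax, Tmin, ea, albedo=0.23)
rn_maize = net_radiation(Rs, Rso, Tmax, Tmin, ea, albedo=0.19)
print(f"Rn (a=0.23): {rn_grass:.2f}  Rn (a=0.19): {rn_maize:.2f}  "
      f"difference: {100 * (rn_maize / rn_grass - 1):.1f}%")
```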

Relevance: 30.00%

Abstract:

Drug Analysis without Primary Reference Standards: Application of LC-TOFMS and LC-CLND to Biofluids and Seized Material

Primary reference standards for new drugs, metabolites, designer drugs or rare substances may not be obtainable within a reasonable period of time, or their availability may be hindered by extensive administrative requirements. Standards are usually costly and may have a limited shelf life. Finally, many compounds are not available commercially and sometimes not at all. A new approach within forensic and clinical drug analysis involves substance identification based on accurate mass measurement by liquid chromatography coupled with time-of-flight mass spectrometry (LC-TOFMS) and quantification by LC coupled with chemiluminescence nitrogen detection (LC-CLND), which gives an equimolar response to nitrogen. Formula-based identification relies on the fact that the accurate mass of an ion from a chemical compound corresponds to the elemental composition of that compound. Single-calibrant nitrogen-based quantification is feasible with a nitrogen-specific detector, since approximately 90% of drugs contain nitrogen. A method was developed for toxicological drug screening in 1 ml urine samples by LC-TOFMS. A large target database of exact monoisotopic masses was constructed, representing the elemental formulae of reference drugs and their metabolites. Identification was based on matching the sample component's measured parameters with those in the database, including accurate mass and retention time, if available. In addition, an algorithm for isotopic pattern match (SigmaFit) was applied. Differences in ion abundance in urine extracts did not affect the mass accuracy or the SigmaFit values. For routine screening practice, a mass tolerance of 10 ppm and a SigmaFit tolerance of 0.03 were established. Seized street drug samples were analysed directly by LC-TOFMS and LC-CLND, using a dilute-and-shoot approach. In the quantitative analysis of amphetamine, heroin and cocaine findings, the mean relative difference between the results of LC-CLND and the reference methods was only 11%. In blood specimens, liquid-liquid extraction recoveries for basic lipophilic drugs were first established, and the validity of the generic, extraction-recovery-corrected, single-calibrant LC-CLND was then verified with proficiency test samples. The mean accuracy was 24% and 17% for plasma and whole blood samples, respectively, with all results falling within the confidence range of the reference concentrations. Further, metabolic ratios for the opioid drug tramadol were determined in a pharmacogenetic study setting. Extraction recovery estimation, based on model compounds with similar physicochemical characteristics, produced clinically feasible results without reference standards.
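
Two of the calculations the abstract relies on are easy to sketch: the accurate-mass match in parts per million against a target database, and single-calibrant quantification from an equimolar nitrogen response. The 10 ppm tolerance follows the abstract; the compound masses and peak areas below are hypothetical, and the isotopic-pattern (SigmaFit) score is not reproduced here.

```python
def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    """Mass accuracy of a measured ion against its elemental-formula mass."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Identification: accept a database hit if the mass error is within 10 ppm
# (the SigmaFit isotopic-pattern check used in the thesis is not shown).
target_db = {"tramadol [M+H]+": 264.1958}          # hypothetical target entry
measured = 264.1945
hits = {name: ppm_error(measured, mz) for name, mz in target_db.items()
        if abs(ppm_error(measured, mz)) <= 10.0}
print(hits)

def clnd_concentration(area_analyte: float, n_atoms_analyte: int, mw_analyte: float,
                       area_cal: float, n_atoms_cal: int, mw_cal: float,
                       conc_cal_mg_l: float) -> float:
    """Single-calibrant CLND quantification: the detector response is taken as
    proportional to moles of nitrogen, so results scale with the N atom count."""
    mol_n_cal = conc_cal_mg_l / mw_cal * n_atoms_cal          # mmol N per litre
    mol_n_analyte = mol_n_cal * area_analyte / area_cal
    return mol_n_analyte / n_atoms_analyte * mw_analyte        # mg per litre

# Hypothetical areas and approximate molar masses (single-nitrogen compounds).
print(clnd_concentration(1.8e5, 1, 263.4, 1.0e5, 1, 151.2, 10.0))
```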