615 results for least privilege


Relevance:

20.00%

Publisher:

Summary:

In this paper we pursue the task of aligning an ensemble of images in an unsupervised manner. This task has been commonly referred to as “congealing” in the literature. A form of congealing, using a least-squares criterion, has recently been demonstrated to have desirable properties over conventional congealing. Least-squares congealing can be viewed as an extension of the Lucas & Kanade (LK) image alignment algorithm. It is well understood that the alignment performance of the LK algorithm, when aligning a single image with another, is theoretically and empirically equivalent for additive and compositional warps. In this paper we: (i) demonstrate that this equivalence does not hold for the extended case of congealing, (ii) characterize the inherent drawbacks associated with least-squares congealing when dealing with large numbers of images, and (iii) propose a novel method for circumventing these limitations through the application of an inverse-compositional strategy that maintains the attractive properties of the original method while being able to handle very large numbers of images.
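The least-squares congealing update is easy to sketch in one dimension. The toy below is not the authors' code: it uses the basic forward-additive LK update rather than the inverse-compositional variant the paper proposes, and it assumes pure translational warps. It aligns a set of shifted 1-D signals to their evolving ensemble mean.

```python
import numpy as np

# Toy least-squares congealing in 1-D, assuming pure translations.
# Each signal is aligned to the evolving ensemble mean with an
# LK-style linearization: for warped signal w with gradient g, the
# shift update minimizing ||w - d*g - mean||^2 is d = g.(w - mean)/g.g.

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
template = np.exp(-x**2)                       # a smooth bump
true_shifts = rng.uniform(-0.5, 0.5, size=8)

def warp(signal, d):
    """Translate a sampled signal by d (in x units) via interpolation."""
    return np.interp(x + d, x, signal)

signals = [warp(template, d) for d in true_shifts]
est = np.zeros(len(signals))                   # current shift estimates

for _ in range(20):                            # congealing iterations
    warped = np.array([warp(s, -est[i]) for i, s in enumerate(signals)])
    mean = warped.mean(axis=0)
    for i in range(len(signals)):
        g = np.gradient(warped[i], x)
        est[i] += g @ (warped[i] - mean) / (g @ g)  # LK update

# Relative shifts should match the true ones up to a common offset.
err = (est - est.mean()) - (true_shifts - true_shifts.mean())
print(float(np.abs(err).max()))                # residual relative-shift error
```

The inverse-compositional variant differs in where the gradient is evaluated (on the template side, so it can be precomputed), which is the source of the efficiency gain the paper exploits for large ensembles.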

Information Technology (IT) is an important resource that can facilitate growth and development in both developed and developing economies. The forces of globalisation are widening the digital divide between developed and developing economies, and the least developed economies (LDEs) are the most vulnerable within this environment. Intense competition for IT resources means that LDEs need a deeper understanding of how to source and evaluate their IT-related efforts; this understanding puts LDEs in a better position to source funding from various stakeholders and to promote localized investment in IT. This study presents a complementary approach to securing better IT-related business value in organizations in the LDEs. It further evaluates how IT and its complementaries need to be managed within the LDEs. Analysis of data collected from five LDEs shows that organizations that invest in IT and related complementaries are able to improve their business processes. The data also suggest that improved business processes lead to overall business improvements. The above is only possible if organizations adopt IT, make related changes in the complementary resources within the established culture, and localize the required changes.

A vertex-centred finite volume method (FVM) for the Cahn-Hilliard (CH) and recently proposed Cahn-Hilliard-reaction (CHR) equations is presented. Information at control volume faces is computed using a high-order least-squares approach based on Taylor series approximations. This least-squares problem explicitly includes the variational boundary condition (VBC), ensuring that the discrete equations satisfy all of the boundary conditions. We use this approach to solve the CH and CHR equations in one and two dimensions and show that our scheme satisfies the VBC to at least second order. For the CH equation we show evidence of conservative, gradient-stable solutions; however, for the CHR equation, strict gradient stability is more challenging to achieve.
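The face reconstruction step can be sketched in one dimension (a minimal illustration, not the paper's code, and without the variational boundary condition): nodal values near a face are fitted with a least-squares Taylor polynomial about the face, and the fit is evaluated there.

```python
import numpy as np

# Least-squares face reconstruction: fit a degree-`order` Taylor
# expansion about the face location xf to nearby nodal values, then
# read off the reconstructed value at xf (the constant coefficient).

def face_value(xs, us, xf, order=2):
    A = np.vander(xs - xf, order + 1, increasing=True)  # [1, (x-xf), ...]
    coef, *_ = np.linalg.lstsq(A, us, rcond=None)
    return coef[0]                                      # value at xf

# Check the convergence order on a smooth profile as the mesh refines.
f = np.sin
errs = []
for n in (20, 40, 80):
    x = np.linspace(0.0, 1.0, n)
    xf = 0.5 * (x[n // 2] + x[n // 2 + 1])              # a face midpoint
    near = np.argsort(np.abs(x - xf))[:4]               # 4 nearest nodes
    errs.append(abs(face_value(x[near], f(x[near]), xf) - f(xf)))

rates = [float(np.log2(errs[i] / errs[i + 1])) for i in range(2)]
print(rates)   # observed convergence rates as h halves
```

The same idea carries to two dimensions with a bivariate Taylor basis; the paper's scheme additionally augments this least-squares system with the VBC as an explicit constraint.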

The decision in ASIC v Managed Investments Ltd No 3 [2012] QSC 74 provides practitioners with useful guidance on the relationship between the privileges against self-incrimination and exposure to a penalty, and the UCPR requirements for denials and non-admissions.

In the context of ambiguity resolution (AR) for Global Navigation Satellite Systems (GNSS), decorrelation among the entries of an ambiguity vector, integer ambiguity search, and ambiguity validation are three standard procedures for solving integer least-squares problems. This paper contributes to AR issues in three respects. Firstly, the orthogonality defect is introduced as a new measure of the performance of ambiguity decorrelation methods and is compared with the decorrelation number and the condition number, which are currently used as criteria for measuring the correlation of the ambiguity variance-covariance matrix. Numerically, the orthogonality defect demonstrates a slightly stronger correlation with decorrelation impact and computational efficiency than the condition number does. Secondly, the paper examines how the decorrelation number, the condition number, the orthogonality defect and the size of the ambiguity search space relate to the ambiguity search candidates and search nodes. The size of the ambiguity search space can be properly estimated if the ambiguity matrix is well decorrelated, and it is shown to be a significant parameter in the ambiguity search process. Thirdly, a new ambiguity resolution scheme is proposed to improve ambiguity search efficiency by controlling the size of the ambiguity search space. The new AR scheme combines the LAMBDA search and validation procedures, which results in a much smaller search space and higher computational efficiency while retaining the same AR validation outcomes. In fact, the new scheme can deal with the case where there is only one candidate, while existing search methods require at least two candidates. If there is more than one candidate, the new scheme reverts to the usual ratio-test procedure. Experimental results indicate that this combined method can indeed improve ambiguity search efficiency for both single-constellation and dual-constellation cases, showing its potential for processing high-dimension integer parameters in multi-GNSS environments.
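A toy version of the search-and-validation step reads as follows. It is illustrative only: a brute-force enumeration around the rounded float solution stands in for the LAMBDA search, and the float ambiguities, covariance and threshold value are all assumptions.

```python
import numpy as np
from itertools import product

# Integer least-squares search with ratio-test validation: candidates
# are scored by the quadratic form (a - a_hat)^T Q^{-1} (a - a_hat),
# and the best candidate is accepted when the second-best is
# sufficiently worse (the ratio test).

a_hat = np.array([3.1, -1.7, 5.4])            # float ambiguity estimates
Q = np.array([[0.30, 0.10, 0.05],
              [0.10, 0.25, 0.08],
              [0.05, 0.08, 0.40]])            # their covariance (assumed)
Qi = np.linalg.inv(Q)

base = np.round(a_hat).astype(int)
cands = []
for off in product((-1, 0, 1), repeat=3):     # search box around rounding
    a = base + np.array(off)
    d = a - a_hat
    cands.append((float(d @ Qi @ d), a))
cands.sort(key=lambda c: c[0])

best, second = cands[0], cands[1]
ratio = second[0] / best[0]
accepted = ratio > 2.0                        # an assumed fixed threshold
print(best[1], round(ratio, 2), accepted)
```

The scheme described above differs precisely in how it shrinks this search box using the validation criterion itself, so that in the extreme case only a single candidate needs to be enumerated.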

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, for which a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data, and the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates; the stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the Bayesian approach, together with the Monte Carlo simulation technique, provides a powerful tool which makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
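The contrast between ordinary and weighted least squares can be sketched on a synthetic build-up data set. The power-law form B = a * D^b, the data values and the weighting rule below are illustrative assumptions, not the study's data or method.

```python
import numpy as np

# OLS vs WLS on a synthetic pollutant build-up curve B = a * D^b,
# fitted in log space where the model is linear: log B = log a + b log D.

rng = np.random.default_rng(1)
D = np.array([1., 2., 3., 5., 7., 10., 14., 21.])      # antecedent dry days
B = 2.0 * D**0.4 * np.exp(rng.normal(0, 0.1, D.size))  # noisy build-up

X = np.column_stack([np.ones_like(D), np.log(D)])      # design matrix
y = np.log(B)

# Ordinary least squares: every observation weighted equally.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares: down-weight the long-dry-spell points
# (an assumed reliability measure, purely for illustration).
w = 1.0 / (1.0 + 0.05 * D)
sw = np.sqrt(w)
beta_wls, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)

print(float(np.exp(beta_ols[0])), float(beta_ols[1]))  # a, b from OLS
print(float(np.exp(beta_wls[0])), float(beta_wls[1]))  # a, b from WLS
```

The Bayesian variant the paper favours goes further by placing distributions on the coefficients and propagating them through Monte Carlo simulation, so the prediction carries an uncertainty band rather than a single curve.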

In the corporate regulation landscape, 'meta-regulation' is a comparatively new legal approach. The limited role of state-promulgated authoritative laws in a pluralized society, together with scepticism about the role of corporate self-regulation, has driven the development of this approach. It has opened up possibilities for synthesizing corporate governance so as to add social values to corporate self-regulation. The core of this approach is the fusion of responsive and reflexive legal strategies to bring regulators and regulatees together in pursuit of a particular goal. This paper argues that meta-regulation is a potential strategy that can be successfully deployed to develop a socially responsible corporate culture for business enterprises, so that they can sustainably incorporate social, environmental and ethical values into their self-regulation. Taking Bangladeshi corporate laws as an instance, the paper also evaluates the scope for incorporating this approach into the laws of the least developed common law countries in general.

We propose a computationally efficient image border pixel based watermark embedding scheme for medical images. We consider the border pixels of a medical image as the region of non-interest (RONI), since those pixels are of little or no interest to doctors and medical professionals irrespective of the image modality. Although the RONI is used for embedding, our proposed scheme still keeps distortion in the embedding region at a minimum by using the optimum number of least significant bit-planes for the border pixels. This not only ensures that a watermarked image is safe for diagnosis, but also helps minimize the legal and ethical concerns of altering all pixels of medical images in any manner (e.g., reversibly or irreversibly). The proposed scheme avoids the need for RONI segmentation, which incurs capacity and computational overheads. The performance of the proposed scheme has been compared with a relevant scheme in terms of embedding capacity, image perceptual quality (measured by SSIM and PSNR), and computational efficiency. Our experimental results show that the proposed scheme is computationally efficient, offers an image-content-independent embedding capacity, and maintains good image quality.
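A simplified sketch of border-pixel LSB embedding follows. It is illustrative only: the paper's scheme selects an optimum number of LSB planes, which is fixed at one here, and the helper names are our own. Watermark bits go into the least significant bit of the image's border pixels, leaving the interior (the region of interest) untouched.

```python
import numpy as np

def border_indices(shape, width=1):
    """Row/column indices of a frame of `width` pixels around the image."""
    r, c = np.indices(shape)
    mask = (r < width) | (r >= shape[0] - width) | \
           (c < width) | (c >= shape[1] - width)
    return np.nonzero(mask)

def embed(img, bits):
    """Write `bits` into the LSBs of the first border pixels.
    Capacity equals the number of border pixels (here, one bit-plane)."""
    rr, cc = border_indices(img.shape)
    out = img.copy()
    rr, cc = rr[:len(bits)], cc[:len(bits)]
    out[rr, cc] = (out[rr, cc] & ~np.uint8(1)) | bits
    return out

def extract(img, n_bits):
    """Read back the first n_bits watermark bits from the border LSBs."""
    rr, cc = border_indices(img.shape)
    return img[rr[:n_bits], cc[:n_bits]] & 1

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
marked = embed(img, bits)

assert (extract(marked, len(bits)) == bits).all()      # watermark survives
assert (marked[1:-1, 1:-1] == img[1:-1, 1:-1]).all()   # interior untouched
```

Because the border frame is defined purely by the image dimensions, no segmentation step is needed to locate the embedding region, which is the computational saving the abstract highlights.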

Purpose: To identify a 15-kDa novel hypoxia-induced secreted protein in head and neck squamous cell carcinomas (HNSCC) and to determine its role in malignant progression. Methods: We used surface-enhanced laser desorption ionization time-of-flight mass spectrometry (SELDI-TOF-MS) and tandem MS to identify a novel hypoxia-induced secreted protein in FaDu cells. We used immunoblots, real-time polymerase chain reaction (PCR), and enzyme-linked immunosorbent assay to confirm the hypoxic induction of this secreted protein as galectin-1 in cell lines and xenografts. We stained tumor tissues from 101 HNSCC patients for galectin-1, CA IX (carbonic anhydrase IX, a hypoxia marker) and CD3 (a T-cell marker). Expression of these markers was correlated to each other and to treatment outcomes. Results: SELDI-TOF studies yielded a hypoxia-induced peak at 15 kDa that proved to be galectin-1 by MS analysis. Immunoblots and PCR studies confirmed increased galectin-1 expression by hypoxia in several cancer cell lines. Plasma levels of galectin-1 were higher in tumor-bearing severe combined immunodeficiency (SCID) mice breathing 10% O2 than in mice breathing room air. In HNSCC patients, there was a significant correlation between galectin-1 and CA IX staining (P = .01) and a strong inverse correlation between galectin-1 and CD3 staining (P = .01). Expression of galectin-1 and CD3 were significant predictors of overall survival on multivariate analysis. Conclusion: Galectin-1 is a novel hypoxia-regulated protein and a prognostic marker in HNSCC. This study presents a new mechanism by which hypoxia can affect the malignant progression and therapeutic response of solid tumors: by regulating the secretion of proteins that modulate immune privilege. © 2005 by American Society of Clinical Oncology.

Due to the health impacts caused by exposure to air pollutants in urban areas, monitoring and forecasting of air quality parameters have become an important topic in atmospheric and environmental research. Knowledge of the dynamics and complexity of air pollutant behaviour has made artificial intelligence models a useful tool for more accurate pollutant concentration prediction. This paper focuses on an innovative method of daily air pollution prediction using a combination of Support Vector Machine (SVM) as predictor and Partial Least Squares (PLS) as a data selection tool, based on measured CO concentrations. The CO concentrations at the Rey monitoring station in the south of Tehran, from January 2007 to February 2011, have been used to test the effectiveness of this method. The hourly CO concentrations have been predicted using the SVM and the hybrid PLS–SVM models. Similarly, daily CO concentrations have been predicted based on the aforementioned four years of measured data. Results demonstrate that both models have good prediction ability; however, the hybrid PLS–SVM is more accurate. In the analysis presented in this paper, statistical estimators including the relative mean error, root mean squared error and mean absolute relative error have been employed to compare the performance of the models. The errors decrease after size reduction, and the coefficient of determination increases from 56–81% for the SVM model to 65–85% for the hybrid PLS–SVM model. It was also found that the hybrid PLS–SVM model required less computational time than the SVM model, as expected, supporting the more accurate and faster prediction ability of the hybrid PLS–SVM model.
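The data-reduction half of the hybrid can be sketched with a numpy-only, one-component PLS1 step, which projects the predictors onto the direction of maximum covariance with the response. For brevity, a plain least-squares fit on the PLS score stands in here for the SVM regressor used in the paper, and all data below are synthetic.

```python
import numpy as np

# One-component PLS1 (a single NIPALS step for a scalar response),
# followed by a least-squares fit on the latent score. The linear fit
# is a stand-in for the SVM predictor, purely to keep the sketch short.

rng = np.random.default_rng(2)
n, p = 200, 6
X = rng.normal(size=(n, p))
y = X @ np.array([1.5, -2.0, 0.0, 0.0, 0.5, 0.0]) + rng.normal(0, 0.3, n)

Xc, yc = X - X.mean(0), y - y.mean()        # centre the data

w = Xc.T @ yc
w /= np.linalg.norm(w)                      # PLS weight vector
t = Xc @ w                                  # latent score (one component)
b = (t @ yc) / (t @ t)                      # regress y on the score

y_hat = y.mean() + b * ((X - X.mean(0)) @ w)
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(float(r2), 3))                  # coefficient of determination
```

Feeding the low-dimensional scores to the predictor, rather than all raw inputs, is what reduces the training time of the downstream model, which matches the abstract's observation that the hybrid is both faster and more accurate.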

A statistical approach is used in the design of a battery-supercapacitor energy storage system for a wind farm. The design exploits the technical merits of the two energy storage media, in terms of the differences in their specific power and energy densities, and their ability to accommodate different rates of change in the charging/discharging powers. By treating the input wind power as random and using a proposed coordinated power flow control strategy for the battery and the supercapacitor, the approach evaluates the energy storage capacities, the corresponding expected life cycle cost/year of the storage media, and the expected cost/year of unmet power dispatch. A computational procedure is then developed for the design of a least-cost/year hybrid energy storage system to realize wind power dispatch at a specified confidence level.
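The statistical sizing idea can be sketched with a small Monte Carlo experiment. Everything below is an illustrative assumption rather than the paper's procedure: wind power is drawn at random, the battery absorbs the slow component of the storage power demand and the supercapacitor the fast residual, and candidate power ratings are scored by storage cost plus a penalty on unmet dispatch.

```python
import numpy as np

rng = np.random.default_rng(3)

def unmet_fraction(batt_mw, sc_mw, n_trials=500, horizon=96):
    """Monte Carlo estimate of the fraction of intervals in which the
    two storage media cannot cover the dispatch shortfall/surplus."""
    unmet = 0.0
    for _ in range(n_trials):
        wind = np.clip(rng.normal(5.0, 2.0, horizon), 0.0, None)  # MW
        demand = 5.0 - wind              # power the storage must cover
        slow = np.convolve(demand, np.ones(8) / 8, mode="same")
        fast = demand - slow
        # dispatch is missed when either medium's rating is exceeded
        miss = (np.abs(slow) > batt_mw) | (np.abs(fast) > sc_mw)
        unmet += miss.mean()
    return unmet / n_trials

best = None
for batt in (2.0, 4.0, 6.0):             # candidate battery ratings, MW
    for sc in (1.0, 2.0, 3.0):           # candidate supercap ratings, MW
        cost = 50 * batt + 30 * sc + 1000 * unmet_fraction(batt, sc)
        if best is None or cost < best[0]:
            best = (cost, batt, sc)

print(best)   # least-cost (cost, battery MW, supercap MW) on this grid
```

Splitting the demand by frequency content mirrors the coordinated power flow control idea: the slowly varying component suits the battery's energy density, while the fast residual suits the supercapacitor's power density.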

The paper provides a systematic approach to designing the laboratory phase of a multiphase experiment, taking into account previous phases. General principles are outlined for experiments in which orthogonal designs can be employed. Multiphase experiments occur widely, although their multiphase nature is often not recognized. The need to randomize, in the laboratory phase, the material produced from the first phase is emphasized. Factor-allocation diagrams are used to depict the randomizations in a design, and the use of skeleton analysis-of-variance (ANOVA) tables to evaluate their properties is discussed. The methods are illustrated using a scenario and a case study. A basis for categorizing designs is suggested. This article has supplementary material online.
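The laboratory-phase randomization emphasized above can be illustrated with a toy allocation (the block/run structure and counts are assumptions, not the paper's case study): samples produced in a first, field phase are randomly allocated to laboratory runs and positions, with field blocks randomized to runs and samples randomized within each run.

```python
import numpy as np

# Two-tier laboratory-phase randomization: field blocks are randomized
# to laboratory runs, then each block's samples are randomized to
# positions within its run, preserving the first-phase structure.

rng = np.random.default_rng(4)
n_blocks, n_samples = 3, 4           # field-phase structure (illustrative)

runs = rng.permutation(n_blocks)     # which lab run gets which field block
layout = {}
for run, block in enumerate(runs):
    order = rng.permutation(n_samples)           # positions within the run
    layout[f"run{run + 1}"] = [f"block{block + 1}-sample{s + 1}"
                               for s in order]

for run, samples in layout.items():
    print(run, samples)
```

Keeping the field-phase blocks intact within runs is what allows the two randomizations to be depicted separately in a factor-allocation diagram and their variance contributions separated in the skeleton ANOVA.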