270 results for Rejection-sampling Algorithm
Abstract:
The 1951 Convention Relating to the Status of Refugees and the 1967 Protocol Relating to the Status of Refugees are the two primary international legal instruments that states use to process asylum seekers' claims to refugee status. However, in Southeast Asia only two states have acceded to these instruments. This is seemingly paradoxical for a region that has hosted a large number of asylum seekers who, as a result, are forced to live as 'illegal migrants'. This book examines the region's continued rejection of international refugee law through extensive archival analysis and argues that this rejection was shaped by the region's response to its largest refugee crisis of the post-1945 era: the Indochinese refugee crisis of 1975 to 1996. The result is a seminal study of Southeast Asia's relationship with international refugee law and the impact this has had on states surrounding the region, the UNHCR and asylum seekers themselves.
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift-register-based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses that lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
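The masking step described above, XORing the transmission with a pseudo-random keystream, can be sketched with a toy linear feedback shift register. This is an illustrative sketch only: the register size, taps and seed below are arbitrary assumptions, not the actual CSA-SC design.

```python
# Toy LFSR-based keystream masking (illustrative assumptions throughout;
# this is NOT the real CSA-SC register layout).

def lfsr_keystream(state: int, taps=(0, 2, 3, 5), nbits: int = 16):
    """Yield one pseudo-random bit per step from a Fibonacci LFSR."""
    while True:
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1       # feedback = XOR of tap bits
        yield state & 1
        state = (state >> 1) | (fb << (nbits - 1))

def mask(data: bytes, seed: int) -> bytes:
    """XOR data with the keystream; applying it twice recovers the input."""
    ks = lfsr_keystream(seed)
    out = bytearray()
    for byte in data:
        k = 0
        for i in range(8):
            k |= next(ks) << i           # assemble one keystream byte
        out.append(byte ^ k)
    return bytes(out)
```

Because masking is a plain XOR, decryption is the same operation with the same seed, which is why weaknesses that make two keystreams coincide (slid pairs, shifted keystreams) are directly exploitable.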
Abstract:
Organisations are constantly seeking new ways to improve operational efficiency. This study investigates a novel way to identify potential efficiency gains in business operations: observing how operations were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. The paper demonstrates how such trade-offs can be incorporated into the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, with an objective function represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
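As a rough illustration of the genetic-algorithm idea above, exploring execution scenarios against a cost-based objective function, the sketch below evolves task-to-resource assignments to minimise a toy duration-based cost. The encoding, cost model and parameters are assumptions for illustration, not the paper's cost environment.

```python
import random

# Minimal GA sketch (hypothetical encoding): each gene assigns a task to a
# resource; the objective is a cost proportional to the case duration
# (makespan). All weights and operators are illustrative assumptions.

def cost(assignment, durations, rate=1.0):
    """Cost = makespan across resources, scaled by a cost rate."""
    makespan = max(sum(durations[t] for t, r in enumerate(assignment) if r == res)
                   for res in set(assignment))
    return makespan * rate

def evolve(durations, n_resources=2, pop=20, gens=50, seed=0):
    rng = random.Random(seed)
    n = len(durations)
    population = [[rng.randrange(n_resources) for _ in range(n)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda a: cost(a, durations))
        survivors = population[: pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)               # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                  # mutation
                child[rng.randrange(n)] = rng.randrange(n_resources)
            children.append(child)
        population = survivors + children
    return min(population, key=lambda a: cost(a, durations))
```

A lower-cost assignment here corresponds to a shorter case duration with the work spread across resources, mirroring the trade-off the abstract describes.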
Abstract:
A laboratory experiment was set up in small chambers to monitor greenhouse gas emissions and determine the most suitable time for sampling. A six-treatment experiment was conducted, comprising a one-week pre-incubation followed by a one-week incubation. Samples were taken 1, 2, 3, 6 and 24 hours after closing the lids of the incubation chambers. Greenhouse gas fluxes varied strongly with sampling time: emission rates increased over the first three hours and decreased afterwards. The rates of greenhouse gas emission measured 3 hours after closing the lids were close to the mean for the 24-hour period.
Abstract:
An improved Phase-Locked Loop (PLL) for extracting the phase and frequency of the fundamental component of a highly distorted grid voltage is presented. The structure of the single-phase PLL is based on the Synchronous Reference Frame (SRF) PLL and uses an All-Pass Filter (APF) to generate the quadrature component from the single-phase input voltage. To filter the harmonic content, a Moving Average Filter (MAF) is used, and performance is improved by designing a lead compensator and a feed-forward compensator. Simulation results are compared to show the improved performance with feed-forward. In addition, the frequency dependency of the MAF is addressed by a proposed method of adapting to the frequency, which changes the window size based on the measured frequency on a sample-by-sample basis. With this method, the speed of resizing can be reduced in order to decrease the output ripples caused by window-size variations.
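The frequency-adaptive MAF idea, a window whose length tracks one fundamental period so that harmonics at integer multiples of the fundamental average out, can be sketched as follows. The class and parameter names are assumptions; the paper's compensator design is not reproduced.

```python
from collections import deque

# Sketch of a frequency-adaptive Moving Average Filter (illustrative
# assumptions): the window spans one estimated fundamental period.

class AdaptiveMAF:
    def __init__(self, fs: float, f0: float):
        self.fs = fs                              # sampling rate (Hz)
        self.buf = deque()
        self.window = max(1, round(fs / f0))      # samples per period

    def set_frequency(self, f0: float):
        """Resize the window on a sample-by-sample basis as f0 drifts."""
        self.window = max(1, round(self.fs / f0))

    def step(self, x: float) -> float:
        """Push one sample and return the average over the window."""
        self.buf.append(x)
        while len(self.buf) > self.window:
            self.buf.popleft()
        return sum(self.buf) / len(self.buf)
```

Once the buffer spans a full period, any zero-mean periodic disturbance that fits a whole number of times into the window cancels, leaving the fundamental-related component.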
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of a single independent VM, little attention has been paid to the live migration of multiple interacting VMs. Live migration is strongly influenced by network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
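The defining step of a random-key GA is decoding a chromosome of floats in [0, 1) into a permutation by sorting, which keeps standard crossover operators valid. A minimal sketch, with an assumed toy cost that penalises migrating a VM before one it depends on (the VM data and penalty are illustrative, not the paper's model):

```python
# Random-key decoding, the core of an RKGA (illustrative sketch).

def decode(keys):
    """Sort VM indices by their random key to obtain a migration order."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def total_migration_cost(order, sizes, deps, penalty=2.0):
    """Toy cost: transfer each VM's memory, plus a penalty whenever a VM
    is migrated before a VM it depends on (assumed bandwidth overhead)."""
    pos = {vm: p for p, vm in enumerate(order)}
    cost = sum(sizes[vm] for vm in order)
    for vm, needed in deps.items():
        for d in needed:
            if pos[vm] < pos[d]:
                cost += penalty
    return cost
```

A GA would evolve the key vectors with ordinary real-valued crossover and mutation, decoding each chromosome to an order before evaluating it.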
Abstract:
The work presented in this report aims to implement a cost-effective offline mission path planner for aerial inspection of large linear infrastructure. Like most real-world optimisation problems, mission path planning involves a number of objectives that ideally should be minimised simultaneously. In practice, the objectives of an optimisation problem conflict with each other, and minimising one of them makes it impossible to minimise the others. This leads to the need to find a set of optimal solutions for the problem; once such a set of available options is produced, the mission planning problem reduces to a decision-making problem for the mission specialists, who choose the solution that best fits the requirements of the mission. The goal of this work is therefore to develop a Multi-Objective optimisation tool that provides the mission specialists with a set of optimal solutions for the inspection task, amongst which the final trajectory will be chosen, given the environment data, the mission requirements and the definition of the objectives to minimise. The set of all optimal solutions of a Multi-Objective optimisation problem is called the Pareto-optimal front. For any Pareto-optimal solution, it is impossible to improve one objective without worsening at least one other. Amongst a set of Pareto-optimal solutions, no solution is absolutely better than another, and the final choice must be a trade-off between the objectives of the problem. Multi-Objective Evolutionary Algorithms (MOEAs) are recognised as a convenient method for exploring the Pareto-optimal front of Multi-Objective optimisation problems. Their efficiency is due to their population-based architecture, which allows several optimal solutions to be found in a single run.
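The Pareto-dominance relation described above can be made concrete in a few lines (all objectives minimised; a minimal sketch):

```python
# Pareto dominance and front extraction (all objectives minimised).

def dominates(a, b):
    """a dominates b iff a is no worse in every objective and strictly
    better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only the non-dominated points: the Pareto-optimal front."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Every point on the returned front is a valid trade-off; choosing among them is exactly the decision-making step left to the mission specialists.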
Abstract:
Effluent from sewage treatment plants has been associated with a range of pollutant effects. Depending on the influent composition and treatment processes, the effluent may contain a myriad of different chemicals, which makes monitoring very complex. In this study we aimed to monitor relatively polar organic pollutant mixtures using a combination of passive sampling techniques and a set of biochemistry-based assays covering acute bacterial toxicity (Microtox™), phytotoxicity (Max-I-PAM assay) and genotoxicity (umuC assay). All of the assays were able to detect effects in the samples and allowed a comparison of the two plants as well as of the two sampling periods. Distinct improvements in water quality were observed in one of the plants as a result of an upgrade to a UV disinfection system: the sample enrichment required to induce a 50% response in the Microtox™ assay increased from 24× to 84×, the enrichment required to induce a 50% reduction in photosynthetic yield increased from 30× to 125×, and the genotoxicity observed in the first sampling period was eliminated. We therefore propose that biochemical assay techniques in combination with time-integrated passive sampling can substantially contribute to the monitoring of polar organic toxicants in STP effluents.
Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have predominantly been applied to parameter estimation problems and less to model choice problems, due to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), where the posterior means of the model parameters estimated through regression formed the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. The algorithm was applied to a validation example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates its utility in inferring the preference of particular transmission models for the pathogens.
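The "repeated sampling from the model" at the heart of ABC can be sketched with the basic rejection variant (not the paper's regression-adjusted model-choice algorithm); the toy Bernoulli model and zero tolerance below are assumptions for illustration:

```python
import random

# Basic ABC rejection sampler (illustrative sketch). Toy problem: infer a
# Bernoulli success probability theta from the observed number of
# successes, with no likelihood evaluation at all.

def abc_rejection(observed, n_trials, n_samples=2000, tol=0, seed=1):
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.random()                           # draw from U(0,1) prior
        sim = sum(rng.random() < theta                 # simulate the model
                  for _ in range(n_trials))
        if abs(sim - observed) <= tol:                 # keep close simulations
            accepted.append(theta)
    return accepted
```

With a zero tolerance and a discrete summary, the accepted draws here are exact posterior samples; the paper's contribution is in constructing better summary statistics (regression-estimated posterior means and model probabilities) when exact matching is infeasible.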
Abstract:
Network Real-Time Kinematic (NRTK) is a technology that can provide centimeter-level accuracy positioning services in real time, enabled by a network of Continuously Operating Reference Stations (CORS). The location-oriented CORS placement problem is an important problem in the design of an NRTK, as it directly affects not only the installation and operational cost of the NRTK but also the quality of the positioning services it provides. This paper presents a Memetic Algorithm (MA) for the location-oriented CORS placement problem, which hybridizes the powerful explorative search capacity of a genetic algorithm with the efficient and effective exploitative search capacity of a local optimization. Experimental results show that the MA performs better than existing approaches. We also conduct an empirical study of the scalability of the MA, the effectiveness of the hybridization technique and the selection of the crossover operator in the MA.
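The hybridization described above, explorative genetic search refined by exploitative local search, can be sketched on a toy 1-D placement problem: pick k station sites from candidate positions to minimise the worst user-to-nearest-station distance. The problem data and operators are illustrative assumptions, not the paper's CORS model.

```python
import random

# Memetic algorithm sketch: every individual produced by the genetic
# (explorative) step is refined by greedy local search (exploitative step).

def coverage_cost(sites, users):
    """Worst distance from any user to its nearest chosen site."""
    return max(min(abs(u - s) for s in sites) for u in users)

def local_search(sites, candidates, users):
    """Greedily swap one site at a time while the cost improves."""
    sites = list(sites)
    improved = True
    while improved:
        improved = False
        for i in range(len(sites)):
            for c in candidates:
                if c in sites:
                    continue
                trial = sites[:i] + [c] + sites[i + 1:]
                if coverage_cost(trial, users) < coverage_cost(sites, users):
                    sites, improved = trial, True
    return sites

def memetic(candidates, users, k=2, pop=10, gens=20, seed=0):
    rng = random.Random(seed)
    population = [local_search(rng.sample(candidates, k), candidates, users)
                  for _ in range(pop)]
    for _ in range(gens):
        p1, p2 = rng.sample(population, 2)              # explorative step
        genes = sorted(set(p1 + p2))
        child = rng.sample(genes, k) if len(genes) >= k else list(p1)
        child = local_search(child, candidates, users)  # exploitative step
        worst = max(range(pop),
                    key=lambda i: coverage_cost(population[i], users))
        if coverage_cost(child, users) < coverage_cost(population[worst], users):
            population[worst] = child
    return min(population, key=lambda s: coverage_cost(s, users))
```

The local-search pass is what distinguishes a memetic algorithm from a plain GA: the population only ever contains locally optimal placements.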
Abstract:
Extracting frequent subtrees from tree-structured data has important applications in Web mining. In this paper, we introduce a novel canonical form for rooted labelled unordered trees, the balanced-optimal-search canonical form (BOCF), which handles the isomorphism problem efficiently. Using BOCF, we define a tree-structure-guided enumeration scheme that systematically enumerates only the valid subtrees. Finally, we present the balanced optimal search tree miner (BOSTER) algorithm, based on BOCF and the proposed enumeration scheme, for finding frequent induced subtrees in a database of labelled rooted unordered trees. Experiments on real datasets compare the efficiency of BOSTER with two state-of-the-art algorithms for mining induced unordered subtrees, HybridTreeMiner and UNI3. The results are encouraging.
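A standard way to obtain a canonical form for labelled unordered trees, recursively sorting child encodings so that isomorphic trees map to the same string, can be sketched as follows. This illustrates the isomorphism test in principle but is not BOCF itself, whose construction is the paper's contribution.

```python
# Canonical string for a rooted labelled unordered tree (generic sketch,
# not BOCF): isomorphic trees yield identical strings because children
# are encoded recursively and then sorted.

def canonical(label, children):
    """children is a list of (label, children) subtrees."""
    encoded = sorted(canonical(l, c) for l, c in children)
    return "(" + label + "".join(encoded) + ")"
```

Comparing canonical strings replaces an expensive tree-isomorphism check with a string equality test, which is why a cheap-to-compute canonical form matters for subtree enumeration.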