Abstract:
A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme in which each participant's share can be used several times, reducing the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, calling for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction only allows a single secret to be shared per threshold value. In this article we combine the two previous approaches to design a multiple-time verifiable multi-secret sharing scheme in which several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with respect to the previous schemes.
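For context, the sketch below shows a basic Shamir-style (t, n) threshold secret sharing over a prime field; it is only an illustrative baseline, not the Shao-Cao, Chan-Chang, or combined construction described in the abstract, and the field prime and parameters are arbitrary choices.

import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus (illustrative choice)

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = (num * -xm) % P
                den = (den * (xj - xm)) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret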
Abstract:
Interest in utilising multiple heterogeneous Unmanned Aerial Vehicles (UAVs) in close proximity is growing rapidly. This presents many challenges for the effective coordination and management of these UAVs: converting the current n-to-1 paradigm (n operators operating a single UAV) to the 1-to-n paradigm (one operator managing n UAVs). This paper introduces an Information Abstraction methodology used to produce the functional capability framework initially proposed by Chen et al. and its Level Of Detail (LOD) indexing scale. The framework was validated by comparing operator workload and Situation Awareness (SA) across three experimental scenarios involving multiple heterogeneous autonomous UAVs. The first scenario was set in a high LOD configuration with highly abstracted UAV functional information; the second in a mixed LOD configuration; and the final scenario in a low LOD configuration with maximal UAV functional information. Results show a statistically significant decrease in operator workload when a UAV's functional information is displayed in its physical form (low LOD, maximal information) compared with the mixed LOD configuration.
Abstract:
Mutations in the genes encoding for either the biosynthetic or transcriptional regulation of the anthocyanin pathway have been linked to color phenotypes. Generally, this is a loss of function resulting in a reduction or a change in the distribution of anthocyanin. Here, we describe a rearrangement in the upstream regulatory region of the gene encoding an apple (Malus x domestica) anthocyanin-regulating transcription factor, MYB10. We show that this modification is responsible for increasing the level of anthocyanin throughout the plant to produce a striking phenotype that includes red foliage and red fruit flesh. This rearrangement is a series of multiple repeats, forming a minisatellite-like structure that comprises five direct tandem repeats of a 23-bp sequence. This MYB10 rearrangement is present in all the red foliage apple varieties and species tested but in none of the white fleshed varieties. Transient assays demonstrated that the 23-bp sequence motif is a target of the MYB10 protein itself, and the number of repeat units correlates with an increase in transactivation by MYB10 protein. We show that the repeat motif is capable of binding MYB10 protein in electrophoretic mobility shift assays. Taken together, these results indicate that an allelic rearrangement in the promoter of MYB10 has generated an autoregulatory locus, and this autoregulation is sufficient to account for the increase in MYB10 transcript levels and subsequent ectopic accumulation of anthocyanins throughout the plant.
Abstract:
This paper presents a novel framework that advances the recent trend of using query decomposition and high-order term relationships in query language modeling by taking into account terms implicitly associated with different subsets of query terms. Existing approaches, most notably the language model based on the Information Flow method, are unable to capture multiple levels of association and also suffer from a high computational overhead. In this paper, we propose to compute association rules from pseudo-feedback documents that are segmented into variable-length chunks via multiple sliding windows of different sizes. Extensive experiments have been conducted on various TREC collections, and our approach significantly outperforms a baseline Query Likelihood language model, the Relevance Model, and the Information Flow model.
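As a rough illustration of the chunking step described above, the sketch below segments pseudo-feedback documents with several sliding-window sizes and counts terms that co-occur with subsets of the query terms; the window sizes, support threshold, and toy documents are placeholders and do not reflect the authors' exact model.

from collections import Counter
from itertools import combinations

def sliding_chunks(tokens, window_sizes=(5, 10, 20), step=1):
    """Segment a token list into variable-length chunks via several window sizes."""
    for w in window_sizes:
        for i in range(0, max(len(tokens) - w + 1, 1), step):
            yield tokens[i:i + w]

def term_associations(feedback_docs, query_terms, min_support=2):
    """Count terms that co-occur with subsets of the query inside any chunk."""
    support = Counter()
    for doc in feedback_docs:
        for chunk in sliding_chunks(doc):
            present = set(chunk) & set(query_terms)
            for r in range(1, len(present) + 1):
                for subset in combinations(sorted(present), r):
                    for term in set(chunk) - set(query_terms):
                        support[(subset, term)] += 1
    return {rule: c for rule, c in support.items() if c >= min_support}

# toy usage on two pseudo-feedback "documents"
docs = [["solar", "panel", "energy", "grid", "storage"],
        ["wind", "energy", "grid", "turbine", "storage"]]
print(term_associations(docs, query_terms=["energy", "grid"]))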
Abstract:
It has been 21 years since the decision in Rogers v Whitaker, and the legal principles concerning informed consent and liability for negligence are still strongly grounded in this landmark High Court decision. This paper considers more recent developments in the law concerning the failure to disclose inherent risks in medical procedures, focusing on the decision in Wallace v Kam [2013] HCA 19. In this case, the appellant underwent a surgical procedure that carried a number of risks. The surgery itself was not performed in a sub-standard way, but the surgeon failed to disclose two risks to the patient, a failure that constituted a breach of the surgeon's duty of care in negligence. One of the undisclosed risks was considered to be less serious than the other, and this lesser risk eventuated, causing injury to the appellant. The more serious risk did not eventuate, but the appellant argued that if the more serious risk had been disclosed, he would have avoided his injuries completely because he would have refused to undergo the procedure. Liability was disputed by the surgeon, with particular reference to causation principles. The High Court of Australia held that the appellant should not be compensated for harm that resulted from a risk he would have been willing to run. We examine the policy reasons underpinning the law of negligence in this specific context and consider some of the issues raised by this unusual case. We question whether some of the judicial reasoning adopted in this case represents a significant shift in traditional causation principles.
Abstract:
Voltage rise is the main issue limiting the capacity of a Low Voltage (LV) network to accommodate more Renewable Energy (RE) sources. In addition, voltage drop at peak load periods is a significant power quality concern. This paper proposes a new robust voltage support strategy based on the distributed coordination of multiple distribution static synchronous compensators (DSTATCOMs). The study focuses on LV networks with PV as the RE source for customers. The proposed approach is applied to a typical LV network, and its advantages are demonstrated by comparison with other voltage control strategies.
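For illustration only, the snippet below implements a generic local volt-var droop rule for a single compensator; it is not the distributed coordination strategy proposed in the paper, and the gain, deadband, and limits are placeholder per-unit values.

def droop_reactive_power(v_measured, v_ref=1.0, deadband=0.01, gain=5.0, q_max=1.0):
    """Generic volt-var droop: absorb vars on over-voltage, inject on under-voltage.

    All quantities are in per-unit; gain, deadband and q_max are placeholder values.
    """
    error = v_measured - v_ref
    if abs(error) <= deadband:
        return 0.0
    # negative Q (absorption) when voltage is high, positive Q (injection) when low
    q = -gain * (error - deadband if error > 0 else error + deadband)
    return max(-q_max, min(q_max, q))

# voltage rise at the midday PV peak -> the compensator absorbs reactive power
print(droop_reactive_power(1.06))   # -0.25 pu
print(droop_reactive_power(0.95))   # +0.20 pu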
Abstract:
This paper addresses real-time decision making for autonomous city vehicles, i.e., the vehicles' ability to make appropriate driving decisions in city road traffic situations. The paper explains the overall control system architecture and the decomposition of the decision-making task, and focuses on how Multiple Criteria Decision Making (MCDM) is used to select the most appropriate driving maneuver from the set of feasible ones. Experimental tests show that MCDM is suitable for this new application area.
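As a minimal illustration of weighted-sum MCDM applied to maneuver selection, the toy example below ranks a set of feasible maneuvers; the criteria, weights, and scores are invented for illustration and are not taken from the paper.

# Feasible maneuvers scored on several criteria (higher is better).
# Criteria, weights and scores are illustrative placeholders.
maneuvers = {
    "keep_lane":   {"safety": 0.9, "progress": 0.6, "comfort": 0.9},
    "change_left": {"safety": 0.7, "progress": 0.8, "comfort": 0.6},
    "stop":        {"safety": 1.0, "progress": 0.1, "comfort": 0.7},
}
weights = {"safety": 0.5, "progress": 0.3, "comfort": 0.2}

def weighted_sum(scores):
    return sum(weights[c] * scores[c] for c in weights)

best = max(maneuvers, key=lambda m: weighted_sum(maneuvers[m]))
print(best, {m: round(weighted_sum(s), 2) for m, s in maneuvers.items()})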
Abstract:
This paper presents a method to enable a mobile robot working in non-stationary environments to plan its path and localize within multiple map hypotheses simultaneously. The maps are generated using a long-term and short-term memory mechanism that ensures only persistent configurations in the environment are selected to create the maps. In order to evaluate the proposed method, experimentation is conducted in an office environment. Compared to navigation systems that use only one map, our system produces superior path planning and navigation in a non-stationary environment where paths can be blocked periodically, a common scenario which poses significant challenges for typical planners.
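A toy sketch of a short-term/long-term memory mechanism is shown below: grid cells are promoted into a long-term map only after repeated observations and are demoted when stale. The thresholds and the grid-cell representation are assumptions for illustration, not the authors' mechanism.

from collections import defaultdict

class PersistenceMap:
    """Toy short-term/long-term memory: only cells seen in enough scans
    are promoted into the (long-term) map used for planning."""

    def __init__(self, promote_after=5, forget_after=10):
        self.seen_count = defaultdict(int)     # short-term evidence
        self.last_seen = {}
        self.long_term = set()                 # persistent cells
        self.promote_after = promote_after
        self.forget_after = forget_after
        self.t = 0

    def update(self, occupied_cells):
        self.t += 1
        for c in occupied_cells:
            self.seen_count[c] += 1
            self.last_seen[c] = self.t
            if self.seen_count[c] >= self.promote_after:
                self.long_term.add(c)
        # demote cells that have not been observed for a long time
        stale = {c for c in self.long_term
                 if self.t - self.last_seen.get(c, 0) > self.forget_after}
        self.long_term -= stale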
Abstract:
The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach owing to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of these WSNs to demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first shows that Data Synchronization Error (DSE) is the most inherent factor among the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), since both have been widely applied in the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as benchmark data after a certain level of noise is added to account for the higher presence of this factor in SHM-oriented WSNs. From this source, a large number of simulations are run to generate multiple DSE-corrupted datasets and facilitate statistical analyses. The results show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, the combination of preferred OMA techniques and the use of channel projection for the time-domain OMA technique are recommended to cope with DSE.
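For reference, the sketch below runs a bare-bones Frequency Domain Decomposition pass: it assembles the cross-spectral density matrix with scipy and takes an SVD at each frequency line, so peaks of the first singular value indicate candidate modes. The channel layout and FFT segment length are placeholders, and this reflects the textbook FDD procedure rather than the specific study setup.

import numpy as np
from scipy.signal import csd

def fdd(accels, fs, nperseg=1024):
    """Bare-bones FDD: SVD of the cross-spectral density matrix per frequency line.

    accels: (n_channels, n_samples) acceleration records.
    Returns frequencies, first singular values (peaks ~ natural frequencies)
    and first singular vectors (approximate mode shapes).
    """
    n_ch = accels.shape[0]
    freqs, G = None, None
    # build the full cross-spectral density matrix G(f), shape (n_freq, n_ch, n_ch)
    for i in range(n_ch):
        for j in range(n_ch):
            f, Pij = csd(accels[i], accels[j], fs=fs, nperseg=nperseg)
            if G is None:
                freqs, G = f, np.zeros((len(f), n_ch, n_ch), dtype=complex)
            G[:, i, j] = Pij
    s1 = np.empty(len(freqs))
    u1 = np.empty((len(freqs), n_ch), dtype=complex)
    for k in range(len(freqs)):
        U, S, _ = np.linalg.svd(G[k])
        s1[k], u1[k] = S[0], U[:, 0]
    return freqs, s1, u1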
Abstract:
This paper introduces a straightforward method to asymptotically solve a variety of initial and boundary value problems for singularly perturbed ordinary differential equations whose solution structure can be anticipated. The approach is simpler than conventional methods, including those based on asymptotic matching or on eliminating secular terms. © 2010 by the Massachusetts Institute of Technology.
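As an example of the class of problems addressed, consider the standard textbook boundary-layer problem (not taken from the paper) whose solution structure, an outer solution plus a boundary-layer correction, can be anticipated in advance:

\varepsilon y'' + y' + y = 0, \qquad y(0) = 0,\ y(1) = 1, \qquad 0 < \varepsilon \ll 1.

Dropping \varepsilon y'' gives the outer solution y_{\mathrm{out}} = e^{1-x} (fitting y(1) = 1); rescaling with X = x/\varepsilon near x = 0 gives, to leading order, Y'' + Y' = 0, so Y = e\,(1 - e^{-X}); matching yields the composite approximation

y(x) \approx e^{1-x} - e^{1 - x/\varepsilon}.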
Abstract:
NLS is a stream cipher that was submitted to the eSTREAM project. A linear distinguishing attack against NLS, called the Crossword Puzzle (CP) attack, was presented by Cho and Pieprzyk. NLSv2 is a tweaked version of NLS that aims mainly at avoiding the CP attack. In this paper, a new distinguishing attack against NLSv2 is presented. The attack exploits high correlation amongst neighboring bits of the cipher. The paper first shows that modular addition preserves pairwise correlations, as demonstrated by the existence of linear approximations with large biases. Next, it shows how to combine these results with the high correlation between bits 29 and 30 of the S-box to obtain a distinguisher whose bias is around 2^−37. Consequently, we claim that NLSv2 is distinguishable from a random cipher after observing around 2^74 keystream words.
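A standard rule of thumb connects the two figures quoted above: a distinguisher with bias \varepsilon requires on the order of \varepsilon^{-2} samples, hence

N \approx \varepsilon^{-2} = \left(2^{-37}\right)^{-2} = 2^{74} \text{ keystream words,}

which matches the data complexity claimed in the abstract.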
Abstract:
The value of information technology (IT) is often realized only when it continues to be used after users' initial acceptance. However, previous research on continuing IT usage is limited: it dismisses the importance of mental goals in directing users' behaviors and inadequately accommodates the group context of users. This research-in-progress paper synthesizes several streams of literature to conceptualize continuing IT usage as a multilevel construct and to view IT usage behavior as directed and energized by a set of mental goals. Drawing on self-regulation theory from social psychology, the paper proposes a process model that positions continuing IT usage as multiple-goal pursuit. An agent-based modeling approach is suggested to further explore the causal and analytical implications of the proposed process model.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors using a different physical process and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as modalities may complement each other. However, they may react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations, to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using another GPIS process, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, which operates in an outdoor environment in the presence of large clouds of airborne dust and smoke.
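The simplified 1-D sketch below stands in for the per-modality surface models and the local consistency test: it fits one Gaussian Process per modality and flags query points where the two predictions agree within their combined uncertainty. The paper works with Gaussian Process Implicit Surfaces in 3-D; here plain GP regression, a z-test-style threshold, and the toy laser/radar data are assumptions made for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_gp(x, y):
    gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-3),
                                  normalize_y=True)
    gp.fit(x.reshape(-1, 1), y)
    return gp

def consistency_mask(gp_a, gp_b, x_query, k=2.0):
    """Flag query points where the two modalities agree within k combined std-devs."""
    mu_a, sd_a = gp_a.predict(x_query.reshape(-1, 1), return_std=True)
    mu_b, sd_b = gp_b.predict(x_query.reshape(-1, 1), return_std=True)
    return np.abs(mu_a - mu_b) <= k * np.sqrt(sd_a**2 + sd_b**2)

# toy 1-D "surfaces" seen by two modalities; the second is offset on part of the domain
x = np.linspace(0, 10, 40)
laser = np.sin(x) + 0.05 * np.random.randn(40)
radar = np.sin(x) + np.where(x > 7, 1.5, 0.0) + 0.05 * np.random.randn(40)
mask = consistency_mask(fit_gp(x, laser), fit_gp(x, radar), x)
# fuse only the consistent region; handle the remaining data separately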
Abstract:
This paper presents a novel method to rank map hypotheses by the quality of localization they afford. The highest-ranked hypothesis at any moment becomes the active representation used to guide the robot to its goal location. A single static representation is insufficient for navigation in dynamic environments where paths can be blocked periodically, a common scenario which poses significant challenges for typical planners. In our approach, we simultaneously rank multiple map hypotheses by the influence that localization in each of them has on locally accurate odometry. This is done online for the current locally accurate window by formulating a factor graph of odometry relaxed by localization constraints. Comparing the resulting perturbed odometry of each hypothesis with the original odometry yields a score that can be used to rank map hypotheses by their utility. We deploy the proposed approach on a real robot navigating a structurally noisy office environment. The configuration of the environment is physically altered outside the robot's sensory horizon during navigation tasks to demonstrate the proposed hypothesis selection approach.
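The toy 1-D example below mimics the scoring idea: a chain of odometry factors is relaxed by the localization factors of one map hypothesis via weighted least squares, and the deviation of the relaxed relative poses from the original odometry gives the hypothesis score. The 1-D formulation, weights, and data are simplifications for illustration, not the paper's factor-graph implementation.

import numpy as np

def relax_odometry(odom_steps, loc_fixes, w_odo=1.0, w_loc=0.5):
    """Least-squares relaxation of a 1-D pose chain: odometry factors plus
    localization factors (pose index -> measured position) from one map hypothesis."""
    n = len(odom_steps) + 1
    rows, rhs, weights = [], [], []
    for i, d in enumerate(odom_steps):          # odometry factors x_{i+1} - x_i = d
        r = np.zeros(n); r[i + 1], r[i] = 1.0, -1.0
        rows.append(r); rhs.append(d); weights.append(w_odo)
    for i, l in loc_fixes:                      # localization factors x_i = l
        r = np.zeros(n); r[i] = 1.0
        rows.append(r); rhs.append(l); weights.append(w_loc)
    A = np.array(rows) * np.array(weights)[:, None]
    b = np.array(rhs) * np.array(weights)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

def hypothesis_score(odom_steps, loc_fixes):
    """Smaller score = the hypothesis perturbs locally accurate odometry less."""
    x = relax_odometry(odom_steps, loc_fixes)
    return float(np.sum((np.diff(x) - np.asarray(odom_steps)) ** 2))

odom = [1.0, 1.0, 1.0, 1.0]                     # robot believes it moved 1 m per step
good = [(0, 0.0), (4, 4.1)]                     # hypothesis consistent with odometry
bad  = [(0, 0.0), (4, 6.0)]                     # hypothesis that distorts odometry
print(hypothesis_score(odom, good), hypothesis_score(odom, bad))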
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing, and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of a single independent VM, little has been investigated for the case of live migration of multiple interacting VMs. Live migration is influenced mostly by the available network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared with a heuristic algorithm.
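As a sketch of the random-key idea, the snippet below decodes a vector of random keys into a VM migration order and evaluates it with a toy cost that penalizes splitting communicating VM pairs far apart in the schedule; the cost model and data are placeholders, not the objective used by the RKGA in the paper.

import random

def decode_migration_order(keys):
    """Random-key decoding: sorting the keys yields a permutation of the VMs."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def fitness(order, migrate_time, dependencies):
    """Toy cost: sequential migration time plus a penalty for scheduling
    communicating VM pairs far apart (placeholder model, not the paper's)."""
    pos = {vm: i for i, vm in enumerate(order)}
    split_penalty = sum(abs(pos[a] - pos[b]) for a, b in dependencies)
    return sum(migrate_time) + split_penalty

migrate_time = [30, 120, 60, 20]                 # seconds per VM (placeholder)
dependencies = [(0, 1), (2, 3)]                  # VM pairs with heavy traffic
keys = [random.random() for _ in migrate_time]   # one GA individual
order = decode_migration_order(keys)
print(order, fitness(order, migrate_time, dependencies))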