149 results for Dual-path
Abstract:
Methods are presented for the production, affinity purification and analysis of plasmid DNA (pDNA). Batch fermentation is used for the production of the pDNA, and expanded bed chromatography, via the use of a dual-affinity glutathione S-transferase (GST) fusion protein, is used for the capture and purification of the pDNA. The protein is composed of GST, which displays affinity for glutathione immobilized on a solid-phase adsorbent, fused to a zinc finger transcription factor, which displays affinity for a target 9-base-pair sequence contained within the target pDNA. A PicoGreen™ fluorescence assay and/or an ethidium bromide agarose gel electrophoresis assay can be used to analyze the eluted pDNA.
Abstract:
This paper presents an extension to the Rapidly-exploring Random Tree (RRT) algorithm applied to autonomous, drifting underwater vehicles. The proposed algorithm is able to plan paths that guarantee convergence in the presence of time-varying ocean dynamics. The method utilizes four-dimensional ocean model prediction data as an evolving basis for expanding the tree from the start location to the goal. The performance of the proposed method is validated through Monte Carlo simulations. Results illustrate the importance of temporal variance in path execution and demonstrate the convergence guarantee of the proposed methods.
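As a rough illustration of the idea, the sketch below (with a placeholder ocean_current lookup and a simple kinematic drift model, both assumptions rather than the authors' implementation) shows how a tree node carrying a time coordinate might be extended by forward-simulating the vehicle under a time-varying current:

```python
import math

def ocean_current(x, y, z, t):
    """Placeholder for a 4-D ocean-model prediction lookup (returns u, v in m/s).
    A real implementation would interpolate gridded forecast data in space and time."""
    return 0.2 * math.sin(1e-3 * t), 0.1 * math.cos(1e-3 * t)

def extend(node, target, speed=0.5, dt=10.0, steps=30):
    """Grow an RRT edge toward `target` while drifting with the current.
    A node is (x, y, z, t); carrying time makes the tree effectively 4-D."""
    x, y, z, t = node
    for _ in range(steps):
        u, v = ocean_current(x, y, z, t)              # time-varying advection
        dx, dy = target[0] - x, target[1] - y
        d = math.hypot(dx, dy) or 1e-9
        x += (speed * dx / d + u) * dt                # vehicle thrust + drift
        y += (speed * dy / d + v) * dt
        t += dt
    return (x, y, z, t)

new_node = extend((0.0, 0.0, -10.0, 0.0), (500.0, 300.0))
```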
Abstract:
There is a need for systems that can autonomously perform coverage tasks over large outdoor areas. Unfortunately, the state of the art is to use GPS-based localization, which is not suitable for precise operations near trees and other obstructions. In this paper we present a robotic platform for autonomous coverage tasks. The system architecture integrates laser-based localization and mapping using the Atlas Framework with Rapidly-Exploring Random Trees path planning and Virtual Force Field obstacle avoidance. We demonstrate the performance of the system in simulation as well as in real-world experiments.
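A minimal sketch of the Virtual Force Field idea mentioned above (the function name, gains and geometry are illustrative assumptions, not the platform's actual code): the commanded heading is the normalised sum of an attractive force toward the goal and repulsive forces from nearby obstacles.

```python
import math

def vff_heading(robot, goal, obstacles, k_att=1.0, k_rep=5.0, influence=3.0):
    """Combine goal attraction with obstacle repulsion; return a unit heading vector."""
    fx = k_att * (goal[0] - robot[0])
    fy = k_att * (goal[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence:                        # only nearby obstacles repel
            push = k_rep * (1.0 / d - 1.0 / influence) / d
            fx += push * dx / d
            fy += push * dy / d
    norm = math.hypot(fx, fy) or 1e-9
    return fx / norm, fy / norm

heading = vff_heading((0.0, 0.0), (10.0, 5.0), [(2.0, 1.0), (4.0, 3.5)])
```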
Abstract:
One of the main challenges facing online and offline path planners is the uncertainty in the magnitude and direction of the environmental energy, which is dynamic, time-varying, and hard to forecast. This thesis develops an artificial-intelligence approach that enables a mobile robot to learn from historical or forecast data on the environmental energy available in the area of interest, supporting persistent monitoring under uncertainty with the developed algorithm.
Abstract:
A crucial issue with hybrid quantum secret sharing schemes is the amount of data that is allocated to the participants. The smaller the amount of allocated data, the better the performance of a scheme. Moreover, quantum data is very hard and expensive to deal with; it is therefore desirable to use as little quantum data as possible. To achieve this goal, we first construct extended unitary operations by the tensor product of n (n ≥ 2) basic unitary operations, and then use those extended operations to design two quantum secret sharing schemes. The resulting dual compressible hybrid quantum secret sharing schemes, in which classical data play a complementary role to quantum data, range from threshold to access structure. Compared with existing hybrid quantum secret sharing schemes, our proposed schemes not only reduce the number of quantum participants, but also reduce the number of particles and the size of the classical shares. To be exact, the number of particles used to carry quantum data is reduced to 1, while the size of the classical secret shares is reduced to l − 2^(m−1) in the ((m+1, n′)) threshold scheme and to l − 2^(r2) (where r2 is the number of maximal unqualified sets) in the adversary-structure scheme. Consequently, our proposed schemes can greatly reduce the cost and difficulty of generating and storing EPR pairs and lower the risk of transmitting encoded particles.
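A minimal numerical sketch of the construction described above: an extended unitary operation formed as the tensor (Kronecker) product of n ≥ 2 basic unitaries. The particular single-qubit operators used here are illustrative, not those of the proposed schemes.

```python
import numpy as np
from functools import reduce

# Basic single-qubit unitaries (illustrative choices)
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def extended_unitary(ops):
    """Extended operation U = U_1 (x) U_2 (x) ... (x) U_n, n >= 2, via the Kronecker product."""
    return reduce(np.kron, ops)

U = extended_unitary([X, H, I])                     # acts on 3 qubits: an 8 x 8 matrix
assert np.allclose(U @ U.conj().T, np.eye(8))       # the tensor product is still unitary
```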
Abstract:
This paper addresses less recognised factors which influence the diffusion of a particular technology. While an innovation’s attributes and performance are paramount, many innovations fail because of external factors which favour an alternative. This paper, drawing theoretical input from diffusion, lock-in and path-dependency, presents a qualitative study of the external factors that influenced the evolution of transportation in the USA. This historical account reveals how one technology and its emergent systems became dominant while other choices were overridden by socio-political, economic and technological interests, which include not just the manufacturing and service industries associated with the automobile but also government and market stakeholders. Termed here a large socio-economic regime (LSER), its power in ensuring lock-in and continued path-dependency is shown to pass through three stages, weakening eventually as awareness improves. The study extends to transport trends in China, Korea, Indonesia and Malaysia, all of which show the dominant role of an LSER. As transportation policy is increasingly required to address both demand and environmental concerns, and innovators search for solutions, this paper presents important knowledge for innovators, marketers and policy makers for commercial and societal reasons, especially when negative externalities associated with an incumbent transportation technology may lead to market failure.
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high-frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low-frequency data (such as data from a fleet of private cars, buses or light duty vehicles, or from smartphones), their performance drops to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low-frequency GPS data. Therefore, this paper develops a new weight-based shortest-path and vehicle-trajectory-aided map-matching (stMM) algorithm that enhances the map-matching of low-frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and the vehicle trajectory are considered: one weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory, as sketched below. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for 30 s sampling interval GPS data. Omitting the information from the shortest path and vehicle trajectory reduces the accuracy of the algorithm to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
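A minimal sketch of how the two additional weights might be scored for a candidate link (the functional forms, the coefficients a and b, and the function names are illustrative assumptions, not the paper's calibrated weights): one weight compares the distance along the A*-derived shortest path with the distance along the vehicle trajectory, and the other penalises the heading difference.

```python
import math

def shortest_path_weight(d_shortest, d_trajectory):
    """Score is highest when the A*-derived shortest-path length and the
    travelled trajectory length agree."""
    longest = max(d_shortest, d_trajectory)
    if longest == 0:
        return 1.0
    return 1.0 - abs(d_shortest - d_trajectory) / longest

def heading_weight(trajectory_heading, link_heading):
    """Score is highest when the vehicle trajectory heading matches the candidate link heading."""
    diff = abs(trajectory_heading - link_heading) % 360.0
    diff = min(diff, 360.0 - diff)
    return math.cos(math.radians(diff)) if diff < 90.0 else 0.0

def candidate_score(base_weight, d_shortest, d_trajectory,
                    trajectory_heading, link_heading, a=0.3, b=0.2):
    """Total score for a candidate link: existing weights plus the two added terms."""
    return (base_weight
            + a * shortest_path_weight(d_shortest, d_trajectory)
            + b * heading_weight(trajectory_heading, link_heading))
```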
The dual nature of information systems in enabling a new wave of hardware ventures: Towards a theory
Abstract:
Hardware ventures are emerging entrepreneurial firms that create new market offerings based on the development of digital devices. These ventures are important elements of the global economy but have not yet received much attention in the literature. Our interest in examining hardware ventures lies specifically in the role that information system (IS) resources play in enabling them. We ask how the role of IS resources for hardware ventures can be conceptualized and develop a framework for assessment. Our framework builds on the distinction between operand and operant resources and distinguishes between two key lifecycle stages of hardware ventures: start-up and growth. We show how this framework can be used to discuss the role, nature, and use of IS for hardware ventures and outline empirical research strategies that flow from it. Our work contributes to broadening and enriching the IS field by drawing attention to its role in significant and novel phenomena.
Abstract:
Brain connectivity analyses are increasingly popular for investigating brain organization. Many connectivity measures are used, including path length, which is generally defined as the number of nodes traversed to connect one node in a graph to another. Despite its name, path length is purely topological and does not take into account the physical length of the connections. The distance of the trajectory may also be highly relevant, but it is typically overlooked in connectivity analyses. Here we combined genotyping, anatomical MRI and HARDI to understand how our genes influence cortical connections, using whole-brain tractography. We defined a new measure, based on Dijkstra's algorithm, to compute path lengths for tracts connecting pairs of cortical regions. We compiled these measures into matrices whose elements represent the physical distance traveled along the tracts. We then analyzed a large cohort of healthy twins and show that our path length measure is reliable, heritable, and influenced, even in young adults, by the Alzheimer's risk gene CLU.
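A minimal sketch of the Dijkstra-based measure described above, assuming the tractography has already been reduced to a graph whose edge weights are the physical (e.g. millimetre) lengths of the reconstructed tracts; the function names and graph representation are illustrative.

```python
import heapq

def physical_path_lengths(adj, source):
    """Dijkstra over a graph whose edge weights are physical tract lengths (e.g. in mm).
    adj maps each region to a list of (neighbouring region, tract length) pairs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, length in adj.get(node, []):
            nd = d + length
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

def path_length_matrix(adj, regions):
    """Matrix whose (i, j) entry is the physical distance travelled along tracts from i to j."""
    return [[physical_path_lengths(adj, r).get(c, float("inf")) for c in regions]
            for r in regions]
```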
Abstract:
This paper presents data on residents’ use of common stairways and lifts (vertical circulation spaces) in multi-storey apartment buildings (MSABs) in Brisbane, Australia. Vertical movement is a defining aspect of multi-storey living, and the energy consumed by lifts contributes significantly to the energy budget of the typical MSAB. The purpose is to investigate whether a reappraisal of vertical circulation design, through the lens of residents’ requirements, might contribute to energy reductions in this building type. Data was gathered on a theoretical sample of MSABs ranging from five decades old to very recent schemes. Ninety residents were surveyed about their day-to-day experiences of circulation and access systems. The results showed that residents mainly chose to use the stairs for convenience and exercise. Building management regimes that limited residents’ access to collective spaces were the main impediment to discretionary stair use. Only two buildings did not have fully enclosed stairwells, and these had the highest stair usage, suggesting that stair design and building governance are two areas worthy of attention. The more that circulation design is focussed on limiting access, the fewer opportunities there are for personal choice, incidental social interaction and casual surveillance of collective spaces. The more that the design of vertical circulation spaces in MSABs meets residents’ needs, the less likely residents are to rely on a continuous energy supply for normal functioning.
Abstract:
The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
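A minimal sketch of a particle-based estimate of a total entropy utility (the interface and variable names are assumptions, not the paper's implementation): given weighted particles over models and parameters from a sequential Monte Carlo run, the utility of a candidate design is approximated by averaging the information gain over data sets simulated under that design.

```python
import numpy as np

def total_entropy_utility(design, particles, weights, loglike, simulate, n_sim=200, rng=None):
    """Particle estimate of the total entropy utility for one candidate design.

    particles : list of (model, theta) pairs from an SMC run, with weights.
    loglike(y, model, theta, design) : log-likelihood of data y.
    simulate(model, theta, design, rng) : draw a data set y.
    The utility averages log p(y | m, theta, d) - log p(y | d), i.e. the mutual
    information between the data and (model, parameters) jointly.
    """
    rng = rng or np.random.default_rng()
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    utility = 0.0
    for _ in range(n_sim):
        i = rng.choice(len(particles), p=w)                 # sample (model, theta)
        y = simulate(*particles[i], design, rng)            # simulate data under the design
        ll = np.array([loglike(y, m, th, design) for m, th in particles])
        log_evidence = np.log(np.sum(w * np.exp(ll - ll.max()))) + ll.max()
        utility += ll[i] - log_evidence                     # information gain for this draw
    return utility / n_sim
```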