882 results for Branch and bound algorithms
Abstract:
Electronic services are a leitmotif in ‘hot’ topics like Software as a Service, Service Oriented Architecture (SOA), Service Oriented Computing, Cloud Computing, application markets and smart devices. We propose to consider these in what has been termed the Service Ecosystem (SES). The SES encompasses all levels of electronic services and their interaction, with human consumption and initiation on its periphery, in much the same way the ‘Web’ describes a plethora of technologies that eventuate to connect information and expose it to humans. Presently, the SES is heterogeneous, fragmented and confined to semi-closed systems. A key issue hampering the emergence of an integrated SES is Service Discovery (SD). A SES will be dynamic, with areas of structured and unstructured information within which service providers and ‘lay’ human consumers interact; to date the two have been disjointed, e.g., SOA-enabled organisations, industries and domains are choreographed by domain experts or ‘hard-wired’ to smart device application markets and web applications. In a SES, services are accessible, comparable and exchangeable to human consumers, closing the gap to the providers. This requires a new SD with which humans can discover services transparently and effectively without special knowledge or training. We propose two modes of discovery: directed search, which follows an agenda, and explorative search, which speculatively expands knowledge of an area of interest by means of categories. Inspired by conceptual space theory from cognitive science, we propose to implement the modes of discovery using concepts to map a lay consumer’s service need to terminologically sophisticated descriptions of services. To this end, we reframe SD as an information retrieval task on the information attached to services, such as descriptions, reviews, documentation and websites: the Service Information Shadow.
The Semantic Space model transforms the shadow's unstructured semantic information into a geometric, concept-like representation. We introduce an improved and extended Semantic Space that includes categorization, calling it the Semantic Service Discovery model. We evaluate our model with a highly relevant, service-related corpus simulating a Service Information Shadow, including manually constructed complex service agendas as well as manual groupings of services. We compare our model against state-of-the-art information retrieval systems and clustering algorithms. By means of an extensive series of empirical evaluations, we establish optimal parameter settings for the semantic space model. The evaluations demonstrate the model’s effectiveness for SD in terms of retrieval precision over state-of-the-art information retrieval models (directed search) and the meaningful, automatic categorization of service-related information, which shows potential to form the basis of a useful, cognitively motivated map of the SES for exploratory search.
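The core idea of such geometric models, placing service descriptions in a shared term space so that a consumer's need can be matched by proximity rather than keyword identity, can be illustrated with a minimal bag-of-words cosine similarity sketch (a generic stand-in, not the authors' Semantic Space model; the documents and terms are invented):

```python
import math
from collections import Counter

def term_vectors(docs):
    """Bag-of-words vectors: one Counter of term frequencies per document."""
    return [Counter(doc.lower().split()) for doc in docs]

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)   # Counter returns 0 for missing terms
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Hypothetical "service information shadow" snippets.
docs = [
    "weather forecast service with daily rain alerts",
    "daily weather alerts and rain forecast",
    "online payment gateway for card transactions",
]
vecs = term_vectors(docs)
# The two weather services sit closer together than weather vs. payments.
print(cosine(vecs[0], vecs[1]) > cosine(vecs[0], vecs[2]))  # → True
```

A real semantic space would additionally reduce dimensionality so that related but non-overlapping terms (e.g., "rain" and "precipitation") also land near each other.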
Abstract:
The ability to perform autonomous emergency (forced) landings is one of the key technology enablers identified for UAS. This paper presents the flight test results of forced landings involving a UAS, in a controlled environment, conducted to ascertain the performance of previously developed (and published) path planning and guidance algorithms. These novel 3-D nonlinear algorithms have been designed to control the vehicle in both the lateral and longitudinal planes of motion, and had hitherto been verified only in simulation. A modified Boomerang 60 RC aircraft is used as the flight test platform, with associated onboard and ground support equipment sourced off-the-shelf or developed in-house at the Australian Research Centre for Aerospace Automation (ARCAA). HITL simulations conducted prior to the flight tests displayed good landing performance; however, due to certain identified interfacing errors, the flight results differed from those obtained in simulation. This paper details the lessons learnt and presents a plausible solution for the way forward.
Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. Thus, the TxDOT Austin District Pavement Engineer (DPE) has investigated methods to strategically allocate available pavement funding to potential projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports the development of the Austin District 4-year pavement management plan. The study developed candidate project selection and ranking algorithms that evaluate the pavement condition of each candidate project using data contained in the Pavement Management Information System (PMIS) database and incorporate insights from Austin District pavement experts, and then implemented the developed method and supporting algorithms. This process previously required weeks to complete but now takes about 10 minutes, including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluation and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and allocating funds to the right projects at the right time.
Abstract:
With the advent of large-scale wind farms and their integration into electrical grids, more uncertainties, constraints and objectives must be considered in power system development. It is therefore necessary to introduce risk-control strategies into the planning of transmission systems connected with wind power generators. This paper presents a probability-based multi-objective model equipped with three risk-control strategies. The model is developed to evaluate and enhance the ability of the transmission system to protect against overload risks when wind power is integrated into the power system. The model involves: (i) defining the uncertainties associated with wind power generators with probability measures and calculating the probabilistic power flow with the combined use of cumulants and Gram-Charlier series; (ii) developing three risk-control strategies by specifying the smallest acceptable non-overload probability for each branch and for the whole system, and specifying the non-overload margin for all branches in the whole system; (iii) formulating an overload risk index based on the non-overload probability and the non-overload margin defined; and (iv) developing a multi-objective transmission system expansion planning (TSEP) model with objective functions composed of the transmission investment and the overload risk index. The presented work represents a superior risk-control model for TSEP in terms of security, reliability and economy. The transmission expansion planning model with the three risk-control strategies demonstrates its feasibility in case studies using two typical power systems.
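The non-overload probability and risk index ideas in steps (ii) and (iii) can be roughed out as follows (a plain normal approximation standing in for the cumulant/Gram-Charlier expansion of the branch-flow distribution; all numbers are hypothetical):

```python
import math

def non_overload_probability(mean_flow, std_flow, limit):
    """P(flow <= limit) under a normal approximation of the branch flow.
    A stand-in for the cumulant/Gram-Charlier probabilistic power flow."""
    z = (limit - mean_flow) / std_flow
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def overload_risk_index(branches, p_min):
    """Hypothetical risk index: count branches whose non-overload
    probability falls below the smallest acceptable value p_min."""
    probs = [non_overload_probability(m, s, lim) for m, s, lim in branches]
    return sum(p < p_min for p in probs), probs

# Two branches: (mean flow, std of flow, thermal limit), all in MW.
branches = [(80.0, 10.0, 100.0), (95.0, 10.0, 100.0)]
risky, probs = overload_risk_index(branches, p_min=0.9)
print(risky)  # → 1 (only the heavily loaded second branch violates p_min)
```

In the paper's model this index is traded off against transmission investment cost inside a multi-objective TSEP optimization.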
Abstract:
Corals inhabit high-energy environments where frequent disturbances result in physical damage to coralla, including fragmentation, as well as generating and mobilizing large sediment clasts. The branching growth form common in the Acropora genus makes it particularly susceptible to such disturbances and therefore useful for studying the fate of large sediment clasts. Living Acropora samples with natural, extraneous, broken coral branches incorporated on their living surface, and dead Acropora skeletons containing embedded clasts of isolated Acropora branch sections, were observed and/or collected from the reef flat of Heron Reef, southern Great Barrier Reef, and from Bargara, Australia, respectively. Here we report three different outcomes observed when pebble-sized coral branches became lodged on living Acropora colonies during sedimentation events in natural settings: 1) where live coral branches produced during a disturbance event come to rest on probable genetic clone-mate colonies, they are rapidly stabilised, leading to complete soft tissue and skeletal fusion; 2) where the branch and underlying colony are not clone-mates, but may still be the same or a similar species, the branches may still be stabilised rapidly by soft tissue, but one species then overgrows the other; and 3) where branches represent dead skeletal debris, they are treated like any foreign clast: surrounded by clypeotheca and incorporated into the corallum by overgrowth. The retention of branch fragments on colonies in high-energy reef flat settings may suggest an active role for coral polyps in recognising and fusing with each other. In all cases, the healing of disturbed tissue and subsequent skeletal growth is an adaptation important for protecting colonies from invasion by parasites and other benthos following disturbance events, and may also serve to increase corallum strength. Knowledge of such adaptations is important in studies of coral behaviour during periods of environmental stress.
Abstract:
The Clay Minerals Society Source Clay kaolinites, Georgia KGa-1 and KGa-2, have been subjected to particle size determinations by 1) conventional sedimentation methods, 2) electron microscopy and image analysis, and 3) laser scattering using improved algorithms for the interaction of light with small particles. Particle shape, size distribution, and crystallinity vary considerably for each kaolinite. Replicate analyses of separated size fractions showed that in the <2 µm range, the sedimentation/centrifugation method of Tanner and Jackson (1947) is reproducible for different kaolinite types and that the calculated size ranges are in reasonable agreement with the size bins estimated from laser scattering. Particle sizes determined by laser scattering must be calculated using Mie theory when the dominant particle size is less than ∼5 µm. Based on this study of two well-known and structurally different kaolinites, laser scattering, with improved data reduction algorithms that include Mie theory, should be considered an internally consistent and rapid technique for clay particle sizing.
Abstract:
Linear adaptive channel equalization using the least mean square (LMS) algorithm and the recursive least-squares (RLS) algorithm is proposed for an innovative multi-user (MU) MIMO-OFDM wireless broadband communications system. The proposed equalization method adaptively compensates for the channel impairments caused by frequency selectivity in the propagation environment. Simulations of the proposed adaptive equalizer are conducted using a training sequence method to determine optimal performance through a comparative analysis. Results show an improvement of 0.15 in BER (at an SNR of 16 dB) when using adaptive equalization with the RLS algorithm compared to the case in which no equalization is employed. In general, adaptive equalization using the LMS and RLS algorithms proved to be significantly beneficial for MU-MIMO-OFDM systems.
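The LMS half of such a training-sequence equalizer can be sketched in a few lines (a minimal single-carrier toy with BPSK symbols and a hypothetical two-tap channel, not the paper's MU-MIMO-OFDM setup):

```python
import numpy as np

def lms_equalize(received, training, n_taps=4, mu=0.05):
    """Adapt a linear FIR equalizer with the LMS rule.

    received: channel output samples; training: known transmitted symbols.
    Returns the final tap weights and per-iteration squared errors.
    """
    w = np.zeros(n_taps)
    errors = []
    for k in range(n_taps - 1, len(training)):
        x = received[k - n_taps + 1:k + 1][::-1]  # most recent sample first
        y = w @ x                                 # equalizer output
        e = training[k] - y                       # error vs. known symbol
        w = w + mu * e * x                        # LMS weight update
        errors.append(e * e)
    return w, errors

# Toy training phase: BPSK symbols through a mild two-tap channel.
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=500)
channel = np.array([1.0, 0.4])                    # hypothetical channel taps
rx = np.convolve(symbols, channel)[:len(symbols)]
w, errs = lms_equalize(rx, symbols)
print(np.mean(errs[:50]) > np.mean(errs[-50:]))   # error shrinks as taps adapt
```

RLS replaces the fixed step `mu` with a recursively updated inverse-correlation matrix, trading extra computation for faster convergence, which is consistent with the BER gain the abstract reports for RLS.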
Abstract:
Recently there has been significant interest among researchers and practitioners in the use of Bluetooth as a complementary source of transport data. However, the literature offers a limited understanding of the Bluetooth MAC Scanner (BMS) based data acquisition process and the properties of the data being collected. This paper first provides insight into the BMS data acquisition process. Thereafter, it reports interesting findings from analysis of real BMS data from both motorway and arterial networks in Brisbane, Australia. The knowledge gained helps researchers and practitioners understand the BMS data being collected, which is vital to the development of management and control algorithms using the data.
Abstract:
Purpose: This article reports on a research project that explored social media best practice in the public library sector. Design/methodology/approach: The primary research approach for the project was case study. Two organisations participated in case studies that involved interviews, document analysis, and social media observation. Findings: The two case study organisations use social media effectively to facilitate participatory networks; however, there have been challenges surrounding its implementation in both organisations. Challenges include negotiating the requirements of governing bodies and broader organisational environments, and managing staff reluctance around the implementations. As social media use continues to grow and libraries continue to take up new platforms, social media must be considered another service point of the virtual branch, and indeed, of the library service as a whole. This acceptance of social media as core business is critical to the successful implementation of social media based activities. Practical implications: The article provides an empirically grounded discussion of best practice and the conditions that support it. The findings are relevant for information organisations across all sectors and could inform the development of policy and practice in other organisations. This paper contributes to the broader dialogue around best practice in participatory service delivery and social media use in library and information organisations.
Abstract:
Field robots often rely on laser range finders (LRFs) to detect obstacles and navigate autonomously. Despite recent progress in sensing technology and perception algorithms, adverse environmental conditions, such as the presence of smoke, remain a challenging issue for these robots. In this paper, we investigate the possibility of improving laser-based perception applications by anticipating situations in which laser data are affected by smoke, using supervised learning and state-of-the-art visual image quality analysis. We propose to train a k-nearest-neighbour (kNN) classifier to recognise situations where a laser scan is likely to be affected by smoke, based on visual data quality features. This method is evaluated experimentally using a mobile robot equipped with LRFs and a visual camera. The strengths and limitations of the technique are identified and discussed, and we show that the method is beneficial when conservative decisions are the most appropriate.
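The kNN classification step described above is straightforward to sketch (the two-dimensional "image quality" features and their values below are invented for illustration; the paper's feature set is richer):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points, using Euclidean distance on the feature vectors."""
    dists = sorted(
        (math.dist(feats, query), label) for feats, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical image-quality features: (contrast, sharpness), both in [0, 1].
train = [
    ((0.90, 0.80), "clear"), ((0.85, 0.90), "clear"), ((0.80, 0.75), "clear"),
    ((0.30, 0.20), "smoke"), ((0.25, 0.30), "smoke"), ((0.20, 0.15), "smoke"),
]
print(knn_predict(train, (0.28, 0.22)))  # → smoke
print(knn_predict(train, (0.88, 0.82)))  # → clear
```

When the "smoke" label is predicted, a conservative system would discount or discard the corresponding laser scan, which matches the paper's finding that the method pays off when conservative decisions are appropriate.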
Abstract:
This paper presents large, accurately calibrated and time-synchronised datasets, gathered outdoors in controlled environmental conditions, using an unmanned ground vehicle (UGV), equipped with a wide variety of sensors. It discusses how the data collection process was designed, the conditions in which these datasets have been gathered, and some possible outcomes of their exploitation, in particular for the evaluation of performance of sensors and perception algorithms for UGVs.
Abstract:
Voltage drop at network peak hours is a significant power quality problem in Low Voltage (LV) distribution feeders. Recently, voltage rise due to high penetration of photovoltaic cells (PVs) has been creating a new power quality problem during midday periods. In this paper, a voltage control strategy is proposed for household-installed PVs to regulate the voltage along the LV feeder. For this purpose, each PV is controlled to exchange reactive power with the grid. A droop control method is utilized to coordinate the reactive power exchange of the PVs. The proposed method provides decentralized local voltage support, since it is based only on local measurements and does not require any communication with other PVs. The required converter and filter structure and control algorithms are proposed to ensure the dynamic performance of the system. The study focuses on 3-phase PVs. The network is studied at peak and off-peak periods separately. The efficacy of the proposed voltage support concept is verified through numerical and dynamic analyses in MATLAB and PSCAD/EMTDC.
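The essence of a voltage-reactive power droop characteristic fits in a few lines (a generic sketch in per-unit quantities; the slope and converter limit are hypothetical, and the paper's converter-level control is far more detailed):

```python
def droop_reactive_power(v_local, v_ref=1.0, slope=2.0, q_max=0.6):
    """Voltage-reactive power droop for a PV inverter (per-unit values).

    Absorbs reactive power when the local voltage is above v_ref (midday
    PV-driven voltage rise) and injects it when below (peak-load voltage
    drop). Output is clamped to the converter's reactive capability q_max.
    """
    q = -slope * (v_local - v_ref)      # droop characteristic
    return max(-q_max, min(q_max, q))   # respect the converter rating

print(droop_reactive_power(1.05))  # overvoltage → absorb (negative Q)
print(droop_reactive_power(0.95))  # undervoltage → inject (positive Q)
```

Because each inverter reacts only to its own terminal voltage, no communication between PVs is needed, which is exactly the decentralization property the abstract emphasizes.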
Abstract:
This paper presents a new hybrid evolutionary algorithm based on Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) for daily Volt/Var control in distribution systems including Distributed Generators (DGs). Due to the small X/R ratio and radial configuration of distribution systems, DGs have a large impact on this problem. Since DGs are operated by independent power producers or are privately owned, a price-based methodology is proposed as a proper signal to encourage the owners of DGs to generate active power. Generally, daily Volt/Var control is a nonlinear optimization problem. Therefore, an efficient hybrid evolutionary method based on PSO and ACO, called HPSO, is proposed to determine the active power values of DGs, the reactive power values of capacitors and the tap positions of transformers for the next day. The feasibility of the proposed algorithm is demonstrated on the IEEE 34-bus distribution feeder and compared with methods based on the original PSO, ACO and GA algorithms.
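The PSO half of such a hybrid works as follows: each particle blends inertia with pulls toward its personal best and the swarm's global best. A minimal one-dimensional sketch (plain PSO on an invented toy objective, not the paper's HPSO hybrid or Volt/Var cost function):

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer for a 1-D objective f."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                      # each particle's personal best
    gbest = min(pos, key=f)             # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            vel[i] = (w * vel[i]
                      + c1 * rng.random() * (pbest[i] - pos[i])
                      + c2 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]
    return gbest

# Toy objective standing in for the Volt/Var cost: minimum at x = 2.
best = pso_minimize(lambda x: (x - 2.0) ** 2, bounds=(-10.0, 10.0))
print(best)
```

The real problem is mixed-integer (transformer taps, capacitor steps) and multi-variable, which is one motivation for hybridizing PSO with ACO's discrete, pheromone-guided search.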
Abstract:
RC4(n, m) is a stream cipher based on RC4, designed by G. Gong et al. It can be seen as a generalization of the famous RC4 stream cipher designed by Ron Rivest. The authors of RC4(n, m) claim that the cipher resists all the attacks that are successful against the original RC4. This paper reveals cryptographic weaknesses of the RC4(n, m) stream cipher. We develop two attacks. The first is based on the non-randomness of the internal state and allows the cipher to be distinguished from a truly random one by an algorithm that has access to 2^(4n) bits of the keystream. The second attack exploits the low diffusion of bits in the KSA and PRGA algorithms and recovers all bytes of the secret key. This attack works only if the initial value of the cipher can be manipulated. Apart from the secret key, the cipher uses two other inputs, namely the initial value and the initial vector. Although these inputs are fixed in the cipher specification, some applications may allow them to be under the attacker's control. Assuming that the attacker can control the initial value, we show a distinguisher for the cipher and a secret key recovery attack that, for an L-bit secret key, is able to recover it in about (L/n) · 2^n steps. The attack has been implemented on a standard PC and can reconstruct the secret key of RC4(8, 32) in less than a second.
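For reference, the original RC4 that RC4(n, m) generalizes consists of the two algorithms named in the abstract: the KSA, which scrambles a 256-entry permutation under the key, and the PRGA, which walks that permutation to emit keystream bytes (this is standard RC4, not the RC4(n, m) variant under attack):

```python
def rc4_keystream(key, n_bytes):
    """Original RC4: KSA then PRGA, returning n_bytes of keystream."""
    # Key-Scheduling Algorithm (KSA): key-dependent permutation of 0..255.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-Random Generation Algorithm (PRGA): emit keystream bytes.
    i = j = 0
    out = []
    for _ in range(n_bytes):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# Well-known test vector: key "Key" yields keystream EB 9F 77 81 B7 34 CA 72 A7.
print(rc4_keystream(b"Key", 9).hex())  # → eb9f7781b734ca72a7
```

RC4(n, m) replaces the byte-sized state words with n-bit indices and m-bit entries; the paper's attacks exploit how slowly key bits diffuse through this generalized KSA/PRGA pair.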
Abstract:
We present a text watermarking scheme that embeds a bitstream watermark W in a text document P while preserving the meaning, context, and flow of the document. The document is viewed as a set of paragraphs, each paragraph being a set of sentences. The sequence of paragraphs and sentences used to embed watermark bits is permuted using a secret key. Then, English-language sentence transformations are used to modify sentence lengths, thus embedding watermark bits in the Least Significant Bits (LSBs) of the sentences’ cardinalities. The embedding and extracting algorithms are public, while the secrecy and security of the watermark depend on a secret key K. The probability of false positives is extremely small, avoiding incidental occurrences of the watermark in random text documents. Majority voting provides security against text addition, deletion, and swapping attacks, further reducing the probability of false positives. The scheme is secure against general attacks on text watermarks such as reproduction (photocopying, fax), reformatting, synonym substitution, text addition, text deletion, text swapping, paragraph shuffling and collusion attacks.
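The core embedding idea, encoding each watermark bit in the least significant bit of a sentence's word count, can be illustrated with a toy parity scheme (this sketch uses a crude hypothetical filler-word insertion and no keyed permutation; the paper uses meaning-preserving sentence transformations and a secret-key permutation):

```python
def embed_bits(sentences, bits, filler="indeed"):
    """Toy LSB embedding: force each sentence's word count to have the
    parity of its watermark bit, inserting a filler word when needed."""
    out = []
    for sent, bit in zip(sentences, bits):
        words = sent.split()
        if len(words) % 2 != bit:       # cardinality LSB != bit → adjust
            words.insert(1, filler)
        out.append(" ".join(words))
    return out

def extract_bits(sentences):
    """Extraction: read the LSB of each sentence's word count."""
    return [len(s.split()) % 2 for s in sentences]

text = ["The quick brown fox jumps.", "It was a calm day."]
marked = embed_bits(text, [1, 0])
print(extract_bits(marked))  # → [1, 0]
```

Because extraction only counts words, it survives reformatting and photocopying; the keyed permutation and majority voting described in the abstract are what make the bit positions secret and robust to edits.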