357 results for Proximal Point Algorithm
Abstract:
This paper presents practical vision-based collision avoidance for objects approximating a single point feature. Using a spherical camera model, a visual predictive control scheme guides the aircraft around the object along a conical spiral trajectory. Visibility, state and control constraints are considered explicitly in the controller design by combining image and vehicle dynamics in the process model, and solving the nonlinear optimization problem over the resulting state space. Importantly, range is not required. Instead, the principles of conical spiral motion are used to design an objective function that simultaneously guides the aircraft along the avoidance trajectory, whilst providing an indication of the appropriate point to stop the spiral behaviour. Our approach is aimed at providing a potential solution to the See and Avoid problem for unmanned aircraft and is demonstrated through a series.
Abstract:
Flood flows in inundated urban environments constitute a natural hazard. During the 12–13 January 2011 flood of the Brisbane River, detailed water elevation, velocity and suspended sediment data were recorded in an inundated street at the peak of the flood. The field observations highlighted a number of unusual flow interactions with the urban surroundings. These included slow fluctuations in water elevation and velocity, with distinctive periods between 50 and 100 s, caused by a local topographic effect (choking), superposed with fast turbulent fluctuations. The suspended sediment data highlighted significant suspended sediment loads in the inundated zone.
Abstract:
In this paper, we propose a semi-supervised approach to anomaly detection in Online Social Networks. The social network is modeled as a graph, and features are extracted from it to detect anomalies. A clustering algorithm is then used to group users based on these features, and fuzzy logic is applied to assign a degree of anomalous behavior to the users in these clusters. Empirical analysis shows the effectiveness of this method.
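The abstract above does not specify how the fuzzy degree of anomalous behavior is computed; a minimal sketch of one plausible reading, in which a user's degree is its normalised distance from its cluster centre, is given below. The feature vectors and centre are invented for illustration.

```python
# Illustrative sketch only (not the paper's exact method): after clustering
# user feature vectors, assign each user a fuzzy degree of anomalous
# behaviour in [0, 1] based on its distance from the cluster centre.

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def anomaly_degrees(users, centre):
    """Scale each user's distance from the cluster centre into [0, 1]."""
    dists = [euclidean(u, centre) for u in users]
    d_max = max(dists) or 1.0          # avoid division by zero
    return [d / d_max for d in dists]

users = [(1.0, 1.0), (1.1, 0.9), (5.0, 5.0)]   # hypothetical feature vectors
degrees = anomaly_degrees(users, centre=(1.0, 1.0))
```

The outlier at (5.0, 5.0) receives degree 1.0, while users near the centre receive degrees close to 0.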
Abstract:
In the real world, many problems in networks of networks (NoNs) can be abstracted to a so-called minimum interconnection cut problem, which is fundamentally different from the classical minimum cut problems in graph theory. It is therefore desirable to have an efficient and effective algorithm for the minimum interconnection cut problem. In this paper we formulate the problem in graph-theoretic terms, transform it into a multi-objective, multi-constraint combinatorial optimization problem, and propose a hybrid genetic algorithm (HGA) for it. The HGA is a penalty-based genetic algorithm (GA) that incorporates an effective heuristic procedure to locally optimize the individuals in the GA's population. The HGA has been implemented and evaluated experimentally, and the results show that it is both effective and efficient.
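To make the "penalty-based GA" idea concrete, here is a minimal skeleton on a toy graph-partitioning problem: infeasible individuals are not discarded but penalised in the fitness function. This is only a generic GA sketch under invented parameters; it omits the paper's heuristic local-optimization step and crossover, and the toy objective stands in for the actual minimum interconnection cut formulation.

```python
import random

def cut_weight(assign, edges):
    """Total weight of edges crossing the two sides of the partition."""
    return sum(w for u, v, w in edges if assign[u] != assign[v])

def fitness(assign, edges, penalty=100):
    """Cut weight, plus a penalty when the 'both sides non-empty' constraint fails."""
    f = cut_weight(assign, edges)
    if sum(assign) in (0, len(assign)):   # constraint violated: one side is empty
        f += penalty
    return f

def ga(edges, n, pop=20, gens=60, seed=0):
    rng = random.Random(seed)
    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: fitness(a, edges))
        parents = popn[:pop // 2]                 # truncation selection
        children = []
        for a in parents:
            child = a[:]
            child[rng.randrange(n)] ^= 1          # bit-flip mutation
            children.append(child)
        popn = parents + children
    return min(popn, key=lambda a: fitness(a, edges))

# Two dense clusters joined by one light edge; the best cut severs that edge.
edges = [(0, 1, 5), (1, 2, 5), (0, 2, 5), (3, 4, 5), (4, 5, 5), (3, 5, 5), (2, 3, 1)]
best = ga(edges, n=6)
```

The penalty keeps the search inside (or near) the feasible region without hard-coding feasibility into the representation, which is the defining trait of a penalty-based GA.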
Abstract:
Energy prices are highly volatile and often feature unexpected spikes. The aim of this paper is to examine whether the occurrence of these extreme price events displays any regularities that can be captured by an econometric model. We treat these price events as point processes and apply Hawkes and Poisson autoregressive models to capture the dynamics of the intensity of the process. We use load and meteorological information to model the time variation in the intensity of the process. The models are applied to data from the Australian wholesale electricity market, and a forecasting exercise illustrates both the usefulness of these models and their limitations when attempting to forecast the occurrence of extreme price events.
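The self-exciting structure of a Hawkes process can be sketched directly from its standard conditional intensity, lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)): each past spike temporarily raises the probability of another. The event times and parameter values below are illustrative, not estimates from the paper.

```python
import math

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a univariate Hawkes process with an
    exponential kernel: baseline mu plus decaying excitation from past events."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

events = [1.0, 1.5, 4.0]   # hypothetical spike times
lam = hawkes_intensity(2.0, events, mu=0.2, alpha=0.8, beta=1.0)
```

Before any event the intensity equals the baseline mu; just after a cluster of spikes it is elevated, which is exactly the clustering of extreme price events the models are meant to capture.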
Abstract:
A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with practical applications in domains such as security surveillance and health care, it suffers from severe constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation develops a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks, and of optimal camera configuration determination. Addressing the first problem, multi-object tracking and localisation, requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are calibration of internal parameters, multi-camera calibration for localisation, and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements.
Camera internal parameters, including focal length, principal point, lens distortion parameter and the angle and axis of rotation, can be recovered from a minimum set of two images taken by the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for the calibration of a network of non-overlapping DWSCs in terms of their ground plane homographies, which can then be used for localising objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours that are determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path, using a probability maximisation process with locally generated descriptions. The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types, etc. of the cameras must be chosen so that the utility of the network is maximised (e.g. maximum coverage) while user requirements are met.
To deal with this, a statistical formulation of the problem of determining optimal camera configurations has been introduced and a Trans-Dimensional Simulated Annealing (TDSA) algorithm has been proposed to effectively solve the problem.
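The annealing idea behind the camera-configuration search can be sketched in a deliberately simplified form: the dissertation's TDSA also varies the *number* of cameras (the trans-dimensional part), while the fixed-dimension baseline below only moves a fixed set of cameras on a 1D grid to maximise coverage. Grid size, reach, and the cooling schedule are all invented for illustration.

```python
import math
import random

def coverage(positions, reach, n_cells):
    """Number of distinct grid cells covered by cameras at the given positions."""
    covered = set()
    for p in positions:
        covered.update(range(max(0, p - reach), min(n_cells, p + reach + 1)))
    return len(covered)

def anneal(n_cams, reach, n_cells, steps=2000, seed=1):
    """Fixed-dimension simulated annealing: move one camera per step and
    accept worse moves with a temperature-dependent probability."""
    rng = random.Random(seed)
    pos = [rng.randrange(n_cells) for _ in range(n_cams)]
    best, best_cov = pos[:], coverage(pos, reach, n_cells)
    temp = 1.0
    for _ in range(steps):
        cand = pos[:]
        cand[rng.randrange(n_cams)] = rng.randrange(n_cells)   # move one camera
        delta = coverage(cand, reach, n_cells) - coverage(pos, reach, n_cells)
        if delta >= 0 or rng.random() < math.exp(delta / max(temp, 1e-9)):
            pos = cand
        temp *= 0.995                                          # geometric cooling
        cov = coverage(pos, reach, n_cells)
        if cov > best_cov:
            best, best_cov = pos[:], cov
    return best, best_cov

positions, cov = anneal(n_cams=3, reach=5, n_cells=33)
```

A trans-dimensional variant would additionally propose "add a camera" and "remove a camera" moves, with the objective trading coverage against the cost of extra cameras.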
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, together with a detailed complexity and accuracy analysis. For time complexity, it is shown that the average case complexity of the algorithm is \Theta(n^2), and the best and worst cases are \Omega(n) and O(n^3) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
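To make the Eden (pre-image existence) question concrete, here is an exponential-time brute-force baseline on the simplest setting, an elementary 1D binary cellular automaton with periodic boundaries; the paper's neighborhood-elimination algorithm for graph cellular automata avoids this full enumeration and is not reproduced here.

```python
from itertools import product

def step(config, rule):
    """One synchronous update of an elementary CA (Wolfram rule number),
    with periodic boundary conditions."""
    n = len(config)
    return tuple(
        (rule >> (4 * config[(i - 1) % n] + 2 * config[i] + config[(i + 1) % n])) & 1
        for i in range(n)
    )

def has_preimage(config, rule):
    """Exhaustively test whether any configuration maps onto `config`
    (exponential in n -- the baseline the paper improves on)."""
    target = tuple(config)
    return any(step(p, rule) == target
               for p in product((0, 1), repeat=len(config)))
```

For example, rule 204 is the identity rule, so every configuration has a pre-image (itself), while rule 0 maps everything to all-zeros, making any configuration containing a 1 a Garden-of-Eden configuration.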
Abstract:
Bioacoustic data can provide an important base for environmental monitoring. To explore the large volume of field recordings collected, an automated similarity search algorithm is presented in this paper. A region of an audio recording, defined by frequency and time bounds, is provided by a user; the content of the region is used to construct a query. In the retrieval process, our algorithm automatically scans through recordings to search for similar regions. In detail, we present a feature extraction approach based on the visual content of vocalisations – in this case ridges – and develop a generic regional representation of vocalisations for indexing. Our feature extraction method works best for bird vocalisations showing ridge characteristics. The regional representation method allows the content of an arbitrary region of a continuous recording to be described in a compressed format.
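The retrieval step of such a system can be sketched generically: once each region is summarised by a feature vector, candidate regions are ranked by their similarity to the query vector. Cosine similarity and the region names and feature values below are illustrative assumptions, not the paper's actual ridge-based features.

```python
# Generic sketch of the ranking step in a region-based similarity search.
# Region identifiers and feature values are hypothetical.

def cosine(a, b):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def search(query, regions):
    """Return region ids sorted by decreasing similarity to the query."""
    return sorted(regions, key=lambda r: cosine(query, regions[r]), reverse=True)

regions = {"rec1@3s": (0.9, 0.1, 0.4),
           "rec2@10s": (0.1, 0.8, 0.2),
           "rec3@7s": (0.85, 0.2, 0.5)}
ranking = search((1.0, 0.0, 0.5), regions)
```

The compressed regional representation described in the abstract is what makes scanning long continuous recordings with such a ranking feasible.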
Abstract:
Many grid-connected PV installations consist of a single series string of PV modules and a single DC-AC inverter. The efficiency of this topology can be enhanced with additional low-power, low-cost per-panel converter modules. Most current flows directly in the series string, which ensures high efficiency. However, parallel Ćuk or buck-boost DC-DC converters connected across each adjacent pair of modules can now support any desired current difference between series-connected PV modules. Each converter "shuffles" the desired difference in PV module currents between two modules, and so on up the string. Spice simulations show that even with poor efficiency, these modules can significantly improve the overall power that can be recovered from partially shaded PV strings.
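A back-of-envelope sketch shows why shuffling helps: without converters, the string current is limited by the weakest (shaded) module, whereas with pair-wise converters each module can run near its own maximum power point and converter losses apply only to the shuffled current difference. This is a rough energy-balance model with invented numbers; it ignores voltage readjustment and MPP coupling, and is not the paper's Spice simulation.

```python
def power_without_shuffle(currents, v_mod):
    """Plain series string: current is limited by the weakest module."""
    return min(currents) * v_mod * len(currents)

def power_with_shuffle(currents, v_mod, conv_eff=0.85):
    """Each module at its own MPP current; each pair-wise converter carries
    only the current *difference*, so losses apply only to that part."""
    ideal = sum(i * v_mod for i in currents)
    loss = sum((1 - conv_eff) * abs(a - b) * v_mod
               for a, b in zip(currents, currents[1:]))
    return ideal - loss

currents = [8.0, 8.0, 3.0, 8.0]   # amps; one partially shaded module (hypothetical)
v_mod = 30.0                       # per-module voltage in volts (hypothetical)
```

Even at a modest 85% converter efficiency, the recovered power in this toy example roughly doubles, consistent with the abstract's claim that poor-efficiency shuffle converters still give a significant improvement under partial shading.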
Abstract:
An experimental dataset representing a typical flow field in a stormwater gross pollutant trap (GPT) was visualised. A technique was developed to apply the image-based flow visualisation (IBFV) algorithm to the raw dataset. Particle image velocimetry (PIV) software was previously used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding stormwater pollutant capture and retention behaviour within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate the possible flow paths of pollutants entering the GPT. The investigated flow paths were compared with the behaviour of pollutants monitored during experiments.
Abstract:
An Application Specific Instruction-set Processor (ASIP) is a specialized processor tailored to run a particular application, or set of applications, efficiently. However, when there are multiple candidate applications in the application domain, it is difficult and time-consuming to find the optimum set of applications to implement. Existing ASIP design approaches perform this selection manually, based on a designer's knowledge. We help cut down the number of candidate applications by devising a classification method that clusters similar applications based on the special-purpose operations they share. This provides a significant reduction in the comparison overhead while resulting in customized ASIP instruction sets which can benefit a whole family of related applications. Our method gives users the ability to quantify the degree of similarity between the sets of shared operations, in order to control the size of clusters. A case study involving twelve algorithms confirms that our approach can successfully cluster similar algorithms together based on the similarity of their component operations.
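The clustering idea can be sketched with set similarity on operation sets: the abstract does not name its similarity measure, so Jaccard similarity with a user-set threshold is assumed here, and the application and operation names are made up for illustration.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of operations."""
    return len(a & b) / len(a | b) if a | b else 1.0

def cluster(apps, threshold):
    """Greedy single-pass clustering: join an application to the first
    cluster whose seed application is at least `threshold` similar."""
    clusters = []
    for name, ops in apps.items():
        for c in clusters:
            if jaccard(ops, apps[c[0]]) >= threshold:
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

apps = {
    "fft": {"mul_add", "bit_reverse", "twiddle"},
    "dct": {"mul_add", "twiddle", "scale"},
    "crc": {"xor_shift", "table_lookup"},
}
groups = cluster(apps, threshold=0.4)
```

Raising the threshold yields more, smaller clusters; lowering it merges applications that share fewer operations, which matches the abstract's point that users can tune similarity to control cluster size.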
Abstract:
Several fringing coral reefs in Moreton Bay, Southeast Queensland, some 300 km south of the Great Barrier Reef (GBR), are set in a relatively high latitude, estuarine environment that is considered marginal for coral growth. Previous work indicated that these marginal reefs, as with many fringing reefs of the inner GBR, ceased accreting in the mid-Holocene. This research presents, for the first time, data from the subsurface profile of the mid-Holocene fossil reef at Wellington Point, comprising U/Th dates of in situ and framework corals, and trace element analyses of the age-constrained carbonate fragments. Based on trace element proxies, the palaeo-water quality during reef accretion was reconstructed. Results demonstrate that the reef initiated more than 7,000 yr BP during the post-glacial transgression, and that initiation progressed to the west as sea level rose. In situ micro-atolls indicate that sea level was at least 1 m above present mean sea level by 6,680 years ago. The reef remained in "catch-up" mode, with a seaward-sloping upper surface, until it stopped aggrading abruptly at ca 6,000 yr BP; no lateral progradation occurred. Changes in sediment composition encountered in the cores suggest that after the laterite substrate was covered by the reef, most of the sediment was produced by the carbonate factory with minimal terrigenous influence. Rare earth element, Y and Ba proxies indicate that water quality during reef accretion was similar to that of oceanic waters, considered suitable for coral growth. A slight decline in water quality, indicated by increased Ba in the later stages of growth, may be related to increased riverine input and partial closure of the bay due to tidal delta progradation, climatic change and/or a slight sea level fall.
The age data suggest that termination of reef growth coincided with a slight lowering of sea level, activation of ENSO and a consequent increase in seasonality, lowering of temperatures, and constriction of oceanic flushing. At the cessation of reef accretion, the environmental conditions in western Moreton Bay were changing from open marine to estuarine. The living coral community appears to be similar to the fossil community, but without the branching Acropora spp. that were more common in the fossil reef. In this marginal setting, coral growth periods do not always correspond to periods of reef accretion, owing to insufficient coral abundance. Due to several environmental constraints, modern coral growth is insufficient for reef growth. Based on these findings, Moreton Bay may be unsuitable as a long-term coral refuge for most species currently living in the GBR.
Abstract:
The mammalian target of rapamycin (mTOR) is a highly conserved atypical serine-threonine kinase that controls numerous functions essential for cell homeostasis and adaptation in mammalian cells via two distinct protein complexes. Moreover, mTOR is a key regulatory protein in the insulin signalling cascade and has also been characterized as an insulin-independent nutrient sensor that may represent a critical mediator in obesity-related impairments of insulin action in skeletal muscle. Exercise is a remedial modality that enhances mTOR activity and subsequently promotes beneficial metabolic adaptation in skeletal muscle. Thus, the metabolic effects of nutrients and exercise have the capacity to converge at the mTOR protein complexes and subsequently modify mTOR function. Accordingly, the aim of the present review is to highlight the role of mTOR in the regulation of insulin action in response to overnutrition, and the capacity for exercise to enhance mTOR activity in skeletal muscle.
Abstract:
The paper introduces the design of robust current and voltage control algorithms for a grid-connected three-phase inverter that is interfaced to the grid through a high-bandwidth three-phase LCL filter. The algorithms are based on state feedback control, designed in a systematic approach and improved by using oversampling to deal with the issues arising from the high-bandwidth filter. An adaptive loop delay compensation method has also been adopted to minimize the adverse effects of loop delay in the digital controller and to increase the robustness of the control algorithm in the presence of parameter variations. Simulation results are presented to validate the effectiveness of the proposed algorithm.
Abstract:
Multi-objective optimization of a benchmark cogeneration system, known as the CGAM cogeneration system, has been performed. In the optimization approach, the thermoeconomic and environmental aspects have been considered simultaneously. The environmental objective function has been defined and expressed in cost terms. One of the most suitable optimization techniques, developed using a particular class of search algorithms known as Multi-Objective Particle Swarm Optimization (MOPSO), has been used here. This approach has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of fuzzy decision-making with the aid of the Bellman-Zadeh approach has been presented, and a final optimal solution has been introduced.
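The core of any such Pareto-based approach is the dominance test used to keep the non-dominated set. The sketch below shows only that test for two minimised objectives (e.g. thermoeconomic cost and environmental cost); the candidate design points are hypothetical, and MOPSO itself is not reproduced.

```python
def dominates(a, b):
    """True if design `a` is at least as good as `b` in every (minimised)
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated designs (the Pareto optimal set)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (thermoeconomic cost, environmental cost) pairs.
designs = [(10.0, 5.0), (8.0, 7.0), (9.0, 6.0), (11.0, 6.5)]
front = pareto_front(designs)
```

A fuzzy decision-making step such as the Bellman-Zadeh approach then selects a single compromise solution from this front.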