335 results for well-structured transition systems


Relevance:

30.00%

Publisher:

Abstract:

Surveillance systems such as object tracking and abandoned object detection systems typically rely on a single modality of colour video for their input. These systems work well in controlled conditions but often fail when low lighting, shadowing, smoke, dust or unstable backgrounds are present, or when the objects of interest are a similar colour to the background. Thermal images are not affected by lighting changes or shadowing, and are not overtly affected by smoke, dust or unstable backgrounds. However, thermal images lack colour information which makes distinguishing between different people or objects of interest within the same scene difficult.

By using modalities from both the visible and thermal infrared spectra, we are able to obtain more information from a scene and overcome the problems associated with using either modality individually. We evaluate four approaches for fusing visual and thermal images for use in a person tracking system (two early fusion methods, one mid fusion and one late fusion method), in order to determine the most appropriate method for fusing multiple modalities. We also evaluate two of these approaches for use in abandoned object detection, and propose an abandoned object detection routine that utilises multiple modalities. To aid in the tracking and fusion of the modalities we propose a modified condensation filter that can dynamically change the particle count and features used according to the needs of the system.

We compare tracking and abandoned object detection performance for the proposed fusion schemes and the visual and thermal domains on their own. Testing is conducted using the OTCBVS database to evaluate object tracking, and data captured in-house to evaluate the abandoned object detection. Our results show that significant improvement can be achieved, and that a middle fusion scheme is most effective.
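As a rough, illustrative sketch of the middle (score-level) fusion idea described in this abstract, the snippet below fuses per-pixel foreground confidence maps from co-registered visible and thermal frames into a single detection mask. The background models, weights and threshold are placeholder assumptions, not the formulation used in the thesis.

```python
import numpy as np

def foreground_confidence(frame, background, scale=30.0):
    """Per-pixel foreground confidence in [0, 1] from a simple
    background-difference model for one modality (placeholder model)."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    return np.clip(diff / scale, 0.0, 1.0)

def mid_fusion_mask(visible, thermal, bg_visible, bg_thermal,
                    w_visible=0.5, w_thermal=0.5, threshold=0.4):
    """Fuse the two confidence maps at the score level, then threshold
    the fused map into a single foreground mask (assumes co-registered,
    equally sized greyscale frames)."""
    c_vis = foreground_confidence(visible, bg_visible)
    c_thm = foreground_confidence(thermal, bg_thermal)
    fused = w_visible * c_vis + w_thermal * c_thm
    return fused >= threshold
```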

Relevance:

30.00%

Publisher:

Abstract:

The phenomenon of organizations offering service bundles can typically be observed in dynamic markets with heterogeneous customer demand. The available literature on service bundling covers strategic considerations for organizations related to their market position, as well as pricing options for different bundle configurations. However, little guidance can be found regarding the identification of potential bundle candidates and the actual process of bundling. In this paper, we present an approach to service bundling that organizations can use to identify services that are suitable for bundling. The contribution of the paper is twofold. Firstly, the proposed method, grounded in empirical findings, offers organizations a structured conceptualization approach that facilitates the creation of bundles in practice. Secondly, from a Design Science research perspective, the proposed method represents an innovative artifact that extends the academic knowledge base related to service management.

Relevance:

30.00%

Publisher:

Abstract:

Creating sustainable urban environments is a challenging issue that requires a clear vision and implementation strategies, involving changes in governmental values and in the decision-making processes of local governments. In particular, the internalisation of the environmental externalities of daily urban activities (e.g. manufacturing, transportation and so on) is of immense importance, and local policies are formulated around it to provide better living conditions for the people inhabiting urban areas. Even when environmental problems are defined succinctly by various stakeholders, the complicated nature of sustainability issues demands a structured evaluation strategy and well-defined sustainability parameters for efficient and effective policy making. Following this reasoning, this study assesses the sustainability performance of urban settings, focusing mainly on environmental problems caused by rapid urban expansion and transformation. By taking into account the interaction between land use and transportation, it seeks to reveal how future urban developments would alter people's daily urban travel behaviour and affect the urban and natural environments. The paper introduces a grid-based indexing method developed for this research and trialled as a GIS-based decision support tool to analyse and model selected spatial and aspatial indicators of sustainability in the Gold Coast. This process reveals site-specific relationships among the selected indicators, which are used to evaluate index-based performance characteristics of the area. The evaluation is made through an embedded decision support module by assigning relative weights to the indicators. The resolution of the selected grid-based unit of analysis provides insights into the service level of projected urban development proposals at a disaggregate level, such as accessibility to transportation and urban services, and pollution. The paper concludes by discussing the findings, including the capacity of the decision support system to assist decision-makers in identifying problematic areas and developing intervention policies for sustainable outcomes of future developments.
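To make the index construction concrete, here is a minimal sketch of a grid-based composite indicator of the kind described above: each cell receives a weighted sum of normalised indicator layers. The indicator names, weights and flagging threshold are hypothetical placeholders, not the values used in the study.

```python
import numpy as np

# Hypothetical indicator layers on the same analysis grid (rows x cols);
# the real indicators, units and weights would come from the study itself.
accessibility = np.random.rand(50, 50)   # e.g. access to transit and services
pollution     = np.random.rand(50, 50)   # e.g. modelled emission exposure
land_use_mix  = np.random.rand(50, 50)   # e.g. diversity of land uses

def normalise(layer):
    """Rescale an indicator layer to [0, 1] so that layers are comparable."""
    lo, hi = layer.min(), layer.max()
    return (layer - lo) / (hi - lo) if hi > lo else np.zeros_like(layer)

def composite_index(layers, weights):
    """Weighted sum of normalised indicator layers, computed per grid cell."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    stacked = np.stack([normalise(layer) for layer in layers])
    return np.tensordot(weights, stacked, axes=1)

# Pollution counts against sustainability, so it enters as its complement.
index = composite_index(
    [accessibility, 1.0 - normalise(pollution), land_use_mix],
    weights=[0.4, 0.4, 0.2])
problem_cells = index < 0.3   # flag cells as candidates for intervention policies
```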

Relevance:

30.00%

Publisher:

Abstract:

Offering service bundles to the market is a promising option for service providers to strengthen their competitive advantage, cope with dynamic market conditions and deal with heterogeneous consumer demand. Although the expected positive effects of bundling strategies and pricing considerations for bundles are well covered by the available literature, limited guidance can be found regarding the identification of potential bundle candidates and the actual process of bundling. The contribution of this paper is the positioning of bundling based on insights from both business and computer science, and the proposition of a structured bundling method that guides organizations in composing bundles in practice.

Relevance:

30.00%

Publisher:

Abstract:

Offering service bundles to the market is a promising option for service providers to strengthen their competitive advantage, cope with dynamic market conditions and deal with heterogeneous consumer demand. Although the expected positive effects of bundling strategies and pricing considerations for bundles are well covered by the available literature, limited guidance can be found regarding the identification of potential bundle candidates and the actual process of bundling. The proposed research aims to fill this gap by offering a service bundling method complemented by a proof-of-concept prototype, which extends the existing knowledge base in the multidisciplinary research area of Information Systems and Service Science and provides organisations with a structured approach for bundling services.

Relevance:

30.00%

Publisher:

Abstract:

We present a novel approach for preprocessing systems of polynomial equations via graph partitioning. The variable-sharing graph of a system of polynomial equations is defined. If such a graph is disconnected, then the corresponding system of equations can be split into smaller ones that can be solved individually. This can provide a tremendous speed-up in computing the solution to the system, but such a decomposition is unlikely to occur either randomly or in applications. However, by deleting certain vertices of the graph, the variable-sharing graph can be disconnected in a balanced fashion, and in turn the system of polynomial equations can be separated into smaller systems of near-equal sizes. In graph-theoretic terms, this process is equivalent to finding balanced vertex partitions with minimum-weight vertex separators. Techniques for finding these vertex partitions are discussed, and experiments are performed to evaluate their practicality for general graphs and systems of polynomial equations. Applications of this approach to algebraic cryptanalysis of symmetric ciphers are presented: for the QUAD family of stream ciphers, we show how a malicious party can manufacture conforming systems that can be easily broken. For the stream ciphers Bivium and Trivium, we achieve significant speedups in algebraic attacks against them, mainly in a partial key guess scenario. In each of these cases, the systems of polynomial equations involved are well suited to our graph partitioning method. These results may open a new avenue for evaluating the security of symmetric ciphers against algebraic attacks.
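As a small illustration of the disconnected case described above (the balanced vertex-separator step is not shown), the sketch below builds a variable-sharing graph and splits a system into independent subsystems, one per connected component. Each equation is represented only by the set of variables it contains, and the example system is made up for illustration.

```python
import networkx as nx
from itertools import combinations

def variable_sharing_graph(system):
    """Build the variable-sharing graph: one node per variable, with an
    edge between two variables whenever they occur in the same equation."""
    G = nx.Graph()
    for equation_vars in system:
        G.add_nodes_from(equation_vars)
        G.add_edges_from(combinations(equation_vars, 2))
    return G

def split_system(system):
    """Group equations by the connected component their variables fall in;
    each group can then be solved independently."""
    G = variable_sharing_graph(system)
    components = list(nx.connected_components(G))
    groups = [[] for _ in components]
    for eq in system:
        for i, comp in enumerate(components):
            if set(eq) <= comp:
                groups[i].append(eq)
                break
    return groups

# Toy system: {x1, x2, x3} and {x4, x5} never share an equation,
# so the system splits into two independent subsystems.
system = [{"x1", "x2"}, {"x2", "x3"}, {"x4", "x5"}]
print(split_system(system))
```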

Relevance:

30.00%

Publisher:

Abstract:

World economies increasingly demand reliable and economical power supply and distribution. To achieve this aim, the majority of power systems are becoming interconnected, with several power utilities supplying the one large network. One problem that occurs in a large interconnected power system is the regular occurrence of system disturbances, which can result in the creation of intra-area oscillating modes. These modes can be regarded as the transient responses of the power system to excitation, which are generally characterised as decaying sinusoids. For a power system operating ideally, these transient responses would have a “ring-down” time of 10-15 seconds. Sometimes equipment failures disturb the ideal operation of power systems and oscillating modes with ring-down times greater than 15 seconds arise. The larger settling times associated with such “poorly damped” modes cause substantial power flows between generation nodes, resulting in significant physical stresses on the power distribution system. If these modes are not just poorly damped but “negatively damped”, catastrophic failures of the system can occur.

To ensure the stability and security of large power systems, the potentially dangerous oscillating modes generated by disturbances (such as equipment failure) must be quickly identified. The power utility must then apply appropriate damping control strategies. In power system monitoring there are two facets of critical interest. The first is the estimation of modal parameters for a power system in normal, stable operation. The second is the rapid detection of any substantial changes to this normal, stable operation (because of equipment breakdown, for example). Most work to date has concentrated on the first of these two facets, i.e. on modal parameter estimation. Numerous modal parameter estimation techniques have been proposed and implemented, but all have limitations [1-13]. One of the key limitations of all existing parameter estimation methods is that they require very long data records to provide accurate parameter estimates. This is a particularly significant problem after a sudden detrimental change in damping: one simply cannot afford to wait long enough to collect the large amounts of data required by existing parameter estimators.

Motivated by this gap in the current body of knowledge and practice, the research reported in this thesis focuses heavily on rapid detection of changes (i.e. on the second facet mentioned above). This thesis reports on a number of new algorithms which can rapidly flag whether or not there has been a detrimental change to a stable operating system. It will be seen that the new algorithms enable sudden modal changes to be detected within quite short time frames (typically about 1 minute), using data from power systems in normal operation. The new methods reported in this thesis are summarised below.

The Energy Based Detector (EBD): The rationale for this method is that the modal disturbance energy is greater for lightly damped modes than it is for heavily damped modes (because the latter decay more rapidly). Sudden changes in modal energy, then, imply sudden changes in modal damping. Because the method relies on data from power systems in normal operation, the modal disturbances are random. Accordingly, the disturbance energy is modelled as a random process (with the parameters of the model being determined from the power system under consideration). A threshold is then set based on the statistical model. The energy method is very simple to implement and is computationally efficient. It is, however, only able to determine whether or not a sudden modal deterioration has occurred; it cannot identify which mode has deteriorated. For this reason the method is particularly well suited to smaller interconnected power systems that involve only a single mode.

Optimal Individual Mode Detector (OIMD): As discussed in the previous paragraph, the energy detector can only determine whether or not a change has occurred; it cannot flag which mode is responsible for the deterioration. The OIMD seeks to address this shortcoming. It uses optimal detection theory to test for sudden changes in individual modes. In practice, one can have an OIMD operating for all modes within a system, so that changes in any of the modes can be detected. Like the energy detector, the OIMD is based on a statistical model and a subsequently derived threshold test.

The Kalman Innovation Detector (KID): This detector is an alternative to the OIMD. Unlike the OIMD, however, it does not explicitly monitor individual modes. Rather, it relies on a key property of a Kalman filter, namely that the Kalman innovation (the difference between the estimated and observed outputs) is white as long as the Kalman filter model is valid. A Kalman filter model is set up to represent a particular power system. If some event in the power system (such as equipment failure) causes a sudden change to the power system, the Kalman model will no longer be valid and the innovation will no longer be white. Furthermore, if there is a detrimental system change, the innovation spectrum will display strong peaks at the frequency locations associated with the changes. Hence the innovation spectrum can be monitored both to set off an “alarm” when a change occurs and to identify which modal frequency has given rise to the change. The threshold for alarming is based on the simple Chi-Squared PDF for a normalised white noise spectrum [14, 15]. While the method can identify the mode which has deteriorated, it does not necessarily indicate whether there has been a frequency or damping change. The PPM, discussed next, can monitor frequency changes and so can provide some discrimination in this regard.

The Polynomial Phase Method (PPM): In [16] the cubic phase (CP) function was introduced as a tool for revealing frequency-related spectral changes. This thesis extends the cubic phase function to a generalised class of polynomial phase functions which can reveal frequency-related spectral changes in power systems. A statistical analysis of the technique is performed. When applied to power system analysis, the PPM can provide knowledge of sudden shifts in frequency through both the new frequency estimate and the polynomial phase coefficient information. This knowledge can then be cross-referenced with other detection methods to provide improved detection benchmarks.
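To give a flavour of the energy-thresholding idea behind the EBD, here is a minimal sketch assuming a detrended ambient measurement sampled at fs Hz: windowed energies are compared against a threshold learned from an initial stretch of normal operation. The thesis derives its threshold from an explicit statistical model of the disturbance energy; the percentile rule below is only a crude stand-in.

```python
import numpy as np

def energy_detector(signal, fs, window_s=10.0, baseline_s=600.0,
                    percentile=99.0):
    """Flag windows whose disturbance energy exceeds a threshold learned
    from an initial, assumed-normal stretch of operation.

    signal: 1-D detrended ambient measurement; fs: sample rate in Hz.
    Returns (window energies, threshold, boolean alarm flags)."""
    win = int(window_s * fs)
    n_windows = len(signal) // win
    energies = np.array([
        np.sum(signal[i * win:(i + 1) * win] ** 2) for i in range(n_windows)])
    # Threshold from the first baseline_s seconds, assumed disturbance-free.
    n_baseline = int(baseline_s / window_s)
    threshold = np.percentile(energies[:n_baseline], percentile)
    alarms = energies > threshold
    return energies, threshold, alarms
```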

Relevance:

30.00%

Publisher:

Abstract:

Patterns of connectivity among local populations influence the dynamics of regional systems, but most ecological models have concentrated on explaining the effect of connectivity on local population structure using dynamic processes covering short spatial and temporal scales. In this study, a model was developed in an extended spatial system to examine the hypothesis that long term connectivity levels among local populations are influenced by the spatial distribution of resources and other habitat factors. The habitat heterogeneity model was applied to local wild rabbit populations in the semi-arid Mitchell region of southern central Queensland (the Eastern system). Species-specific population parameters appropriate for the rabbit in this region were used. The model predicted a wide range of long term connectivity levels among sites, ranging from the extreme isolation of some sites to relatively high interaction probabilities for others. The validity of the model assumptions was assessed by regressing model output against independent population genetic data; the model explained over 80% of the variation in the highly structured genetic data set. Furthermore, the model was robust, explaining a significant proportion of the variation in the genetic data over a wide range of parameters. The performance of the habitat heterogeneity model was further assessed by simulating the widely reported recent range expansion of the wild rabbit into the Mitchell region from the adjacent, panmictic Western rabbit population system. The model explained well the independently determined genetic characteristics of the Eastern system at different hierarchic levels, from site-specific differences (for example, fixation of a single allele in the population at one site) to differences between population systems (absence of an allele in the Eastern system which is present in all Western system sites). The model therefore explained the past and long term processes which have led to the formation and maintenance of the highly structured Eastern rabbit population system. Most animals exhibit sex-biased dispersal, which may influence long term connectivity levels among local populations and thus the dynamics of regional systems. When appropriate sex-specific dispersal characteristics were used, the habitat heterogeneity model predicted substantially different interaction patterns between female-only and combined male and female dispersal scenarios. In the latter case, model output was validated using data from a bi-parentally inherited genetic marker. Again, the model explained over 80% of the variation in the genetic data. The fact that such a large proportion of variability is explained in two genetic data sets provides very good evidence that habitat heterogeneity influences long term connectivity levels among local rabbit populations in the Mitchell region for both males and females. The habitat heterogeneity model thus provides a powerful approach for understanding the large scale processes that shape regional population systems in general. Therefore, the model has the potential to be useful as a tool to aid in the management of those systems, whether for pest management or conservation purposes.

Relevance:

30.00%

Publisher:

Abstract:

The hydrodynamic environment “created” by bioreactors for the culture of a tissue engineered construct (TEC) is known to influence cell migration, proliferation and extracellular matrix production. However, tissue engineers have treated bioreactors as black boxes within which TECs are cultured mainly by trial and error, as the complex relationship between the hydrodynamic environment and tissue properties remains elusive, yet is critical to the production of clinically useful tissues. It is well known in the chemical and biotechnology fields that a more detailed description of fluid mechanics and nutrient transport within process equipment can be achieved via the use of computational fluid dynamics (CFD) technology. Hence, the coupling of experimental methods and computational simulations forms a synergistic relationship that can potentially yield richer and more cohesive data sets for bioreactor studies. This review discusses the rationale for using CFD in bioreactor studies related to tissue engineering, as fluid flow processes and phenomena have direct implications for cellular responses such as migration and/or proliferation. We conclude that CFD should be seen by tissue engineers as an invaluable tool allowing them to analyze and visualize the impact of fluidic forces and stresses on cells and TECs.

Relevance:

30.00%

Publisher:

Abstract:

Light Transport Systems (LTS) (e.g. light pipes, fibre optics) can illuminate core areas within buildings, with great potential for energy savings. However, they do not provide a clear connection to the outside the way windows do, and their effects on people’s physiological and psychological health are not well understood. Furthermore, how people perceive LTS affects users’ acceptance of the device and its performance. The purpose of this research is to understand how occupants perceive and experience spaces illuminated by LTS. Two case studies of commercial buildings with LTS, located in Brisbane, Australia, are assessed by qualitative (focus group interviews) and quantitative (measurement of daylight illuminance and luminance) methods. The data from interviews with occupants provide useful insight into the aspects of LTS design that are most relevant to a positive perception of the luminous environment. Luminance measurements of the occupied spaces support the perceptions of the LTS reported by occupants: designs that create high-contrast luminous environments are more likely to be perceived negatively.

Relevance:

30.00%

Publisher:

Abstract:

With the advances in computer hardware and software development techniques over the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation.

When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system; it is more appropriate to regard those applications as primarily mechanised calculations rather than simulations.

However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system.

The software development techniques available nowadays allow the evolution of such simulation models. Advanced software design not only greatly enhances the applicability of the simulators; it also encourages maintainability and modularity for easy understanding and further development, and portability across hardware platforms. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is, in particular, given to models for train movement, power supply systems and traction drives. These models have been successfully used to resolve various ‘what-if’ issues effectively in a wide range of applications, such as speed profiles, energy consumption and run times.

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces an event-based traffic model for railway systems adopting fixed-block signalling schemes. In this model, the events of trains' arrival at and departure from signalling blocks constitute the states of the traffic flow. A state transition is equivalent to the progress of the trains by one signalling block and it is realised by referring to past and present states, as well as a number of pre-calculated look-up tables of run-times in the signalling block under various signalling conditions. Simulation results are compared with those from a time-based multi-train simulator to study the improvement of processing time and accuracy.
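As a toy illustration of this event-based transition (single train only; block layout, aspects and run times are made up, and the actual model derives the signalling condition from the positions of other trains), a state is advanced by one signalling block simply by adding a pre-calculated run time looked up by (block, signal aspect).

```python
# (block, signal_aspect) -> pre-calculated run time through that block, in seconds.
RUN_TIME = {
    (0, "green"): 90.0, (0, "yellow"): 120.0,
    (1, "green"): 80.0, (1, "yellow"): 110.0,
    (2, "green"): 95.0, (2, "yellow"): 130.0,
}

def run_train(start_time, aspects):
    """aspects[i] is the signal aspect seen on entering block i.
    Returns the (arrival_time, event description) sequence for one train."""
    events = []
    time = start_time
    for block, aspect in enumerate(aspects):
        events.append((time, f"arrive block {block} ({aspect})"))
        time += RUN_TIME[(block, aspect)]   # state transition = progress by one block
    events.append((time, "clear last block"))
    return events

for event in run_train(0.0, aspects=["green", "yellow", "green"]):
    print(event)
```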

Relevance:

30.00%

Publisher:

Abstract:

The osteochondral defect is a classical model for a multiple-tissue problem [1]. Tissue engineering of either bone or cartilage imposes different demands on a scaffold concerning porosity, pore size and interconnectivity. Furthermore, local release of tissue-specific growth factors necessitates a tailored architecture. For the fabrication of an osteochondral scaffold with region-specific architecture, an advanced technique is required. Stereolithography is a rapid prototyping technique that allows the creation of such 3D polymer objects with well-defined architecture. Its working principle is the partial irradiation of a resin, causing a liquid-solid transition. By irradiating this resin with a computer-driven light source, a solid 3D object is constructed layer by layer. To make biodegradable polymers applicable in stereolithography, low-molecular-weight polymers have to be functionalised with double bonds to enable photo-initiated crosslinking.

Relevance:

30.00%

Publisher:

Abstract:

Music is inherently active and interactive. Like technologies before them, digital systems provide a range of enhanced music performance opportunities. In this paper we outline the educational advantages of ensemble performance in which generative media systems are integrated. As a concrete example, we focus on our work with the jam2jam system, which uses generative music processes to enhance collaborative music making. We suggest that our research points toward a new class of activities that maintain the well-established benefits of ensemble performance while adding cultural and pedagogical value by leveraging the capabilities and cachet of digital media practices.