967 results for Real systems


Relevance: 30.00%

Abstract:

A distributed fuzzy system is a real-time fuzzy system in which the input, output and computation may be located on different networked computing nodes. The ability for a distributed software application, such as a distributed fuzzy system, to adapt to changes in the computing network at runtime can provide real-time performance improvement and fault-tolerance. This paper introduces an Adaptable Mobile Component Framework (AMCF) that provides a distributed dataflow-based platform with a fine-grained level of runtime reconfigurability. The execution location of small fragments (possibly as little as few machine-code instructions) of an AMCF application can be moved between different computing nodes at runtime. A case study is included that demonstrates the applicability of the AMCF to a distributed fuzzy system scenario involving multiple physical agents (such as autonomous robots). Using the AMCF, fuzzy systems can now be developed such that they can be distributed automatically across multiple computing nodes and are adaptable to runtime changes in the networked computing environment. This provides the opportunity to improve the performance of fuzzy systems deployed in scenarios where the computing environment is resource-constrained and volatile, such as multiple autonomous robots, smart environments and sensor networks.
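The kind of small computation fragment the AMCF relocates can be pictured with a toy fuzzy inference step: each membership function or rule below is small enough that, in an AMCF-style runtime, it could in principle execute on a different networked node. The rule base, membership shapes, and values here are invented for illustration and are not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(distance):
    """Two-rule controller: 'near -> slow (0.2)', 'far -> fast (1.0)',
    defuzzified as the weighted average of the output singletons."""
    mu_near = tri(distance, -5.0, 0.0, 5.0)
    mu_far = tri(distance, 0.0, 10.0, 20.0)
    den = mu_near + mu_far
    return (mu_near * 0.2 + mu_far * 1.0) / den if den else 0.0
```

In a distributed deployment, `tri` evaluations could run close to the sensors producing `distance`, while the defuzzification in `infer` runs wherever the actuator lives.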

Relevance: 30.00%

Abstract:

Process-aware information systems, ranging from generic workflow systems to dedicated enterprise information systems, use work-lists to offer so-called work items to users. In real scenarios, users can be confronted with a very large number of work items that stem from multiple cases of different processes. In this jungle of work items, users may find it hard to choose the right item to work on next. The system cannot autonomously decide which is the right work item, since the decision is also dependent on conditions that are somehow outside the system. For instance, what is “best” for an organisation should be mediated with what is “best” for its employees. Current work-list handlers show work items as a simple sorted list and therefore do not provide much decision support for choosing the right work item. Since the work-list handler is the dominant interface between the system and its users, it is worthwhile to provide an intuitive graphical interface that uses contextual information about work items and users to provide suggestions about prioritisation of work items. This paper uses the so-called map metaphor to visualise work items and resources (e.g., users) in a sophisticated manner. Moreover, based on distance notions, the work-list handler can suggest the next work item by considering different perspectives. For example, urgent work items of a type that suits the user may be highlighted. The underlying map and distance notions may be of a geographical nature (e.g., a map of a city or office building), but may also be based on process designs, organisational structures, social networks, due dates, calendars, etc. The framework proposed in this paper is generic and can be applied to any process-aware information system. 
Moreover, in order to show its practical feasibility, the paper discusses a full-fledged implementation developed in the context of the open-source workflow environment YAWL, together with two real examples stemming from two very different scenarios. The results of an initial usability evaluation of the implementation are also presented, which provide a first indication of the validity of the approach.
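The distance-based suggestion idea can be sketched in a few lines: each work item and each resource (user) has a position in some "map" (geographic, organisational, social, ...), and the work-list handler ranks items by a combined score of distance and urgency. The positions, urgency values, and weights below are invented for illustration; the paper's YAWL implementation is far richer.

```python
import math

def rank_work_items(user_pos, items, w_dist=1.0, w_urgency=2.0):
    """Return item ids sorted by ascending cost (closer and more urgent
    items first). items: list of (item_id, (x, y), urgency in [0, 1])."""
    def cost(item):
        _id, pos, urgency = item
        d = math.dist(user_pos, pos)          # distance in the chosen "map"
        return w_dist * d - w_urgency * urgency
    return [item[0] for item in sorted(items, key=cost)]
```

Swapping `math.dist` for a distance over a process design, organisational structure, or social network changes the metaphor without changing the ranking mechanism.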

Relevance: 30.00%

Abstract:

Unmanned Aircraft Systems (UAS) describe a diverse range of aircraft that are operated without a human pilot on-board. Unmanned aircraft range from small rotorcraft, which can fit in the palm of your hand, through to fixed wing aircraft comparable in size to a commercial passenger jet. The absence of a pilot on-board allows these aircraft to be developed with unique performance capabilities facilitating a wide range of applications in surveillance, environmental management, agriculture, defence, and search and rescue. However, regulations relating to the safe design and operation of UAS first need to be developed before the many potential benefits from these applications can be realised. According to the International Civil Aviation Organization (ICAO), a Risk Management Process (RMP) should support all civil aviation policy and rulemaking activities (ICAO 2009). The RMP is described in the international standard ISO 31000:2009 (ISO, 2009a). This standard is intentionally generic and high-level, providing limited guidance on how it can be effectively applied to complex socio-technical decision problems such as the development of regulations for UAS. Through the application of principles and tools drawn from systems philosophy and systems engineering, this thesis explores how the RMP can be effectively applied to support the development of safety regulations for UAS. A sound systems-theoretic foundation for the RMP is presented in this thesis. Using the case-study scenario of a UAS operation over an inhabited area and through the novel application of principles drawn from general systems modelling philosophy, a consolidated framework of the definitions of the concepts of safe, risk, and hazard is developed.
The framework is novel in that it facilitates the representation of broader subjective factors in an assessment of the safety of a system; describes the issues associated with the specification of a system-boundary; makes explicit the hierarchical nature of the relationship between the concepts and the subsequent constraints that exist between them; and can be evaluated using a range of analytic or deliberative modelling techniques. Following the general sequence of the RMP, the thesis explores the issues associated with the quantified specification of safety criteria for UAS. A novel risk analysis tool is presented. In contrast to existing risk tools, the analysis tool presented in this thesis quantifiably characterises both the societal and individual risk of UAS operations as a function of the flight path of the aircraft. A novel structuring of the risk evaluation and risk treatment decision processes is then proposed. The structuring is achieved through the application of the Decision Support Problem Technique, a modelling approach that has been previously used to effectively model complex engineering design processes and to support decision-making in relation to airspace design. The final contribution made by this thesis is in the development of an airworthiness regulatory framework for civil UAS. A novel "airworthiness certification matrix" is proposed as a basis for the definition of UAS "Part 21" regulations. The resulting airworthiness certification matrix provides a flexible, systematic and justifiable method for promulgating airworthiness regulations for UAS. In addition, an approach for deriving "Part 1309" regulations for UAS is presented. In contrast to existing approaches, the approach presented in this thesis facilitates a traceable and objective tailoring of system-level reliability requirements across the diverse range of UAS operations.
The significance of the research contained in this thesis is clearly demonstrated by its practical real world outcomes. Industry regulatory development groups and the Civil Aviation Safety Authority have endorsed the proposed airworthiness certification matrix. The risk models have also been used to support research undertaken by the Australian Department of Defence. Ultimately, it is hoped that the outcomes from this research will play a significant part in the shaping of regulations for civil UAS, here in Australia and around the world.
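The idea of characterising risk as a function of the flight path can be sketched, in a drastically simplified form that is not the thesis's actual model, as summing expected ground casualties over path segments; all factors here (constant failure rate, fixed lethal area, uniform density per segment, no sheltering) are simplifying assumptions introduced for illustration.

```python
def individual_risk(segments, failure_rate_per_hr, lethal_area_m2):
    """Expected ground fatalities over a flight path.
    segments: list of (duration_hr, population_density_per_m2) pairs;
    each segment contributes failure_rate * time * lethal_area * density."""
    return sum(failure_rate_per_hr * dt * lethal_area_m2 * rho
               for dt, rho in segments)
```

Routing the path through sparsely populated segments directly reduces the sum, which is the intuition behind making risk a function of the flight path.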

Relevance: 30.00%

Abstract:

The authors present a Cause-Effect fault diagnosis model, which utilises the Root Cause Analysis approach and takes into account the technical features of a digital substation. The Dempster/Shafer evidence theory is used to integrate different types of fault information in the diagnosis model so as to implement a hierarchical, systematic and comprehensive diagnosis based on the logic relationship between the parent and child nodes such as transformer/circuit-breaker/transmission-line, and between the root and child causes. A real fault scenario is investigated in the case study to demonstrate the developed approach in diagnosing malfunction of protective relays and/or circuit breakers, missed or false alarms, and other commonly encountered faults at a modern digital substation.
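The evidence-fusion step rests on Dempster's rule of combination, which the following sketch implements for two mass functions over a frame of discernment; the fault hypotheses in the usage below (transformer vs. circuit breaker) are invented for illustration, not drawn from the paper's case study.

```python
def combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments,
    each a dict mapping frozenset (hypothesis set) -> mass."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:                       # compatible evidence
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:                           # contradictory evidence
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    k = 1.0 - conflict                      # normalisation constant
    return {s: m / k for s, m in combined.items()}
```

With, say, one relay-alarm source assigning mass 0.6 to "transformer fault" and a breaker-status source splitting its mass differently, the combined assignment concentrates belief on the hypothesis both sources support.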

Relevance: 30.00%

Abstract:

The modern structural diagnosis process relies on vibration characteristics to assess the serviceability level of a structure. This paper examines the potential of the change-in-flexibility method for damage detection, along with two main practical constraints associated with it. The first constraint addressed in this paper is the reduction in the number of data acquisition points due to a limited number of sensors. The results show that the accuracy of the change-in-flexibility method is influenced by the number of data acquisition points/sensor locations in real structures. Secondly, the effect of higher modes on the damage detection process has been studied, addressing the difficulty of extracting higher-order modal data with the available sensors. Four damage indices are presented to assess their potential for damage detection with respect to different locations and severities of damage. A simply supported beam with two degrees of freedom at each node is considered, for single-damage cases only, throughout the paper.
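The change-in-flexibility method builds on the standard modal-flexibility approximation F ≈ Σᵢ φᵢφᵢᵀ/ωᵢ², truncated to the measured modes, and flags damage where the flexibility matrix changes most. A minimal sketch (mass-normalised mode shapes assumed; the specific damage index is one common choice, not necessarily the four indices the paper studies):

```python
import numpy as np

def flexibility(freqs_hz, modes):
    """Modal flexibility F = sum_i phi_i phi_i^T / omega_i^2.
    modes: (n_dof, n_modes) array of mass-normalised mode shapes."""
    omega2 = (2.0 * np.pi * np.asarray(freqs_hz)) ** 2
    return (modes / omega2) @ modes.T       # scale each mode column by 1/w^2

def damage_index(freqs_u, modes_u, freqs_d, modes_d):
    """Per-DOF index: max absolute column change of the flexibility matrix
    between the undamaged (u) and damaged (d) states."""
    dF = flexibility(freqs_d, modes_d) - flexibility(freqs_u, modes_u)
    return np.max(np.abs(dF), axis=0)
```

Because the 1/ωᵢ² weighting suppresses higher modes, the truncated sum converges quickly, which is why the method is attractive when only a few lower modes can be measured.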

Relevance: 30.00%

Abstract:

Facial expression is an important channel of human social communication. Facial expression recognition (FER) aims to perceive and understand emotional states of humans based on information in the face. Building robust and high performance FER systems that can work in real-world video is still a challenging task, due to the various unpredictable facial variations and complicated exterior environmental conditions, as well as the difficulty of choosing a suitable type of feature descriptor for extracting discriminative facial information. Facial variations caused by factors such as pose, age, gender, race and occlusion, can exert profound influence on the robustness, while a suitable feature descriptor largely determines the performance. Most present attention on FER has been paid to addressing variations in pose and illumination. No approach has been reported on handling face localization errors and relatively few on overcoming facial occlusions, although the significant impact of these two variations on the performance has been proven and highlighted in many previous studies. Many texture and geometric features have been previously proposed for FER. However, few comparison studies have been conducted to explore the performance differences between different features and examine the performance improvement arising from fusion of texture and geometry, especially on data with spontaneous emotions. The majority of existing approaches are evaluated on databases with posed or induced facial expressions collected in laboratory environments, whereas little attention has been paid to recognizing naturalistic facial expressions on real-world data. This thesis investigates techniques for building robust and high performance FER systems based on a number of established feature sets. It comprises contributions towards three main objectives: (1) Robustness to face localization errors and facial occlusions.
An approach is proposed to handle face localization errors and facial occlusions using Gabor based templates. Template extraction algorithms are designed to collect a pool of local template features and template matching is then performed to convert these templates into distances, which are robust to localization errors and occlusions. (2) Improvement of performance through feature comparison, selection and fusion. A comparative framework is presented to compare the performance between different features and different feature selection algorithms, and examine the performance improvement arising from fusion of texture and geometry. The framework is evaluated for both discrete and dimensional expression recognition on spontaneous data. (3) Evaluation of performance in the context of real-world applications. A system is selected and applied to discriminating posed versus spontaneous expressions and recognizing naturalistic facial expressions. A database is collected from real-world recordings and is used to explore feature differences between standard database images and real-world images, as well as between real-world images and real-world video frames. The performance evaluations are based on the JAFFE, CK, Feedtum, NVIE, Semaine and self-collected QUT databases. The results demonstrate high robustness of the proposed approach to the simulated localization errors and occlusions. Texture and geometry have different contributions to the performance of discrete and dimensional expression recognition, as well as posed versus spontaneous emotion discrimination. These investigations provide useful insights into enhancing robustness and achieving high performance of FER systems, and putting them into real-world applications.
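The template-to-distance conversion can be illustrated with a toy version of the idea: represent an image as a bank of local patches and, for each stored template, record the minimum distance to any patch. Because the minimum ranges over locations, the feature tolerates localization error, and an occluded patch simply stops being any template's nearest match. Shapes and features here are illustrative stand-ins for the thesis's Gabor templates.

```python
import numpy as np

def templates_to_distances(patch_bank, templates):
    """Convert an image, given as a list of local feature patches, into a
    distance feature vector: one minimum Euclidean distance per template."""
    feats = []
    for t in templates:
        feats.append(min(float(np.linalg.norm(p - t)) for p in patch_bank))
    return np.array(feats)
```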

Relevance: 30.00%

Abstract:

This chapter attends to the legal and political geographies of one of Earth's most important, valuable, and pressured spaces: the geostationary orbit. Since the first satellite, launched by NASA, entered it in 1964, this small, defined band of Outer Space, 35,786km from the Earth's surface, and only 30km wide, has become a highly charged legal and geopolitical environment, yet it remains a space which is curiously unheard of outside of specialist circles. For the thousands of satellites which now underpin the Earth's communication, media, and data industries and flows, the geostationary orbit is the prime position in Space. The geostationary orbit only has the physical capacity to hold approximately 1500 satellites; in 1997 there were approximately 1000. It is no overstatement to assert that media, communication, and data industries would not be what they are today if it was not for the geostationary orbit. This chapter provides a critical legal geography of the geostationary orbit, charting the topography of the debates and struggles to define and manage this highly-important space. Drawing on key legal documents such as the Outer Space Treaty and the Moon Treaty, the chapter addresses fundamental questions about the legal geography of the orbit, questions which are of growing importance as the orbit’s available satellite spaces diminish and the orbit comes under increasing pressure. Who owns the geostationary orbit? Who, and whose rules, govern what may or may not (literally) take place within it? Who decides which satellites can occupy the orbit? Is the geostationary orbit the sovereign property of the equatorial states it supertends, as these states argued in the 1970s? Or is it a part of the res communis, or common property of humanity, which currently legally characterises Outer Space?
As challenges to the existing legal spatiality of the orbit emerge from launch states, companies, and potential launch states, it is particularly critical that the current spatiality of the orbit is understood and considered. One of the busiest areas of Outer Space’s spatiality is international territorial law. Mentions of Space law tend to evoke incredulity and ‘little green men’ jokes, but as Space becomes busier and busier, international Space law is growing in complexity and importance. The chapter draws on two key fields of research: cultural geography, and critical legal geography. The chapter is framed by the cultural geographical concept of ‘spatiality’, a term which signals the multiple and dynamic nature of geographical space. As spatial theorists such as Henri Lefebvre assert, a space is never simply physical; rather, any space is always a jostling composite of material, imagined, and practiced geographies (Lefebvre 1991). The ways in which a culture perceives, represents, and legislates that space are as constitutive of its identity--its spatiality--as the physical topography of the ground itself. The second field in which this chapter is situated—critical legal geography—derives from cultural geography’s focus on the cultural construction of spatiality. In his Law, Space and the Geographies of Power (1994), Nicholas Blomley asserts that analyses of territorial law largely neglect the spatial dimension of their investigations; rather than seeing the law as a force that produces specific kinds of spaces, they tend to position space as a neutral, universally-legible entity which is neatly governed by the equally neutral 'external variable' of territorial law (28). 'In the hegemonic conception of the law,' Pue similarly argues, 'the entire world is transmuted into one vast isotropic surface' (1990: 568) on which law simply acts.
But as the emerging field of critical legal geography demonstrates, law is not a neutral organiser of space, but is instead a powerful cultural technology of spatial production. Or as Delaney states, legal debates are “episodes in the social production of space” (2001, p. 494). International territorial law, in other words, makes space, and does not simply govern it. Drawing on these tenets of the field of critical legal geography, as well as on Lefebvrian concept of multipartite spatiality, this chapter does two things. First, it extends the field of critical legal geography into Space, a domain with which the field has yet to substantially engage. Second, it demonstrates that the legal spatiality of the geostationary orbit is both complex and contested, and argues that it is crucial that we understand this dynamic legal space on which the Earth’s communications systems rely.

Relevance: 30.00%

Abstract:

Fire safety has become an important part in structural design due to the ever increasing loss of properties and lives during fires. Conventionally the fire rating of load bearing wall systems made of Light gauge Steel Frames (LSF) is determined using fire tests based on the standard time-temperature curve given in ISO 834 (ISO, 1999). The standard time-temperature curve given in ISO 834 (ISO, 1999) originated from the application of wood burning furnaces in the early 1900s. However, modern commercial and residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed fire research study into the performance of LSF walls was undertaken using the developed real fire curves based on Eurocode parametric curves (ECS, 2002) and Barnett’s BFD curves (Barnett, 2002) using both full scale fire tests and numerical studies. It included LSF walls without any insulation, and the recently developed externally insulated composite panel system. This paper presents the details of the numerical studies and the results. It also includes brief details of the development of real building fire curves and experimental studies.
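The Eurocode parametric curves referred to here have a closed form for the heating phase (EN 1991-1-2, Annex A): Θg = 20 + 1325(1 − 0.324e^(−0.2t*) − 0.204e^(−1.7t*) − 0.472e^(−19t*)), with t* = t·Γ in hours. A sketch, leaving the time-scaling factor Γ as a parameter rather than computing it from compartment opening factor and lining properties as the standard does:

```python
import math

def parametric_fire_temp(t_hours, gamma=1.0):
    """Gas temperature (deg C) in the heating phase of the EN 1991-1-2
    Annex A parametric fire curve; t* = t * gamma, t in hours."""
    ts = t_hours * gamma
    return 20.0 + 1325.0 * (1.0
                            - 0.324 * math.exp(-0.2 * ts)
                            - 0.204 * math.exp(-1.7 * ts)
                            - 0.472 * math.exp(-19.0 * ts))
```

Varying `gamma` is what lets the parametric family represent fires that grow faster or slower than the standard furnace exposure, which is the point of using "real" design fires for LSF wall testing.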

Relevance: 30.00%

Abstract:

Fire resistance has become an important part in structural design due to the ever increasing loss of properties and lives every year. Conventionally the fire rating of load bearing Light gauge Steel Frame (LSF) walls is determined using standard fire tests based on the time-temperature curve given in ISO 834 [1]. Full scale fire testing based on this standard time-temperature curve originated from the application of wood burning furnaces in the early 1900s and it is questionable whether it truly represents the fuel loads in modern buildings. Hence a detailed fire research study into the performance of LSF walls was undertaken using real design fires based on Eurocode parametric curves [2] and Barnett’s ‘BFD’ curves [3]. This paper presents the development of these real fire curves and the results of full scale experimental study into the structural and fire behaviour of load bearing LSF stud wall systems.

Relevance: 30.00%

Abstract:

In this paper, a real-time vision-based power line extraction solution is investigated for active UAV guidance. The line extraction algorithm starts from ridge points detected by steerable filters. A collinear line segment fitting algorithm follows, considering global and local information together with multiple collinear measurements. A GPU-boosted implementation of the algorithm is also investigated in the experiments. The experimental results show that the proposed algorithm outperforms two baseline line detection algorithms and is able to fit long collinear line segments. The low computational cost of the algorithm makes it suitable for real-time applications.
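A core subproblem in collinear segment fitting is deciding whether two detected segments lie on the same line; a common test combines an angular tolerance with the perpendicular distance to the supporting line. The following sketch uses invented thresholds and is only an illustration of that test, not the paper's algorithm.

```python
import math

def collinear(seg1, seg2, max_angle_deg=3.0, max_offset=2.0):
    """True if two segments ((x1, y1), (x2, y2)) plausibly lie on one line:
    directions agree within max_angle_deg (segments treated as undirected)
    and seg2's endpoints lie within max_offset of the infinite line through
    seg1. seg1 is assumed non-degenerate."""
    (x1, y1), (x2, y2) = seg1
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    (x3, y3), (x4, y4) = seg2
    ang1 = math.atan2(dy, dx)
    ang2 = math.atan2(y4 - y3, x4 - x3)
    # angular difference modulo pi (a line has no direction)
    d_ang = abs((ang1 - ang2 + math.pi / 2) % math.pi - math.pi / 2)
    if math.degrees(d_ang) > max_angle_deg:
        return False
    for px, py in ((x3, y3), (x4, y4)):
        # perpendicular distance from endpoint to the line through seg1
        if abs(dy * (px - x1) - dx * (py - y1)) / length > max_offset:
            return False
    return True
```

Chaining this test across gaps is what allows short ridge-point segments to be merged into the long power-line segments the paper targets.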

Relevance: 30.00%

Abstract:

This paper presents a combined structure for using real, complex, and binary valued vectors for semantic representation. The theory, implementation, and application of this structure are all significant. For the theory underlying quantum interaction, it is important to develop a core set of mathematical operators that describe systems of information, just as core mathematical operators in quantum mechanics are used to describe the behavior of physical systems. The system described in this paper enables us to compare more traditional quantum mechanical models (which use complex state vectors), alongside more generalized quantum models that use real and binary vectors. The implementation of such a system presents fundamental computational challenges. For large and sometimes sparse datasets, the demands on time and space are different for real, complex, and binary vectors. To accommodate these demands, the Semantic Vectors package has been carefully adapted and can now switch between different number types comparatively seamlessly. This paper describes the key abstract operations in our semantic vector models, and describes the implementations for real, complex, and binary vectors. We also discuss some of the key questions that arise in the field of quantum interaction and informatics, explaining how the wide availability of modelling options for different number fields will help to investigate some of these questions.
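One concrete way the demands differ across number types is in the overlap (similarity) operation itself; the sketch below shows one plausible scoring scheme per ground field and is an illustration of the idea, not the Semantic Vectors package's actual API.

```python
import numpy as np

def similarity(a, b):
    """Overlap score in [-1, 1] for three vector ground fields:
    real -> cosine similarity; complex -> real part of the normalised
    Hermitian inner product; binary (bool dtype) -> 1 - 2 * (Hamming
    distance / dimension)."""
    a, b = np.asarray(a), np.asarray(b)
    if a.dtype == bool:
        return 1.0 - 2.0 * np.count_nonzero(a ^ b) / a.size
    num = np.vdot(a, b)   # conjugates the first argument for complex dtypes
    score = num / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.real(score))
```

Binary vectors trade precision for compact storage and fast XOR-based comparison, while complex vectors support the phase structure that traditional quantum models rely on; a shared scoring interface is what lets the models be compared side by side.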

Relevance: 30.00%

Abstract:

Destruction of cancer cells by genetically modified viral and nonviral vectors has been the aim of many research programs. The ability to target cytotoxic gene therapies to the cells of interest is an essential prerequisite, and the treatment has always had the potential to provide better and more long-lasting therapy than existing chemotherapies. However, the potency of these infectious agents requires effective testing systems, in which hypotheses can be explored both in vitro and in vivo before the establishment of clinical trials in humans. The real prospect of off-target effects should be eliminated in the preclinical stage, if current prejudices against such therapies are to be overcome. In this review we have set out, using adenoviral vectors as a commonly used example, to discuss some of the key parameters required to develop more effective testing, and to critically assess the current cellular models for the development and testing of prostate cancer biotherapy. Only by developing models that more closely mirror human tissues will we be able to translate literature publications into clinical trials and hence into acceptable alternative treatments for the most commonly diagnosed cancer in humans.

Relevance: 30.00%

Abstract:

The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of the CE method to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision avoidance task. The only sensor used to accomplish this task was a forward-facing camera. The CE method is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was realized using the ROS-Gazebo simulation system. To evaluate the optimization, a large number of tests was carried out with a real quadcopter.
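The CE optimization loop itself is short: sample candidate parameter vectors (here, the controller's input scaling factors) from a Gaussian, keep the elite fraction with the lowest cost, refit the Gaussian to the elite, and repeat. In the paper's setting the cost of a candidate would come from a ROS-Gazebo simulation run; the sketch below is generic, with all hyperparameters invented for illustration.

```python
import random
import statistics

def cross_entropy_minimise(cost, dim, n_samples=50, n_elite=10,
                           iters=40, mu0=0.0, sigma0=2.0, seed=1):
    """Cross-Entropy minimisation of cost(x) over real vectors x of
    length dim. Returns the final mean of the sampling distribution."""
    rng = random.Random(seed)
    mu = [mu0] * dim
    sigma = [sigma0] * dim
    for _ in range(iters):
        samples = [[rng.gauss(mu[d], sigma[d]) for d in range(dim)]
                   for _ in range(n_samples)]
        samples.sort(key=cost)              # best (lowest cost) first
        elite = samples[:n_elite]
        for d in range(dim):                # refit Gaussian to the elite
            col = [s[d] for s in elite]
            mu[d] = statistics.fmean(col)
            sigma[d] = statistics.pstdev(col) + 1e-9  # avoid collapse
    return mu
```

On a smooth stand-in cost such as a quadratic bowl, the refitted Gaussian contracts onto the minimiser within a few dozen iterations; with a simulator in the loop, each `cost` call is one flight trial.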

Relevance: 30.00%

Abstract:

This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. Firstly, the proposed method takes a sparse grid of sample pixels from the image to reduce whole image scan time. A fusion of foreground segmentation and skin colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 × 480 pixel test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, where the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
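The sparse-grid preselection step can be sketched as follows: sample every few pixels, apply a cheap per-pixel skin test (here a simple RGB rule after Peer et al., standing in for the paper's fused foreground-plus-skin segmentation), and keep only the enclosing cells as candidate regions for the Viola-Jones pass. Grid step, cell size, and the image representation are all illustrative assumptions.

```python
def is_skin(r, g, b):
    """Simple RGB skin heuristic (after Peer et al.); real systems often
    use YCbCr thresholds or trained colour models instead."""
    return (r > 95 and g > 40 and b > 20 and r > b and
            (r - g) > 15 and (max(r, g, b) - min(r, g, b)) > 15)

def candidate_cells(image, step=8, cell=32):
    """Scan a sparse grid (every step-th pixel) and return top-left corners
    of cell-sized regions whose sample looked like skin; only these regions
    would be handed to the full face classifier.
    image: nested list, image[y][x] -> (r, g, b)."""
    h, w = len(image), len(image[0])
    hits = set()
    for y in range(0, h, step):
        for x in range(0, w, step):
            if is_skin(*image[y][x]):
                hits.add((x - x % cell, y - y % cell))  # snap to cell grid
    return sorted(hits)
```

The speed-up reported in the paper comes from exactly this asymmetry: the per-sample test is a handful of comparisons, while the cascade classifier it gates is orders of magnitude more expensive per window.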