410 results for COMPLEX NETWORKS


Relevance:

20.00%

Publisher:

Abstract:

Discrete event-driven simulations of digital communication networks have been widely used. However, it is difficult to use a network simulator to simulate a hybrid system in which some objects are not discrete event-driven but continuous time-driven. A networked control system (NCS) is such an application, in which the physical process dynamics are continuous by nature. We have designed and implemented a hybrid simulation environment which effectively integrates models of continuous-time plant processes and discrete-event communication networks by extending the open-source network simulator NS-2. To do this, a synchronisation mechanism was developed to connect a continuous plant simulation with a discrete network simulation. Furthermore, for evaluating co-design approaches in an NCS environment, a piggybacking method was adopted to allow the control period to be adjusted during simulations. The effectiveness of the technique is demonstrated through case studies which simulate a networked control scenario in which the communication and control system properties are defined explicitly.
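
As an illustration of the synchronisation idea described above, the sketch below interleaves a continuous-time plant integration with a queue of discrete network events; the plant model, time step, control period and network delay are illustrative assumptions, and the sketch does not use NS-2 itself.

```python
def simulate(plant_step, initial_state, network_events, t_end, dt=0.001):
    """Advance a continuous plant between discrete network events.

    plant_step(state, u, dt) -> new state (one Euler step of the plant ODE)
    network_events: list of (timestamp, control_value) pairs, e.g. controller
                    outputs arriving after a simulated network delay.
    """
    events = sorted(network_events)  # discrete-event queue ordered by time
    t, state, u = 0.0, initial_state, 0.0
    trace = []
    for event_time, new_u in events + [(t_end, None)]:
        # Continuous part: integrate the plant up to the next event time.
        while t < min(event_time, t_end):
            state = plant_step(state, u, dt)
            t += dt
            trace.append((t, state))
        # Discrete part: apply the control update carried by the network packet.
        if new_u is not None:
            u = new_u
    return trace


def plant(x, u, dt):
    """First-order plant dx/dt = -x + u, advanced by one Euler step."""
    return x + dt * (-x + u)


if __name__ == "__main__":
    # Control updates every 20 ms plus a fixed 5 ms network delay
    # (both values are illustrative assumptions).
    events = [(0.020 * k + 0.005, 1.0) for k in range(10)]
    trace = simulate(plant, initial_state=0.0, network_events=events, t_end=0.25)
    print(f"final plant state: {trace[-1][1]:.3f}")
```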

Relevance:

20.00%

Publisher:

Abstract:

In Australia, the Queensland fruit fly (B. tryoni) is the most destructive insect pest of horticulture, attacking nearly all fruit and vegetable crops. This project has researched and prototyped a system for monitoring fruit flies so that authorities can be alerted when a fly enters a crop more efficiently than with current methods. This paper presents the design of our sensor platform as well as a fruit fly detection and recognition algorithm based on machine vision techniques. Our experiments showed that the designed trap and sensor platform is capable of capturing quality fly images, that invasive flies can be successfully detected, and that the average precision of Queensland fruit fly recognition is 80%.
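
As an illustration of a machine-vision detection stage of the kind described above, the sketch below uses simple background subtraction and a blob-size filter with OpenCV; the image file names and thresholds are hypothetical, and this is not the algorithm reported in the paper (the species-recognition stage is omitted).

```python
import cv2


def detect_flies(frame, background, min_area=50, max_area=2000):
    """Subtract a background image of the empty trap, threshold the difference,
    and keep blobs whose area is plausible for a fruit fly (illustrative values)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    diff = cv2.GaussianBlur(cv2.absdiff(gray, bg), (5, 5), 0)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x signature: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if min_area <= cv2.contourArea(c) <= max_area:
            boxes.append(cv2.boundingRect(c))  # (x, y, w, h) of a candidate fly
    return boxes


if __name__ == "__main__":
    frame = cv2.imread("trap_frame.jpg")       # hypothetical trap image
    background = cv2.imread("trap_empty.jpg")  # hypothetical empty-trap image
    if frame is not None and background is not None:
        print(f"{len(detect_flies(frame, background))} candidate flies detected")
```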

Relevance:

20.00%

Publisher:

Abstract:

From a network perspective, a project team's social capital consists of a conduit network and a resource exchange network. Prior research has intensively studied the effect of the structure of the conduit network on team performance, assuming that knowledge transfer is the causal mechanism linking the conduit network to performance. This paper attempts to explore the interrelations between conduit networks and the knowledge network, to distinguish the different influences of the various conduit networks, and hypothesises that a project team's knowledge network mediates the effect of the various conduit networks on the team's performance. This research can enrich our knowledge of the disparate influences of the various conduit networks on knowledge transfer, and suggests management practices to enhance an organisation's social capital and hence improve its performance.

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces friendwork as a new term in social network studies. A friendwork is a network of friends: a specific case of an interpersonal social network. Naming this seemingly well-known and familiar group of people a friendwork facilitates its differentiation from the overall social network, while highlighting this subgroup's specific attributes and dynamics. The focus on one segment within social networks stimulates a wider discussion of the different subgroups within social networks. Other subgroups also discussed in this paper are family-dependent, work-related, location-based and virtual-acquaintance networks. This discussion informs a larger study of social media, specifically addressing the interactive communication modes in use within friendworks: direct (face-to-face) and mediated (mainly fixed telephone, internet and mobile phone). It explores the role of social media within friendworks while providing a communication perspective on social networks.

Relevance:

20.00%

Publisher:

Abstract:

Research on project learning has recognised the significance of knowledge transfer in project-based organisations (PBOs). Effective knowledge transfer across projects avoids reinvention, enhances knowledge creation and saves time, which is crucial in a project environment. In order to facilitate knowledge transfer, many PBOs have invested substantial financial and human resources in implementing IT-based knowledge repositories. However, some empirical studies have found that employees would rather turn to colleagues for knowledge despite ready access to such repositories. It is therefore apparent that social networks play a pivotal role in knowledge transfer across projects. Some scholars have attempted to explore the effect of network structure on knowledge transfer and performance; however, they focused only on egocentric networks and groups' internal social networks. It has been found that a project's external social network is also critical, in that team members cannot handle critical situations and accomplish projects on time without assistance and knowledge from external sources. To date, the influence of the structure of a project team's internal and external social networks on project performance, and the interrelation between the two networks, are barely known. In order to obtain such knowledge, this paper explores the interrelation between the structure of a project team's internal and external social networks and their effect on the project team's performance. Data is gathered through a survey questionnaire distributed online to respondents, and the collected data is analysed using social network analysis (SNA) tools and SPSS. The theoretical contribution of this paper is knowledge of the interrelation between the structure of a project team's internal and external social networks and their influence on the team's performance. The practical contribution lies in the proposed guidelines for constructing the structure of a project team's internal and external social networks.
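
As an illustration of the kind of structural measures such an SNA analysis might compute, the sketch below builds hypothetical internal and external team networks with NetworkX and reports simple metrics; the edge lists and metrics are illustrative assumptions, not the study's actual data or measures.

```python
import networkx as nx

# Hypothetical communication ties; in practice these would come from the survey data.
internal_edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
external_edges = [("C", "vendor1"), ("D", "client1"), ("A", "consultant1")]
team = {"A", "B", "C", "D"}

internal = nx.Graph(internal_edges)
external = nx.Graph(external_edges)

# Structural properties of the team's internal network.
print("internal density:", nx.density(internal))
print("internal degree centrality:", nx.degree_centrality(internal))

# A simple proxy for external reach: distinct outside contacts per team member
# (akin to counting boundary-spanning ties).
external_reach = {m: (external.degree(m) if m in external else 0) for m in team}
print("external contacts per member:", external_reach)
```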

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a novel algorithm for the gateway placement problem in Backbone Wireless Mesh Networks (BWMNs). Unlike existing algorithms, the new algorithm incrementally identifies gateways and assigns mesh routers to the identified gateways. The new algorithm is guaranteed to find a feasible gateway placement satisfying Quality-of-Service (QoS) constraints, including a delay constraint, a relay load constraint and a gateway capacity constraint. Experimental results show that its performance is as good as that of the best existing algorithms for the gateway placement problem. Moreover, the new algorithm can be used for BWMNs that do not form a single connected component, and it is easy to implement and use.
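
As an illustration of incremental gateway placement under simplified QoS constraints, the sketch below greedily promotes routers to gateways and assigns nearby routers to them; it is a generic greedy variant with hypothetical hop-count and capacity limits, not the specific algorithm proposed in the paper (the relay load constraint is omitted).

```python
import networkx as nx


def greedy_gateway_placement(graph, max_hops=3, capacity=8):
    """Repeatedly promote the router that can serve the most unassigned routers
    within `max_hops`, then assign up to `capacity` of them to it."""
    unassigned = set(graph.nodes)
    placement = {}  # gateway -> list of assigned routers
    while unassigned:
        best_gateway, best_reach = None, []
        for candidate in unassigned:
            dist = nx.single_source_shortest_path_length(graph, candidate, cutoff=max_hops)
            reach = [n for n in dist if n in unassigned]
            if len(reach) > len(best_reach):
                best_gateway, best_reach = candidate, reach
        # Assign the closest routers first, respecting the capacity constraint.
        dist = nx.single_source_shortest_path_length(graph, best_gateway, cutoff=max_hops)
        assigned = sorted(best_reach, key=dist.get)[:capacity]
        placement[best_gateway] = assigned
        unassigned -= set(assigned)
    return placement


if __name__ == "__main__":
    # Works even if the mesh does not form a single connected component.
    mesh = nx.random_geometric_graph(40, 0.25, seed=1)
    gateways = greedy_gateway_placement(mesh)
    print(f"{len(gateways)} gateways selected")
```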

Relevance:

20.00%

Publisher:

Abstract:

This chapter elucidates key ideas behind neurocomputational and ecological dynamics perspectives on understanding the organisation of action in complex neurobiological systems. The need to study the close link between neurobiological systems and their environments (particularly their sensory and movement subsystems and the surrounding energy sources) is advocated. It is proposed that degeneracy in complex neurobiological systems provides the basis for functional variability in the organisation of action. In such systems, processes of cognition and action facilitate the specific interactions of each performer with particular task and environmental constraints.

Relevance:

20.00%

Publisher:

Abstract:

Traditionally, the acquisition of skills and sport movements has been characterised by numerous repetitions of a presumed model movement pattern to be acquired by learners. This approach has been questioned by research identifying the presence of individualised movement patterns and the low probability of two identical movements occurring within and between individuals. In contrast, the differential learning approach claims advantages for inducing variability in the learning process by adding stochastic perturbations during practice. These ideas are exemplified by data from a high jump experiment which compared the effectiveness of classical and differential training approaches in a pre-post test design. Results showed clear advantages for the group trained with additional stochastic perturbations during the acquisition phase in comparison to classically trained athletes. Analogies to similar phenomenological effects in the neurobiological literature are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS and Compass, with multiple frequencies have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating networks, single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services:

• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK)

Based on these four challenges, there are two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing facilities, in particular for large data-processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, consisting of three layers: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, while some aspects of the system's performance remain to be improved in future work.
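
As an illustration of the real-time data retrieval step mentioned above, the following is a minimal sketch of an NTRIP (v1-style) client request for RTCM data; the caster host, port, mountpoint and credentials are hypothetical placeholders, and this is not the Ntrip software referred to in the abstract.

```python
import base64
import socket

# Hypothetical caster details; replace with a real NTRIP caster, mountpoint
# and credentials. These values are illustrative only.
CASTER_HOST = "ntrip.example.org"
CASTER_PORT = 2101
MOUNTPOINT = "CORS001"
USERNAME = "user"
PASSWORD = "password"


def fetch_rtcm(num_bytes=4096):
    """Open an NTRIP connection and return a chunk of raw RTCM data."""
    credentials = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
    request = (
        f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
        f"User-Agent: NTRIP simple-client/0.1\r\n"
        f"Authorization: Basic {credentials}\r\n"
        "\r\n"
    )
    with socket.create_connection((CASTER_HOST, CASTER_PORT), timeout=10) as sock:
        sock.sendall(request.encode())
        response = sock.recv(num_bytes)
        # An NTRIP v1 caster typically answers "ICY 200 OK" before streaming
        # binary RTCM messages; anything else is treated as an error here.
        if not response.startswith(b"ICY 200 OK"):
            raise RuntimeError(f"Caster refused request: {response[:64]!r}")
        return sock.recv(num_bytes)  # raw RTCM bytes for downstream RTK processing


if __name__ == "__main__":
    data = fetch_rtcm()
    print(f"received {len(data)} bytes of RTCM data")
```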

Relevance:

20.00%

Publisher:

Abstract:

In the study of complex neurobiological movement systems, measurement indeterminacy has typically been overcome by imposing artificial modelling constraints to reduce the number of unknowns (e.g., reducing all muscle, bone and ligament forces crossing a joint to a single vector). However, this approach prevents human movement scientists from investigating more fully the role, functionality and ubiquity of coordinative structures or functional motor synergies. Advancements in measurement methods and analysis techniques are required if the contribution of individual component parts or degrees of freedom of these task-specific structural units is to be established, thereby effectively solving the indeterminacy problem by reducing the number of unknowns. A further benefit of establishing more of the unknowns is that human movement scientists will be able to gain greater insight into ubiquitous processes of physical self-organising that underpin the formation of coordinative structures and the confluence of organismic, environmental and task constraints that determine the exact morphology of these special-purpose devices.

Relevance:

20.00%

Publisher:

Abstract:

Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester, in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications.

In the prior history of teaching introductory computer programming at QUT, programming has been taught as a standalone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In effect, the building blocks of computer applications have been compartmentalised and taught in isolation from each other. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), as clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996). The teaching of this introductory material has been done in much the same way over the past thirty years. During the period in which introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester: some basics can be learnt, but it can take many years to master (Norvig, 2001).

Faculty data has typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while there is another group who struggle to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first year students. This attrition level does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.

Relevance:

20.00%

Publisher:

Abstract:

When complex projects go wrong, they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The success of complex projects can be assessed by a range of stakeholders over different time scales, against different levels of project results: the project's outputs at the end of the project; the project's outcomes in the months following project completion; and the project's impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing whether a project is diverging from plan so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology for evaluating the early parts of this model is described, along with its implications and limitations. This paper describes work in progress.

Relevance:

20.00%

Publisher:

Abstract:

This review explores the question of whether chemometric methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved by the application of chemometrics for simultaneous quantitative prediction of analytes or qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multivariate curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometric methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, followed by an overview of the use of chemometrics for the resolution of complicated profiles for qualitative identification of analytes, especially with the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit; this comparison showed that electroanalytical methods can perform as well as the spectrophotometric ones. PLS-1 appears to be the method of practical choice if a relative prediction error of approximately ±10% is acceptable.
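
As a minimal illustration of PLS-1 calibration of the kind discussed in the review, the sketch below fits a partial least squares model to simulated single-analyte voltammetric data with scikit-learn and reports a relative prediction error; the data generation and component count are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated "voltammograms": 60 samples x 200 potentials, where the signal is a
# broad peak whose height scales with the (single) analyte concentration.
n_samples, n_points = 60, 200
concentration = rng.uniform(0.1, 1.0, n_samples)
potentials = np.linspace(-1.0, 1.0, n_points)
peak = np.exp(-((potentials - 0.2) ** 2) / 0.02)
X = concentration[:, None] * peak[None, :] + rng.normal(0, 0.02, (n_samples, n_points))
y = concentration

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

# Relative prediction error (%), the figure of merit mentioned in the review.
rpe = 100 * np.sqrt(np.sum((y_pred - y_test) ** 2) / np.sum(y_test ** 2))
print(f"relative prediction error: {rpe:.1f}%")
```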

Relevance:

20.00%

Publisher:

Abstract:

Molecular and metal profile fingerprints were obtained from a complex substance, Atractylis chinensis DC, a traditional Chinese medicine (TCM), using high performance liquid chromatography (HPLC) and inductively coupled plasma atomic emission spectroscopy (ICP-AES). This substance was used in this work as an example of a complex biological material that has found application as a TCM. Such TCM samples are traditionally processed by the Bran, Cut, Fried and Swill methods, and were collected from five provinces in China. The data matrices obtained from the two types of analysis produced two principal component biplots, which showed that the HPLC fingerprint data discriminated the samples on the basis of the methods used to process the raw TCM, while the metal analysis grouped them according to geographical origin. When the two data matrices were combined into a single two-way matrix, the resulting biplot showed a clear separation on the basis of the HPLC fingerprints. Importantly, within each grouping the objects separated according to their geographical origin, and they ranked in approximately the same order in each group. This result suggests that such an approach makes it possible to derive an improved characterisation of complex TCM materials on the basis of the two kinds of analytical data. In addition, two supervised pattern recognition methods, the K-nearest neighbours (KNN) method and linear discriminant analysis (LDA), were successfully applied to the individual data matrices, thus supporting the PCA approach.
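
As a minimal illustration of the combined-matrix PCA step described above, the sketch below autoscales hypothetical HPLC-fingerprint and metal-concentration blocks, joins them into a single two-way matrix and computes the scores for a biplot; the data and preprocessing are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical data: 25 TCM samples, 40 HPLC peak areas and 10 metal concentrations.
hplc = rng.lognormal(mean=0.0, sigma=0.5, size=(25, 40))
metals = rng.lognormal(mean=1.0, sigma=0.3, size=(25, 10))

# Autoscale each block, then join them into a single two-way matrix.
combined = np.hstack([StandardScaler().fit_transform(hplc),
                      StandardScaler().fit_transform(metals)])

pca = PCA(n_components=2)
scores = pca.fit_transform(combined)  # sample scores for the PC1-PC2 biplot
print("explained variance ratio:", pca.explained_variance_ratio_)
```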

Relevance:

20.00%

Publisher:

Abstract:

The nature and organisation of the creative industries and the creative economy have received increased attention in recent academic and policy literatures (Florida 2002; Grabher 2002; Scott 2006a). Constituted as one variant on new economy narratives, creativity, alongside knowledge, has been presented as a key competitive asset. Such industries, ranging from advertising to film and new media, are seen not merely as expanding their scale and scope, but as leading-edge proponents of a more general trend towards new forms of organisation and economic coordination (Davis and Scase 2000). The idea of network forms (and the consequent displacement of markets and hierarchies) has been at the heart of attempts to differentiate the field economically and spatially. Across both the discussion of production models and that of work/employment relations is the assertion of the enhanced importance of trust and non-market relations in coordinating structures and practices. This reflects an influential view in the sociological, management, geography and other literatures that social life is 'intrinsically networked' (Sunley 2008: 12) and that we can confidently use the term 'network society' to describe contemporary structures and practices (Castells 1996). Our paper is sceptical of the conceptual and empirical foundations of such arguments. We draw on a number of theoretical resources, including institutional theory, global value chain analysis and labour process theory (see Smith and McKinlay 2009), to explore how a more realistic and grounded analysis of the nature of, and limits to, networks can be articulated. Given space constraints, we cannot address all the dimensions of network arguments or evidence. Our focus is on inter- and intra-firm relations and draws on research into a particular creative industry, visual effects, which is a relatively new though increasingly important global production network. Through this examination a different model of the creative industries and creative work emerges: one in which market rules and patterns of hierarchical interaction structure the behaviour of economic actors and remain a central focus of analysis. The next section outlines and unpacks in more detail arguments concerning the role and significance of networks, markets and hierarchies in production models and work organisation in creative industries and the 'creative economy'.