431 results for virtual topology, decomposition, hex meshing algorithms
Abstract:
Railway service is now a major means of transportation in most countries around the world. With increasing population and expanding commercial and industrial activities, high-quality railway service is highly desirable. We present an application of genetic algorithms (GA) to search for the appropriate coasting point(s) and investigate the possible improvement in the fitness of genes. Single and multiple coasting point control with a simple GA are developed to attain the solutions, and the corresponding train movement is examined. Multiple coasting point control with a hierarchical genetic algorithm (HGA) is then proposed to integrate the determination of the number of coasting points.
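As a rough illustration of the search described in this abstract, the sketch below runs a simple GA over a single coasting point. The fitness function is a hypothetical stand-in (in the paper, fitness would come from simulating train movement and energy use), and all parameter values are illustrative, not the authors'.

```python
import random

def toy_fitness(coasting_point):
    # Hypothetical stand-in for a train-movement simulation:
    # assume peak benefit when coasting starts 60% into the journey.
    return -(coasting_point - 0.6) ** 2

def simple_ga(fitness, pop_size=20, generations=50, mutation_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]  # coasting point in [0, 1]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                     # arithmetic crossover
            if rng.random() < mutation_rate:        # Gaussian mutation, clamped
                child = min(1.0, max(0.0, child + rng.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = simple_ga(toy_fitness)
```

Extending the chromosome from one scalar to a variable-length list of coasting points is what the hierarchical GA in the abstract addresses.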
Abstract:
This paper explores the idea of virtual participation through the historical example of the republic of letters in early modern Europe (circa 1500-1800). By reflecting on the construction of virtuality in a historical context, and more specifically in a pre-digital environment, this paper calls attention to accusations of technological determinism in ongoing research concerning the affordances of the Internet and related media of communication. It argues that ‘the virtual’ is not synonymous with ‘the digital’ and suggests that, in order to articulate what is novel about modern technologies, we must first understand the social interactions underpinning the relationships which are facilitated through those technologies. By analysing the construction of virtuality in a pre-digital environment, this paper thus offers a baseline from which scholars might consider what is different about the modes of interaction and communication being engaged in via modern media.
Abstract:
Streaming SIMD Extensions (SSE) is a unique feature embedded in the Pentium III and P4 classes of microprocessors. By fully exploiting SSE, parallel algorithms can be implemented on a standard personal computer and a theoretical speedup of four can be achieved. In this paper, we demonstrate the implementation of a parallel LU matrix decomposition algorithm for solving power systems network equations with SSE and discuss advantages and disadvantages of this approach.
Abstract:
Streaming SIMD Extensions (SSE) is a unique feature embedded in the Pentium III class of microprocessors. By fully exploiting SSE, parallel algorithms can be implemented on a standard personal computer and a theoretical speedup of four can be achieved. In this paper, we demonstrate the implementation of a parallel LU matrix decomposition algorithm for solving power systems network equations with SSE and discuss advantages and disadvantages of this approach.
Abstract:
In this paper, a rate-based flow control scheme based upon per-VC virtual queuing is proposed for the Available Bit Rate (ABR) service in ATM. In this scheme, each VC in a shared buffer is assigned a virtual queue, which is a counter. To achieve a specific kind of fairness, an appropriate scheduler is applied to the virtual queues. Each VC's bottleneck rate (fair share) is derived from its virtual cell departure rate. This approach of deriving a VC's fair share is simple and accurate. By controlling each VC with respect to its virtual queue and queue build-up in the shared buffer, network congestion is avoided. The principle of the control scheme is first illustrated by max–min flow control, which is realised by scheduling the virtual queues in round-robin. Further application of the control scheme is demonstrated with the achievement of weighted fairness through weighted round robin scheduling. Simulation results show that with a simple computation, the proposed scheme achieves the desired fairness exactly and controls network congestion effectively.
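The max-min fairness target that round-robin service of the virtual queues converges to can be computed directly by water-filling. The sketch below shows only that fairness target, not the per-VC counter mechanism or congestion control of the paper; the demand/capacity values are illustrative.

```python
def max_min_shares(demands, capacity):
    """Max-min fair allocation: repeatedly split the remaining capacity
    equally among unsatisfied VCs, capping each VC at its demand."""
    shares = {vc: 0.0 for vc in demands}
    unsatisfied = set(demands)
    remaining = capacity
    while unsatisfied and remaining > 1e-12:
        fair = remaining / len(unsatisfied)
        for vc in list(unsatisfied):
            give = min(fair, demands[vc] - shares[vc])
            shares[vc] += give
            remaining -= give
            if shares[vc] >= demands[vc] - 1e-12:
                unsatisfied.discard(vc)   # VC fully satisfied, drops out
    return shares

# Three VCs sharing a link of capacity 9: the low-demand VC gets its
# full demand, the two heavy VCs split the rest equally.
shares = max_min_shares({"a": 1, "b": 4, "c": 10}, 9)
```

Weighted fairness, as in the abstract's weighted round-robin variant, would scale each VC's `fair` increment by its weight.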
Abstract:
Professor Christian Langton is a medical physicist at Queensland University of Technology in Brisbane. He has developed a way of preparing children who are about to have either radiotherapy or MRI imaging procedures and is seeking research partners to develop and test these further. This is a great opportunity for nurses interested in research, and who have access to a children’s hospital, to work with Professor Langton on some truly innovative, multidisciplinary research.
Abstract:
Streaming SIMD Extensions (SSE) is a unique feature embedded in the Pentium III and IV classes of microprocessors. By fully exploiting SSE, parallel algorithms can be implemented on a standard personal computer and a theoretical speedup of four can be achieved. In this paper, we demonstrate the implementation of a parallel LU matrix decomposition algorithm for solving linear systems with SSE and discuss advantages and disadvantages of this approach based on our experimental study.
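The LU factorisation these abstracts parallelise can be sketched in a few lines. SSE itself is exercised through C intrinsics or assembly, so the Python below shows only the Doolittle algorithm and marks the inner row update that SIMD lanes would process, four single-precision floats at a time; it is a sketch of the algorithm, not the authors' implementation.

```python
def lu_decompose(a):
    """In-place Doolittle LU factorisation (no pivoting) of a dense
    square matrix given as a list of lists.  On return, U occupies the
    upper triangle and the multipliers of unit-diagonal L the lower."""
    n = len(a)
    for k in range(n):
        for i in range(k + 1, n):
            a[i][k] /= a[k][k]             # multiplier l_ik
            m = a[i][k]
            # This axpy-style row update is the data-parallel work that
            # SSE can vectorise four floats at a time.
            for j in range(k + 1, n):
                a[i][j] -= m * a[k][j]
    return a

m = lu_decompose([[4.0, 3.0], [6.0, 3.0]])
```

Forward and back substitution against the L and U factors then solve the network equations for each right-hand side.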
Abstract:
This workshop focuses upon research about the qualities of community in music and of music in community facilitated by technologically supported relationships. Generative media systems present an opportunity for users to leverage computational systems to form new relationships through interactive and collaborative experiences. Generative music and art are relatively new phenomena that use procedural invention as a creative technique to produce music and visual media. Early systems have demonstrated the potential to provide access to collaborative ensemble experiences for users with little formal musical or artistic expertise. This workshop examines the relational affordances of these systems evidenced by selected field data drawn from the Network Jamming Project. These generative performance systems enable access to unique ensembles with very little musical knowledge or skill and offer the possibility of interactive relationships with artists and musical knowledge through collaborative performance. In this workshop we will focus on data that highlights how these simulated experiences might lead to understandings that may be of social benefit. Conference participants will be invited to jam in real time using virtual interfaces and to evaluate purposively selected video artifacts that demonstrate different kinds of interactive relationship with artists, peers, and community and that enrich the sense of expressive self. Theoretical insights about meaningful engagement drawn from the longitudinal and cross-cultural experiences will underpin the discussion and practical presentation.
Abstract:
Music making affects relationships with self and others by generating a sense of belonging to a culture or ideology (Bamford, 2006; Barovick, 2001; Dillon & Stewart, 2006; Fiske, 2000; Hallam, 2001). Whilst studies from arts education research present compelling examples of these relationships, others argue that they do not present sufficiently validated evidence of a causal link between music making experiences and cognitive or social change (Winner & Cooper, 2000; Winner & Hetland, 2000a, 2000b, 2001). I have suggested elsewhere that this disconnection between compelling evidence and observations of the effects of music making is in part due to the lack of rigor in research and the incapacity of many methods to capture these experiences in meaningful ways (Dillon, 2006). Part of the answer to these questions about rigor and causality lies in the creative use of new media technologies that capture the results of relationships in music artefacts. Crucially, it is the effective management of these artefacts within computer systems that allows researchers and practitioners to collect, organize, analyse and then theorise such music making experiences.
Abstract:
In a seminal data mining article, Leo Breiman [1] argued that to develop effective predictive classification and regression models, we need to move away from the sole dependency on statistical algorithms and embrace a wider toolkit of modeling algorithms that include data mining procedures. Nevertheless, many researchers still rely solely on statistical procedures when undertaking data modeling tasks; the sole reliance on these procedures has led to the development of irrelevant theory and questionable research conclusions ([1], p.199). We will outline initiatives that the HPC & Research Support group is undertaking to engage researchers with data mining tools and techniques, including a new range of seminars, workshops, and one-on-one consultations covering data mining algorithms, the relationship between data mining and the research cycle, and limitations and problems with these new algorithms. Organisational limitations and restrictions to these initiatives are also discussed.
Abstract:
Traffic control at a road junction by a complex fuzzy logic controller is investigated. An increase in the complexity of a junction means that more input variables must be taken into account, which increases the number of fuzzy rules in the system. A hierarchical fuzzy logic controller is introduced to reduce the number of rules. Moreover, the increased complexity of the controller makes formulation of the fuzzy rules difficult, so a genetic-algorithm-based off-line learning algorithm is employed to generate the fuzzy rules. The learning algorithm uses constant flow-rates as training sets. The system is tested with both constant and time-varying flow-rates. Simulation results show that the proposed controller produces lower average delay than a fixed-time controller under various traffic conditions.
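The rule-count saving that motivates a hierarchical controller can be made concrete with a small back-of-envelope calculation. The decomposition below assumes a chain/tree of two-input fuzzy units, which is a common illustration rather than necessarily this paper's exact architecture.

```python
def flat_rule_count(num_inputs, sets_per_input):
    # A flat rule base needs one rule per combination of fuzzy sets,
    # so the count grows exponentially with the number of inputs.
    return sets_per_input ** num_inputs

def hierarchical_rule_count(num_inputs, sets_per_input):
    # Assumed decomposition: a tree of two-input fuzzy units; n inputs
    # need n - 1 units, each with sets_per_input**2 rules.
    return (num_inputs - 1) * sets_per_input ** 2

# e.g. 4 traffic inputs, 5 fuzzy sets each:
flat = flat_rule_count(4, 5)              # 5**4 = 625 rules
hier = hierarchical_rule_count(4, 5)      # 3 * 25 = 75 rules
```

The exponential-to-linear drop in rules is also what makes the GA-based rule learning tractable: the search space per fuzzy unit stays small.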
Abstract:
This paper presents a Genetic Algorithms (GA) approach to searching for the optimized path in a class of transportation problems. The formulation of the problems for suitable application of GA is discussed. Exchanging genetic information in the sense of neighborhoods is introduced for generation reproduction. The performance of the GA is evaluated by computer simulation. The proposed algorithm, using a simple coding with population size 1, converged to reasonable optimality within several minutes.
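A population-of-one GA over path orderings amounts to iterated neighborhood exchange, which can be sketched as below. Interpreting the "exchange in the sense of neighborhoods" as a segment-reversal (2-opt-style) move is an assumption for illustration, and the distance data is a toy example, not the paper's problem instances.

```python
import random

def path_length(path, dist):
    return sum(dist[path[i]][path[i + 1]] for i in range(len(path) - 1))

def neighborhood_search(dist, start_path, iterations=2000, seed=0):
    """Population-of-one search: the single chromosome is a visiting
    order with fixed endpoints; each generation proposes a neighborhood
    exchange (segment reversal) and keeps it if the path shortens."""
    rng = random.Random(seed)
    best = list(start_path)
    best_len = path_length(best, dist)
    for _ in range(iterations):
        i, j = sorted(rng.sample(range(1, len(best) - 1), 2))
        cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
        cand_len = path_length(cand, dist)
        if cand_len < best_len:
            best, best_len = cand, cand_len
    return best, best_len

# Toy instance: 5 stops on a line at these positions; the shortest
# fixed-endpoint path visits them in coordinate order (length 4).
pos = [0, 3, 1, 2, 4]
dist = [[abs(a - b) for b in pos] for a in pos]
best, best_len = neighborhood_search(dist, [0, 1, 2, 3, 4])
```
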
Abstract:
This paper reports on an empirical comparison of seven machine learning algorithms in texture classification, with application to vegetation management in power line corridors. Aiming at classifying tree species in power line corridors, an object-based method is employed. Individual tree crowns are segmented as the basic classification units, and three classic texture features are extracted as the input to the classification algorithms. Several widely used performance metrics are used to evaluate the classification algorithms. The experimental results demonstrate that the classification performance depends on the performance metric, the characteristics of the datasets and the features used.
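The observation that rankings depend on the chosen performance metric is easy to see once the metrics are written out. The sketch below computes the standard confusion-matrix metrics for a binary (one-vs-rest) labelling; it does not reproduce the paper's texture features or its seven classifiers.

```python
def classification_metrics(y_true, y_pred, positive):
    """Accuracy, precision, recall and F1 for one class treated as
    positive; two classifiers can order differently under different
    metrics (e.g. high precision but low recall, or vice versa)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Toy predictions for one tree species treated as the positive class:
m = classification_metrics([1, 1, 1, 0, 0], [1, 0, 1, 0, 1], positive=1)
```
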
Abstract:
The study evaluated two contemporary online student learning environments, Second Life and Facebook, together with student learning experiences and student knowledge outcomes. A case study methodology was used to gain rich exploratory knowledge of student learning when integrating online social network (OSN) and virtual world (VW) platforms. Findings indicated that students must perceive relevance in the activities when using such platforms; even though online environments create an interesting learning space for students and educators, the novelty can diminish quickly, and these environments dilute traditional authority boundaries.
Abstract:
Computational Fluid Dynamics (CFD) has become an important tool in optimisation and has seen success in many real-world applications. Most important among these is the optimisation of aerodynamic surfaces, which has become Multi-Objective (MO) and Multidisciplinary (MDO) in nature. Most such optimisations have been carried out for a given set of input parameters, such as free-stream Mach number and angle of attack. One cannot ignore the fact that in aerospace engineering one frequently deals with situations where the design input parameters and flight/flow conditions have some amount of uncertainty attached to them. When the optimisation is carried out for fixed values of design variables and parameters, however, one arrives at an optimised solution that performs well at the design condition but gives poor drag or lift-to-drag ratio at slightly off-design conditions. The challenge remains to develop a robust design that accounts for uncertainty in aerospace applications. In this paper this issue is taken up, and an attempt is made to prevent fluctuation of the objective performance by using robust design techniques that account for uncertainty.
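One common way to formalise the robustness goal above is to optimise the mean of the objective over the uncertain parameter plus a penalty on its spread, rather than the objective at the nominal condition alone. The sketch below uses cheap analytic stand-ins for a CFD evaluation, and the mean-plus-k-sigma formulation is one standard robust-design choice, not necessarily this paper's exact technique.

```python
import random
import statistics

def robust_objective(performance, nominal, sigma, samples=200, k=1.0, seed=0):
    """Robust (minimisation) fitness: sample the uncertain flow
    condition around its nominal value and penalise variability,
    instead of evaluating only at the design point."""
    rng = random.Random(seed)
    values = [performance(rng.gauss(nominal, sigma)) for _ in range(samples)]
    return statistics.mean(values) + k * statistics.pstdev(values)

# Toy drag curves vs Mach number (stand-ins for CFD results):
def flat_drag(mach):
    return 1.0                              # mediocre but insensitive

def peaky_drag(mach):
    return 0.5 + 300 * (mach - 0.8) ** 2    # better at design point only

# At the nominal Mach 0.8 the peaky design wins; under uncertainty
# the flat design has the lower robust objective.
nominal_gap = peaky_drag(0.8) < flat_drag(0.8)
robust_flat = robust_objective(flat_drag, 0.8, 0.05)
robust_peaky = robust_objective(peaky_drag, 0.8, 0.05)
```
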