281 results for Simulation of Digital Communication Systems

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

Discrete event-driven simulations of digital communication networks have been widely used. However, it is difficult to use a network simulator to simulate a hybrid system in which some objects are not discrete event-driven but continuous time-driven. A networked control system (NCS) is such an application, in which physical process dynamics are continuous by nature. We have designed and implemented a hybrid simulation environment which effectively integrates models of continuous-time plant processes and discrete-event communication networks by extending the open source network simulator NS-2. To do this, a synchronisation mechanism was developed to connect a continuous plant simulation with a discrete network simulation. Furthermore, for evaluating co-design approaches in an NCS environment, a piggybacking method was adopted to allow the control period to be adjusted during simulations. The effectiveness of the technique is demonstrated through case studies which simulate a networked control scenario in which the communication and control system properties are defined explicitly.
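The synchronisation idea described above (advancing a continuous plant model between discrete network events) can be sketched as follows. This is an illustrative Python sketch, not the authors' NS-2 extension; the forward-Euler integrator and the event-handler interface are assumptions made for illustration.

```python
import heapq

def simulate_hybrid(plant_deriv, x0, events, dt=0.001):
    """Advance a continuous plant between discrete network events.

    plant_deriv : dx/dt = f(t, x) for the plant dynamics
    events      : list of (time, handler) network events
    dt          : integration step for the continuous part
    """
    queue = list(events)
    heapq.heapify(queue)
    t, x = 0.0, x0
    while queue:
        t_event, handler = heapq.heappop(queue)
        # Integrate the plant up to the next network event (forward Euler).
        while t < t_event:
            h = min(dt, t_event - t)
            x = x + h * plant_deriv(t, x)
            t += h
        x = handler(t, x)  # e.g. apply a control update delivered by the network
    return t, x

# Example: first-order plant dx/dt = -x, with one control event at t = 1.0
t_end, x_end = simulate_hybrid(lambda t, x: -x, 1.0,
                               [(1.0, lambda t, x: x + 0.5)])
```

The key point is that the continuous solver never steps past a pending network event, which is the essence of the synchronisation mechanism described in the abstract.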

Relevance: 100.00%

Abstract:

With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to carry out various kinds of system studies. Simulation is now proven to be the cheapest means of performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system, and it is more appropriate to regard them as mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system.
In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Advanced software design not only greatly enhances the applicability of the simulators; it also encourages maintainability and modularity for easy understanding and further development, and portability across hardware platforms. The objective of this paper is to review the development of a number of approaches to simulation models, with particular attention given to models for train movement, power supply systems and traction drives. These models have been successfully used to resolve various ‘what-if’ issues effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
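As a minimal illustration of the train movement models mentioned above, the sketch below integrates a point-mass train under an assumed constant tractive effort, a Davis-type resistance curve and a single speed restriction. The function name and all coefficient values are illustrative assumptions, not figures from the paper.

```python
# Illustrative point-mass train movement model (an assumption, not the
# paper's simulator): acceleration = (tractive effort - resistance) / mass,
# with Davis-type resistance and a simple speed restriction.

def simulate_run(mass, f_max, davis, v_limit, distance, dt=0.1):
    """Return (run time in s, energy used in J) for one acceleration run.

    mass    : train mass in kg
    f_max   : maximum tractive effort in N (assumed constant here)
    davis   : (a, b, c) coefficients, resistance = a + b*v + c*v**2 in N
    v_limit : speed restriction in m/s
    """
    a0, b0, c0 = davis
    t = v = s = energy = 0.0
    while s < distance:
        resistance = a0 + b0 * v + c0 * v * v
        force = f_max if v < v_limit else resistance  # hold the speed limit
        accel = (force - resistance) / mass
        v = min(v + accel * dt, v_limit)
        s += v * dt
        energy += force * v * dt  # mechanical work at the wheel
        t += dt
    return t, energy

t_run, e_run = simulate_run(mass=200e3, f_max=150e3,
                            davis=(2000.0, 30.0, 5.0),
                            v_limit=25.0, distance=2000.0)
```

Even this toy model supports simple ‘what-if’ questions of the kind the paper discusses, e.g. how run time and energy trade off as the speed restriction changes.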

Relevance: 100.00%

Abstract:

A comparison of relay power minimisation subject to a received signal-to-noise ratio (SNR) constraint at the receiver, and SNR maximisation subject to a constraint on the total transmitted power of the relays, is presented for a typical wireless network with distributed beamforming. It is desirable to maximise receiver quality-of-service (QoS) and also to minimise the cost of transmission in terms of power. Hence, these two optimisation problems are very common and have been addressed separately in the literature. It is shown that SNR maximisation subject to a power constraint and power minimisation subject to an SNR constraint yield the same results for a typical wireless network, proving that either one of the optimisation approaches is sufficient.
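The claimed equivalence can be demonstrated numerically for a toy coherent-combining model, in which the received SNR is |g^H w|^2 / sigma^2 and the relays spend total power ||w||^2. This model is an assumption chosen for illustration, not the paper's network model.

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.normal(size=4) + 1j * rng.normal(size=4)  # toy relay channel gains
sigma2 = 1.0                                      # receiver noise power

# (1) SNR maximisation subject to total relay power P:
# by Cauchy-Schwarz the optimum is matched weights w = c * g.
P = 2.0
w_max = np.sqrt(P) * g / np.linalg.norm(g)
snr_star = abs(np.vdot(g, w_max)) ** 2 / sigma2   # = P * ||g||^2 / sigma2

# (2) Power minimisation subject to SNR >= snr_star:
# the same matched direction, scaled to just meet the target.
w_min = np.sqrt(snr_star * sigma2) * g / np.linalg.norm(g) ** 2
P_min = np.linalg.norm(w_min) ** 2

# Both problems land on the same operating point: P_min equals P.
```

Solving either problem and feeding its optimum into the other as a constraint recovers the original constraint value, which is the equivalence the abstract states.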

Relevance: 100.00%

Abstract:

Language is a unique aspect of human communication because it can be used to discuss itself in its own terms. For this reason, human societies potentially have greater capacities for co-ordination, reflexive self-correction, and innovation than other animal, physical or cybernetic systems. However, this analysis also reveals that language is interconnected with the economically and technologically mediated social sphere and hence is vulnerable to abstraction, objectification, reification, and therefore ideology – all of which are antithetical to its reflexive function, whilst paradoxically being a fundamental part of it. In particular, in capitalism, language is increasingly commodified within the social domains created and affected by ubiquitous communication technologies. The advent of the so-called ‘knowledge economy’ implicates exchangeable forms of thought (language) as the fundamental commodities of this emerging system. The historical point at which a ‘knowledge economy’ emerges, then, is the critical point at which thought itself becomes a commodified ‘thing’, and language becomes its “objective” means of exchange. However, the processes by which such commodification and objectification occur obscure the unique social relations within which these language commodities are produced. The latest economic phase of capitalism – the knowledge economy – and the obfuscating trajectory which accompanies it, we argue, are destroying the reflexive capacity of language, particularly through the process of commodification. This can be seen in the fact that the language practices that have emerged in conjunction with digital technologies are increasingly non-reflexive and therefore less capable of self-critical, conscious change.

Relevance: 100.00%

Abstract:

Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced climate-based daylight metrics (CBDM). Yet these tools (new metrics or simulation tools) are not currently well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most commonly used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file types or can be integrated into current 3D modelling software or packages. These packages need to be able to calculate both point-in-time and annual simulations. There is a current need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in-based software is attempting to meet this need through third-party analysis; however, some of these packages are heavily reliant on their host program.
Programs that allow dynamic daylighting simulation would make it easier to calculate accurate daylighting regardless of which modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
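As a minimal example of the kind of climate-based calculation these tools perform on raw data, the sketch below computes daylight autonomy (the fraction of occupied hours in which illuminance meets a target) from hourly values such as a spreadsheet export. The function and the 300 lux target are illustrative assumptions, not features of any surveyed tool.

```python
# Illustrative sketch: daylight autonomy (DA), a climate-based daylight
# metric, computed for a single sensor point from hourly illuminance data.

def daylight_autonomy(illuminance_lux, occupied, target=300.0):
    """illuminance_lux : hourly illuminance values at one sensor point
    occupied        : matching booleans marking occupied hours
    Returns the fraction of occupied hours at or above the target.
    """
    hours = [e for e, occ in zip(illuminance_lux, occupied) if occ]
    if not hours:
        return 0.0
    return sum(e >= target for e in hours) / len(hours)

# Toy example: four occupied hours, three of them meeting 300 lux.
da = daylight_autonomy([500, 350, 120, 800, 50],
                       [True, True, True, True, False])
# da = 0.75
```

A full annual analysis would simply run this over 8760 rows per sensor point, which is exactly the spreadsheet-to-graphics workflow the paper identifies as an unmet need.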

Relevance: 100.00%

Abstract:

Deterministic computer simulation of physical experiments is now a common technique in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. Running complex computer models or codes in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This paper aims to discuss some practical issues when designing a computer simulation and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
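A common first step in designing a computer experiment is choosing a space-filling set of input runs. The sketch below builds a simple Latin hypercube design; it is an illustrative assumption for exposition, not the case study's actual design.

```python
# Illustrative sketch: a Latin hypercube design, a standard space-filling
# design for computer experiments. Each of the d input dimensions is
# divided into n strata, and each stratum is sampled exactly once.

import random

def latin_hypercube(n, d, seed=0):
    """Return n design points in [0, 1)^d as a list of d-tuples."""
    rng = random.Random(seed)
    columns = []
    for _ in range(d):
        # One jittered sample per stratum, in shuffled order.
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))

design = latin_hypercube(n=5, d=2)
# Every dimension has exactly one point in each of the 5 strata
# [k/5, (k+1)/5), so no code run wastes an input level.
```

Each design point would then be one run of the simulation code, and a surrogate model fitted to the outputs.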

Relevance: 100.00%

Abstract:

The interaction of Au particles with few-layer graphene is of interest for the formation of the next generation of sensing devices [1]. In this paper we investigate the coupling of single and multiple gold nanoparticles to a graphene sheet using COMSOL Multiphysics. These simulations allow us to determine the electric field strength and associated hot-spots for various gold nanoparticle-graphene systems. The Au nanoparticles were modelled as 8 nm diameter spheres on 1.5 nm thick (5 layer) graphene, with the properties of graphene obtained from the refractive index data of Weber [2] and the Au refractive index data from Palik [3]. The field was incident along the plane of the sheet, with both s and p polarisations tested. The study showed strong localised interaction between the Au and graphene with limited spread; however, the double-particle case, in which the graphene sheet separated two Au nanoparticles, showed distinct interaction between the particles and the graphene. An offset of up to 4 nm was introduced, resulting in much reduced coupling between the opposed particles as the distance apart increased. Findings currently suggest that the graphene layer has limited interaction with incident fields when a single particle is present, whilst reducing the coupling region to a very fine area when opposing particles are involved. It is hoped that the results of this research will provide insight into graphene-plasmon interactions and spur the development of the next generation of sensing devices.
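The localised field enhancement discussed above can be roughed out with the quasi-static dipole polarisability of a small sphere. This back-of-the-envelope sketch is not the COMSOL model: the Au permittivity below is an assumed rough figure near 520 nm, and tabulated data (e.g. Palik) should be used for real calculations.

```python
# Illustrative sketch: quasi-static dipole polarisability of a small
# metal sphere, alpha = 4*pi*r^3 * (eps - eps_m) / (eps + 2*eps_m),
# which governs the near-field enhancement at the particle surface.

import math

def sphere_polarisability(radius_m, eps_particle, eps_medium=1.0):
    """Quasi-static polarisability (volume units; SI prefactor omitted)."""
    return (4 * math.pi * radius_m**3
            * (eps_particle - eps_medium) / (eps_particle + 2 * eps_medium))

r = 4e-9                     # 8 nm diameter sphere, as in the paper
eps_au = complex(-4.7, 2.4)  # assumed rough Au permittivity near 520 nm
alpha = sphere_polarisability(r, eps_au)
# |alpha| grows sharply as Re(eps) approaches -2*eps_medium (the dipole
# plasmon resonance), producing the localised hot-spots discussed above.
```

Capturing the graphene layer and inter-particle coupling, of course, requires a full-wave solver such as the COMSOL model used in the paper.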

Relevance: 100.00%

Abstract:

Beginning in the second half of the 20th century, ICTs transformed many societies from industrial societies in which manufacturing was the central focus, into knowledge societies in which dealing effectively with data and information has become a central element of work (Anderson, 2008). To meet the needs of the knowledge society, universities must reinvent their structures and processes, their curricula and pedagogic practices. In addition to this, of course higher education is itself subject to the sweeping influence of ICTs. But what might effective higher education look like in the 21st century? In designing higher education systems and learning experiences which are responsive to the learning needs of the future and exploit the possibilities offered by ICTs, we can learn much from the existing professional development strategies of people who are already successful in 21st century fields, such as digital media. In this study, I ask: (1) what are the learning challenges faced by digital media professionals in the 21st century? (2) what are the various roles of formal and informal education in their professional learning strategies at present? (3) how do they prefer to acquire needed capabilities? In-depth interviews were undertaken with successful Australian digital media professionals working in micro businesses and SMEs to answer these questions. The strongest thematic grouping that emerged from the interviews related to the need for continual learning and relearning because of the sheer rate of change in the digital media industries. Four dialectical relationships became apparent from the interviewees’ commentaries around the learning imperatives arising out of the immense and continual changes occurring in the digital content industries: (1) currency vs best practice; (2) diversification vs specialisation of products and services; (3) creative outputs vs commercial outcomes; and (4) more learning opportunities vs less opportunity to learn.
These findings point to the importance of ‘learning how to learn’ as a 21st century capability. The interviewees were ambivalent about university courses as preparation for professional life in their fields. Higher education was described by several interviewees as having relatively little value-add beyond what one described as “really expensive credentialling services.” For all interviewees in this study, informal learning strategies were the preferred methods of acquiring the majority of knowledge and skills, both for ongoing and initial professional development. Informal learning has no ‘curriculum’ per se, and tends to be opportunistic, unstructured, pedagogically agile and far more self-directed than formal learning (Eraut, 2004). In an industry impacted by constant change, informal learning is clearly both essential and ubiquitous. Inspired by the professional development strategies of the digital media professionals in this study, I propose a 21st century model of the university as a broad, open learning ecology, which also includes industry, professionals, users, and university researchers. If created and managed appropriately, the university learning network becomes the conduit and knowledge integrator for the latest research and industry trends, which students and professionals alike can access as needed.

Relevance: 100.00%

Abstract:

Urban agriculture plays an increasingly vital role in supplying food to urban populations. Changes in Information and Communications Technology (ICT) are already driving widespread change in diverse food-related industries such as retail, hospitality and marketing. It is reasonable to suspect that the fields of ubiquitous technology, urban informatics and social media equally have a lot to offer the evolution of core urban food systems. We use communicative ecology theory to describe emerging innovations in urban food systems according to their technical, discursive and social components. We conclude that social media in particular accentuate fundamental social interconnections normally effaced by conventional industrialised approaches to food production and consumption.

Relevance: 100.00%

Abstract:

Numerical simulation of a geothermal reservoir, modelled as a bottom-heated square box filled with a water-CO2 mixture, is presented in this work. Furthermore, results for the two limiting cases of a reservoir filled with either pure water or pure CO2 are presented. The effects of different parameters, including CO2 concentration as well as reservoir pressure and temperature, on the overall performance of the system are investigated. It has been noted that, with fixed reservoir pressure and temperature, any increase in CO2 concentration leads to better performance, i.e. stronger convection and higher heat transfer rates. With a fixed CO2 concentration, however, the reservoir pressure and temperature can significantly affect the overall heat transfer and flow rate from the reservoir. Details of such variations are documented and discussed in the present paper.
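The strength of convection in such a bottom-heated reservoir is commonly characterised by the porous-medium Rayleigh number. The sketch below evaluates it for assumed illustrative property values; the numbers are not taken from the paper.

```python
# Illustrative sketch: the porous-medium Rayleigh number,
#   Ra = rho * g * beta * dT * K * H / (mu * alpha_thermal),
# which governs the strength of convection in a bottom-heated reservoir.
# Higher Ra means stronger convection and higher heat transfer rates.

import math

def porous_rayleigh(rho, g, beta, dT, K, H, mu, alpha):
    """rho: density (kg/m^3), beta: thermal expansivity (1/K),
    dT: bottom-top temperature difference (K), K: permeability (m^2),
    H: layer thickness (m), mu: dynamic viscosity (Pa s),
    alpha: thermal diffusivity (m^2/s)."""
    return rho * g * beta * dT * K * H / (mu * alpha)

# Assumed illustrative values for water at reservoir conditions:
ra_water = porous_rayleigh(rho=900.0, g=9.81, beta=1e-3, dT=50.0,
                           K=1e-13, H=1000.0, mu=2e-4, alpha=1e-6)
# Convection sets in above the critical value Ra_c = 4*pi^2 (about 39.5).
```

Because CO2 at reservoir conditions has lower viscosity and higher expansivity than water, this grouping makes plausible the paper's observation that raising the CO2 concentration strengthens convection.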