330 results for BASIS-SET CONVERGENCE
Abstract:
A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. 
Initially, requirements are generated from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. 
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
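The IEEE-1588-style synchronisation described above rests on standard two-way time-transfer arithmetic. The sketch below is a minimal illustration of that arithmetic, not BabelFuse's firmware; the function name and timestamp values are hypothetical, and a symmetric network path is assumed.

```python
# Illustrative PTP-style offset/delay computation from the four
# event timestamps of one Sync / Delay_Req exchange (all in seconds).

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Returns (slave clock offset from master, one-way path delay),
    assuming the path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Hypothetical example: slave clock runs 5 ms ahead of the master
# and the true one-way delay is 2 ms.
off, d = ptp_offset_and_delay(t1=0.000, t2=0.007, t3=0.010, t4=0.007)
# off -> 0.005 (5 ms offset), d -> 0.002 (2 ms delay)
```

Repeating this exchange multiple times per second and filtering the resulting offsets is what lets a slave steer its local clock toward the Grand Master's.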
Abstract:
An ambitious rendering of the digital future from a pioneer of media and cultural studies; a wise and witty take on a changing field and our orientation to it. Investigates the uses of multimedia by creative and productive citizen-consumers to provide new theories of communication that accommodate social media, participatory action, and user-creativity. Leads the way for new interdisciplinary engagement with systems thinking, complexity and evolutionary sciences, and the convergence of cultural and economic values. Analyzes the historical uses of multimedia from print, through broadcasting, to the internet. Combines conceptual innovation with historical erudition to present a high-level synthesis of ideas and detailed analysis of emergent forms and practices. Features an international focus and global reach to provide a basis for students and researchers seeking broader perspectives.
Abstract:
The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non‐commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned the relationship between PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Has the case for PSM “exceptionalism” changed with Web‐based services, catch‐up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross‐media environments? Do the traditional assumptions about non‐commercialism still hold as the basis for different forms of PSM governance and accountability? This paper will consider the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011‐2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.
Abstract:
The majority of cancer nurses have to manage intravascular devices (IVDs) on a daily basis, thus placing nurses in the strongest position to generate and use best available evidence to inform this area of practice and to ensure that patients are receiving the best care available. Our literature clearly reflects that cancer nurses are concerned about complications associated with IVDs (eg, extravasation,1 IVD-related bloodstream infection [IVD-BSI],2,3 and thrombosis4). Although enormous attention is given to this area, a number of nursing practices are not sufficiently based on empirical evidence.5,6 Nurses need to set goals and priorities for future research and investments. Priority areas for future research are suggested here for your consideration.
Abstract:
We consider the space fractional advection–dispersion equation, which is obtained from the classical advection–diffusion equation by replacing the spatial derivatives with a generalised derivative of fractional order. We derive a finite volume method that utilises fractionally-shifted Grünwald formulae for the discretisation of the fractional derivative, to numerically solve the equation on a finite domain with homogeneous Dirichlet boundary conditions. We prove that the method is stable and convergent when coupled with an implicit timestepping strategy. Results of numerical experiments are presented that support the theoretical analysis.
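The shifted Grünwald formulae mentioned above have a compact recursive form for their weights. The following is a minimal sketch, not the authors' finite volume code; the uniform grid, function names, and the common shift p = 1 for orders 1 < α ≤ 2 are assumptions.

```python
# Standard (right-)shifted Gruenwald-Letnikov approximation of the
# fractional derivative d^alpha u / dx^alpha on a uniform grid.

def grunwald_weights(alpha, n):
    """Weights g_k = (-1)^k * C(alpha, k), k = 0..n,
    computed with the stable recurrence g_k = g_{k-1} * (1 - (alpha+1)/k)."""
    g = [1.0]
    for k in range(1, n + 1):
        g.append(g[-1] * (1.0 - (alpha + 1.0) / k))
    return g

def shifted_grunwald(u, alpha, h):
    """Approximate the fractional derivative at each interior node i via
    h^(-alpha) * sum_{k=0}^{i+1} g_k * u[i - k + 1]  (shift p = 1)."""
    n = len(u)
    g = grunwald_weights(alpha, n)
    d = [0.0] * n
    for i in range(1, n - 1):
        d[i] = sum(g[k] * u[i - k + 1] for k in range(i + 2)) / h ** alpha
    return d
```

A quick sanity check on the recurrence: for α = 2 the weights collapse to (1, -2, 1, 0, ...), so the shifted sum recovers the classical centred second-difference stencil.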
Abstract:
The IEEE Subcommittee on the Application of Probability Methods (APM) published the IEEE Reliability Test System (RTS) [1] in 1979. This system provides a consistent and generally acceptable set of data that can be used both in generation capacity and in composite system reliability evaluation [2,3]. The test system provides a basis for the comparison of results obtained by different people using different methods. Prior to its publication, there was no general agreement on either the system or the data that should be used to demonstrate or test various techniques developed to conduct reliability studies. Development of reliability assessment techniques and programs is very dependent on the intent behind the development, as the experience of one power utility with their system may be quite different from that of another utility. The development and the utilization of a reliability program are, therefore, greatly influenced by the experience of a utility and the intent of the system manager, planner and designer conducting the reliability studies. The IEEE-RTS has proved to be extremely valuable in highlighting and comparing the capabilities (or incapabilities) of programs used in reliability studies, the differences in the perception of various power utilities and the differences in the solution techniques. The IEEE-RTS contains a reasonably large power network which can be difficult to use for initial studies in an educational environment.
Abstract:
Well-designed initialisation and keystream generation processes for stream ciphers should ensure that each key-IV pair generates a distinct keystream. In this paper, we analyse some ciphers where this does not happen due to state convergence occurring either during initialisation, keystream generation or both. We show how state convergence occurs in each case and identify two mechanisms which can cause state convergence.
Abstract:
Transport between compartments of eukaryotic cells is mediated by coated vesicles. The archetypal protein coats COPI, COPII, and clathrin are conserved from yeast to human. Structural studies of COPII and clathrin coats assembled in vitro without membranes suggest that coat components assemble regular cages with the same set of interactions between components. Detailed three-dimensional structures of coated membrane vesicles have not been obtained. Here, we solved the structures of individual COPI-coated membrane vesicles by cryoelectron tomography and subtomogram averaging of in vitro reconstituted budding reactions. The coat protein complex, coatomer, was observed to adopt alternative conformations to change the number of other coatomers with which it interacts and to form vesicles with variable sizes and shapes. This represents a fundamentally different basis for vesicle coat assembly.
Abstract:
A critical step in the dissemination of ovarian cancer is the formation of multicellular spheroids from cells shed from the primary tumour. The objectives of this study were to apply bioengineered three-dimensional (3D) microenvironments for culturing ovarian cancer spheroids in vitro and simultaneously to build on a mathematical model describing the growth of multicellular spheroids in these biomimetic matrices. Cancer cells derived from human epithelial ovarian carcinoma were embedded within biomimetic hydrogels of varying stiffness and grown for up to 4 weeks. Immunohistochemistry, imaging and growth analyses were used to quantify the dependence of cell proliferation and apoptosis on matrix stiffness, long-term culture and treatment with the anti-cancer drug paclitaxel. The mathematical model was formulated as a free boundary problem in which each spheroid was treated as an incompressible porous medium. The functional forms used to describe the rates of cell proliferation and apoptosis were motivated by the experimental work and predictions of the mathematical model compared with the experimental output. This work aimed to establish whether it is possible to simulate solid tumour growth on the basis of data on spheroid size, cell proliferation and cell death within these spheroids. The mathematical model predictions were in agreement with the experimental data set and simulated how the growth of cancer spheroids was influenced by mechanical and biochemical stimuli including matrix stiffness, culture duration and administration of a chemotherapeutic drug. Our computational model provides new perspectives on experimental results and has informed the design of new 3D studies of chemoresistance of multicellular cancer spheroids.
Abstract:
This paper identifies two major forces driving change in media policy worldwide: media convergence, and renewed concerns about media ethics, with the latter seen in the U.K. Leveson Inquiry. It focuses on two major public inquiries in Australia during 2011-2012 – the Independent Media Inquiry (Finkelstein Review) and the Convergence Review – and the issues raised about future regulation of journalism and news standards. Drawing upon perspectives from media theory, it observes the strong influence of social responsibility theories of the media in the Finkelstein Review, and the adverse reaction these received from those arguing from Fourth Estate/free press perspectives, which were also consistent with the longstanding opposition of Australian newspaper proprietors to government regulation. It also discusses the approaches taken in the Convergence Review to regulating for news standards, in light of the complexities arising from media convergence. The paper concludes with consideration of the fast-changing environment in which such proposals to transform media regulation are being considered, including the crisis of news media organisation business models, as seen in Australia with major layoffs of journalists from the leading print media publications.
Abstract:
Urban design that harnesses natural features (such as green roofs and green walls) to improve design outcomes is gaining significant interest, particularly as there is growing evidence of links between human health and wellbeing, and contact with nature. The use of such natural features can provide many significant benefits, such as reduced urban heat island effects, reduced peak energy demand for building cooling, enhanced stormwater attenuation and management, and reduced air pollution and greenhouse gas emissions. The principle of harnessing natural features as functional design elements, particularly in buildings, is becoming known as ‘biophilic urbanism’. Given the potential for global application and benefits for cities from biophilic urbanism, and the growing number of successful examples of this, it is timely to develop enabling policies that help overcome current barriers to implementation. This paper describes a basis for inquiry into policy considerations related to increasing the application of biophilic urbanism. The paper draws on research undertaken as part of the Sustainable Built Environment National Research Centre (SBEnrc) in Australia in partnership with the Western Australian Department of Finance, Parsons Brinckerhoff, Green Roofs Australasia, and Townsville City Council (CitySolar Program). The paper discusses the emergence of a qualitative, mixed-method approach that combines an extensive literature review, stakeholder workshops and interviews, and a detailed study of leading case studies. It highlights the importance of experiential and contextual learnings to inform biophilic urbanism and provides a structure to distil such learnings to benefit other applications.
Abstract:
Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Amongst the methods used to support this process are personalized process visualisations. In this paper, we review the development of this visualisation trend, then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support to the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalized aspects, which we believe enhance the process of communication between analysts and stakeholders in BPM process (re)design activities.
Abstract:
Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the filtered-x least-mean-square (FxLMS) algorithm appears to be the best choice as a controller filter, researchers tend to improve performance of ANC systems by enhancing and modifying this algorithm. This paper proposes a new version of the FxLMS algorithm, as a first novelty. In many ANC applications, an on-line secondary path modeling method using white noise as a training signal is required to ensure convergence of the system. As a second novelty, this paper proposes a new approach for on-line secondary path modeling on the basis of a new variable-step-size (VSS) LMS algorithm in feed forward ANC systems. The proposed algorithm is designed so that the noise injection is stopped at the optimum point when the modeling accuracy is sufficient. In this approach, a sudden change in the secondary path during operation makes the algorithm reactivate injection of the white noise to re-adjust the secondary path estimate. Comparative simulation results shown in this paper indicate the effectiveness of the proposed approach in reducing both narrow-band and broad-band noise. In addition, the proposed ANC system is robust against sudden changes of the secondary path model.
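The core idea behind variable-step-size secondary path modeling, shrinking the adaptation step (and eventually the probe noise) as the modeling error falls, can be illustrated with a basic LMS system identification loop. This sketch is not the paper's algorithm; the function name, step-size rule, and path coefficients are illustrative assumptions.

```python
import random

# Illustrative VSS-LMS identification of an unknown FIR "secondary path"
# driven by a white-noise probe signal (no measurement noise).

def vss_lms_identify(path, n_iter=20000):
    taps = len(path)
    w = [0.0] * taps          # adaptive estimate of the path
    x = [0.0] * taps          # delay line holding the white-noise probe
    mu_max, mu_min = 0.05, 0.001
    err_pow = 1.0             # smoothed error power drives the step size
    for _ in range(n_iter):
        x = [random.gauss(0.0, 1.0)] + x[:-1]
        d = sum(p * xi for p, xi in zip(path, x))     # true path output
        e = d - sum(wi * xi for wi, xi in zip(w, x))  # modeling error
        err_pow = 0.999 * err_pow + 0.001 * e * e
        # Larger error -> larger step; near convergence the step shrinks,
        # mirroring the idea of throttling the probe once accuracy suffices.
        mu = min(mu_max, max(mu_min, err_pow))
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
    return w

random.seed(0)
w = vss_lms_identify([0.5, -0.3, 0.2, 0.1])
# w converges toward the true path coefficients [0.5, -0.3, 0.2, 0.1]
```

A sudden change in `path` would drive `err_pow` (and hence the step size) back up, which is the same reactivation behaviour the abstract describes for re-injecting the white noise.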
Abstract:
This report looks at opportunities in relation to what is either already available or starting to take off in Information and Communication Technology (ICT). ICT focuses on the entire system of information, communication, processes and knowledge within an organisation. It focuses on how technology can be implemented to serve the information and communication needs of people and organisations. An ICT system involves a combination of work practices, information, people and a range of technologies and applications organised to make the business or organisation fully functional and efficient, and to accomplish goals in an organisation. Our focus is on vocational, work-based education in New Zealand. It is not about eLearning, although we briefly touch on the topic. We provide a background on vocational education in New Zealand, cover what we consider to be key trends impacting work-based, vocational education and training (VET), and offer practical suggestions for leveraging better value from ICT initiatives across the main activities of an Industry Training Organisation (ITO). We use a learning value chain approach to demonstrate the main functions ITOs engage in and also use this approach as the basis for developing and prioritising an ICT strategy. Much of what we consider in this report is applicable to the wider tertiary education sector as it relates to life-long learning. We consider ICT as an enabler that: a) connects education businesses (all types, including tertiary education institutions) to learners, their career decisions and their learning, and b) enables those same businesses to run more efficiently. We suggest that these two sets of activities are considered as interconnected parts of the same education or training business ICT strategy.
Abstract:
The decision of Applegarth J in Heartwood Architectural & Joinery Pty Ltd v Redchip Lawyers [2009] QSC 195 (27 July 2009) involved a costs order against solicitors personally. This decision is but one of several recent decisions in which the court has been persuaded that the circumstances justified costs orders against legal practitioners on the indemnity basis. These decisions serve as a reminder to practitioners of their disclosure obligations when seeking any interlocutory relief in an ex parte application. These obligations are now clearly set out in r 14.4 of the Legal Profession (Solicitors) Rule 2007 and r 25 of the 2007 Barristers Rule. Inexperience or ignorance will not excuse breaches of the duties owed to the court.