955 results for Interconnected devices network
Abstract:
Two recent decisions of the Supreme Court of New South Wales in the context of obstetric management have highlighted, firstly, the importance of keeping legible, accurate and detailed medical records and, secondly, the challenges faced by those seeking to establish causation, particularly where epidemiological evidence is relied upon...
Abstract:
This paper describes a risk model for estimating the likelihood of collisions at low-exposure railway level crossings, demonstrating the effect that differences in safety integrity can have on the likelihood of a collision. The model facilitates the comparison of safety benefits between level crossings with passive controls (stop or give-way signs) and level crossings that have been hypothetically upgraded with conventional or low-cost warning devices. The scenario presented illustrates how treating a cross-section of level crossings with low-cost devices can provide a greater safety benefit than treating them with conventional warning devices for the same budget.
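To make the budget trade-off concrete, here is a minimal sketch of the kind of comparison the model supports: upgrading many crossings with cheap devices versus a few with conventional ones. All likelihoods, unit costs, the budget and the fleet size below are hypothetical placeholders, not figures from the paper.

```python
# Hypothetical per-crossing annual collision likelihoods under each control type.
PASSIVE_RISK = 1.0e-3        # passive control (stop or give-way signs)
LOW_COST_RISK = 4.0e-4       # after a low-cost warning device
CONVENTIONAL_RISK = 2.0e-4   # after a conventional warning device

LOW_COST_PRICE = 50_000      # assumed unit treatment costs (AUD)
CONVENTIONAL_PRICE = 250_000

BUDGET = 1_000_000
TOTAL_CROSSINGS = 100        # low-exposure crossings in the hypothetical corridor


def expected_collisions(n_treated: int, treated_risk: float) -> float:
    """Expected annual collisions when n_treated crossings are upgraded."""
    untreated = TOTAL_CROSSINGS - n_treated
    return n_treated * treated_risk + untreated * PASSIVE_RISK


n_low_cost = min(TOTAL_CROSSINGS, BUDGET // LOW_COST_PRICE)
n_conventional = min(TOTAL_CROSSINGS, BUDGET // CONVENTIONAL_PRICE)

print("Low-cost option:     ", expected_collisions(n_low_cost, LOW_COST_RISK))
print("Conventional option: ", expected_collisions(n_conventional, CONVENTIONAL_RISK))
```

With these illustrative numbers the budget treats 20 crossings with low-cost devices but only 4 with conventional ones, so the corridor-wide expected collision count is lower under the low-cost option, which is the effect the abstract describes.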
Abstract:
Recently there has been significant interest from researchers and practitioners in the use of Bluetooth as a complementary source of transport data. However, the literature offers limited understanding of the Bluetooth MAC Scanner (BMS) based data acquisition process and of the properties of the data being collected. This paper first provides insight into the BMS data acquisition process. Thereafter, it presents findings from analysis of real BMS data from both motorway and arterial networks in Brisbane, Australia. The knowledge gained helps researchers and practitioners understand the BMS data being collected, which is vital to the development of management and control algorithms that use the data.
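As a minimal sketch of one common use of BMS detections, the snippet below matches MAC addresses seen at two scanner sites to estimate travel times, keeping only the first detection per device at each site because a device is typically detected several times while inside a scanner's antenna range. The field layout, site names and records are hypothetical, not the paper's data.

```python
from collections import defaultdict

# (mac, site, timestamp_seconds) detection records from two hypothetical scanners.
detections = [
    ("aa:bb:cc:01", "BMS_A", 100), ("aa:bb:cc:01", "BMS_A", 104),
    ("aa:bb:cc:01", "BMS_B", 460),
    ("aa:bb:cc:02", "BMS_A", 120), ("aa:bb:cc:02", "BMS_B", 505),
]

def first_detection(records, site):
    """Keep the earliest timestamp per MAC address at a given site."""
    first = defaultdict(lambda: float("inf"))
    for mac, s, t in records:
        if s == site:
            first[mac] = min(first[mac], t)
    return first

at_a = first_detection(detections, "BMS_A")
at_b = first_detection(detections, "BMS_B")

# Travel time for devices observed at both sites.
travel_times = {mac: at_b[mac] - at_a[mac] for mac in at_a if mac in at_b}
print(travel_times)   # {'aa:bb:cc:01': 360, 'aa:bb:cc:02': 385}
```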
Abstract:
BACKGROUND: Transcatheter closure of patent foramen ovale (PFO) has rapidly evolved as the preferred management strategy for the prevention of recurrent cerebrovascular events in patients with cryptogenic stroke and presumed paradoxical embolus. There are limited outcome data for patients treated with this therapy, particularly for the newer devices. METHODS: Data from medical records and from catheter and echocardiography databases on 70 PFO procedures were collected prospectively. RESULTS: The cohort consisted of 70 patients (mean age 43.6 years, range 19 to 77 years), of whom 51% were male. The indications for closure were cryptogenic cerebrovascular accident (CVA) or transient ischemic attack (TIA) in 64 patients (91%), peripheral emboli in two (2.8%), cryptogenic ST-elevation myocardial infarction in one (1.4%), refractory migraine in one (1.4%), decompression sickness in one (1.4%), and orthodeoxia in one (1.4%). All patients had demonstrated right-to-left shunting on bubble study. The procedures were guided by intracardiac echocardiography in 53%, transesophageal echocardiography in 39%, and transthoracic echocardiography alone in the remainder. Devices used were the Amplatzer PFO Occluder (AGA Medical; sizes 18-35 mm) in 49 patients (70%) and the Premere device (St. Jude Medical) in 21 (30%). In-hospital complications consisted of one significant groin hematoma with skin infection. Echocardiographic follow-up at 6 months revealed that most patients (98.6%) had no or trivial residual shunt, while one patient (1.4%) had a mild residual shunt. At a median of 11 months' follow-up (range 1 month to 4.3 years), no patients experienced further CVA/TIA or paradoxical embolic events. CONCLUSION: PFO causing presumed paradoxical embolism can be closed percutaneously with a low rate of significant residual shunting and very few complications. Recurrent index events are uncommon at medium-term (up to 4 years) follow-up.
Abstract:
The Chemistry Discipline Network has recently completed two distinct mapping exercises. The first is a snapshot of chemistry taught at 12 institutions around Australia in 2011. There were many similarities but also important differences in the content taught and assessed at different institutions. There were also significant differences in delivery, particularly laboratory contact hours, as well as in the forms and weightings of assessment. The second exercise mapped the chemistry degrees at three institutions to the Threshold Learning Outcomes (TLOs) for chemistry. Importantly, some of the TLOs were addressed by multiple units at all institutions, while others were not met, or were met at an introductory level only. The exercise also exposed some challenges in using the TLOs as currently written.
Abstract:
After its narrow re-election in June 2010, the Australian Labor government undertook a series of public inquiries into reform of Australian media, communications and copyright laws. One important driver of policy reform was the government’s commitment to building a National Broadband Network (NBN), and the implications this had for existing broadcasting and telecommunications policy, as the NBN would be a major driver of convergence between media and communications access devices and content platforms. These inquiries included: the Convergence Review of media and communications legislation; the Australian Law Reform Commission (ALRC) review of the National Classification Scheme; the Independent Media Inquiry (Finkelstein Review) into Media and Media Regulation; and the ALRC review of Copyright and the Digital Economy. One unusual feature of this review process, discussed in the paper, was the degree to which academics were involved, not simply as providers of expert opinion but as review chairs seconded from their universities. This paper considers the role played by activist groups in all of these inquiries and their relationship to the various participants, as well as the implications of academics being engaged in such inquiries not simply as activist-scholars but as those primarily responsible for delivering policy review outcomes. The latter brings to the forefront issues arising from direct engagement with governments and state agencies themselves, which challenges traditional understandings of the academic community as “critical outsiders” to such policy processes.
Abstract:
This article asks questions about the futures of power in the network era. Two critical emerging issues are at work, with uncertain outcomes. The first is the emergence of the collaborative economy, while the second is the emergence of surveillance capabilities from civic, state and commercial sources. While both of these emerging issues are expected by many to play an important role in the future development of our societies, it is still unclear whose values and whose purposes will be furthered. This article argues that the futures of these emerging issues depend on contests for power. As such, four scenarios for the futures of power in the network era are developed using the double-variable scenario approach.
Abstract:
Network reconfiguration after a complete blackout of a power system is an essential step in power system restoration. A new node importance evaluation method is presented based on the concept of regret, and maximisation of the average importance of a path is employed as the objective in finding the optimal restoration path. A two-stage method is then presented to optimise the network reconfiguration strategy. Specifically, the restoration sequence of generating units is first optimised so as to maximise the restored generation capacity; the optimal restoration path is then selected to restore the generating nodes concerned, and the issues of selecting a serial or parallel restoration mode and of handling the reconnection failure of a transmission line are next considered. Both restoration path selection and skeleton-network determination are implemented together in the proposed method, which overcomes the shortcoming of separate decision-making in existing methods. Finally, the New England 10-unit 39-bus power system and the Guangzhou power system in South China are employed to demonstrate the basic features of the proposed method.
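A toy sketch of the path-selection objective described above: among candidate restoration paths from an energised node to a target generating node, prefer the path with the maximum average node importance. The graph topology and the importance scores are hypothetical (the regret-based scoring itself is not modelled here), and this is not the paper's test system.

```python
graph = {               # adjacency list of a small hypothetical skeleton network
    "G1": ["B1", "B2"],
    "B1": ["G1", "B3"],
    "B2": ["G1", "B3", "B4"],
    "B3": ["B1", "B2", "G2"],
    "B4": ["B2", "G2"],
    "G2": ["B3", "B4"],
}
importance = {"G1": 0.9, "B1": 0.3, "B2": 0.6, "B3": 0.5, "B4": 0.2, "G2": 0.8}


def simple_paths(src, dst, path=None):
    """Enumerate all simple paths between two nodes by depth-first search."""
    path = (path or []) + [src]
    if src == dst:
        yield path
        return
    for nxt in graph[src]:
        if nxt not in path:
            yield from simple_paths(nxt, dst, path)


def average_importance(path):
    """Average importance of the nodes along a candidate restoration path."""
    return sum(importance[n] for n in path) / len(path)


best = max(simple_paths("G1", "G2"), key=average_importance)
print(best, round(average_importance(best), 3))   # ['G1', 'B2', 'B3', 'G2'] 0.7
```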
Abstract:
This practice-led project has two outcomes: a collection of short stories titled 'Corkscrew Section', and an exegesis. The short stories combine written narrative with visual elements such as images and typographic devices, while the exegesis analyses the function of these graphic devices within adult literary fiction. My creative writing explores a variety of genres and literary styles, but almost all of the stories are concerned with fusing verbal and visual modes of communication. The exegesis adopts the interpretive paradigm of multimodal stylistics, which aims to analyse graphic devices with the same level of detail as linguistic analysis. Within this framework, the exegesis compares and extends previous studies to develop a systematic method for analysing how the interactions between language, images and typography create meaning within multimodal literature.
Abstract:
Two longitudinal experiments were conducted over six months exploring emotional experiences with media and medial Portable Interactive Devices (PIDs). Results identifying the impact of negative social and personal interactions on the overall emotional experience, as well as different task categories (Features, Functional, Mediation and Auxiliary) and their corresponding emotional responses, have previously been reported [2,3,4,5]. This paper builds on these findings and presents the Designing for Evolving Emotional Experience (DE3) framework, which promotes positive (and deals with negative) emotional experiences with PIDs and includes a set of principles for better understanding emotional experiences. To validate the DE3 framework, a preliminary trial was conducted with five practicing industrial designers. The trial required them to develop initial design concepts using the DE3 framework, followed by a questionnaire about their use of the framework for concept development. The trial aimed to analyse the effectiveness, efficiency and usefulness of the framework in assisting the development of initial concepts for PIDs that take emotional experiences into account. Common themes regarding the framework are outlined, including its ease of use, its effectiveness in focusing on the personal and social contexts, and positive ratings regarding its use. Overall, the feedback from the preliminary trial was encouraging, with responses suggesting that the framework was accessible, rated highly and, most importantly, permitted designers to consider emotional experiences during concept development. The paper concludes with a discussion of the future development of the DE3 framework and its potential implications for design theory and the design discipline.
Abstract:
A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with practical applications in domains such as security surveillance and health care, it suffers from tremendous constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation has developed a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks and of optimal camera configuration determination.

Addressing the first problem of multi-object tracking and localisation requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are calibration of internal parameters, multi-camera calibration for localisation, and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced and required by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements. Camera internal parameters, including focal length, principal point, lens distortion parameter and the angle and axis of rotation, can be recovered from a minimum set of two images of the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for the calibration of a network of non-overlapping DWSCs in terms of their ground plane homographies, which can then be used for localising objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours that are determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path using a probability maximisation process with locally generated descriptions.

The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types, etc. of the cameras must be chosen so that the utility of the network is maximised (e.g. maximum coverage) while user requirements are met. To deal with this, a statistical formulation of the problem of determining optimal camera configurations has been introduced, and a Trans-Dimensional Simulated Annealing (TDSA) algorithm has been proposed to effectively solve the problem.
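As a minimal sketch of the ground-plane homography idea above: once a camera has estimated a 3x3 homography H from its image plane to the global ground-plane frame (e.g. from the robot positions broadcast during the mapping run), localising a detected object reduces to a single projective transform. The matrix and pixel coordinate below are hypothetical values, not results from the dissertation.

```python
import numpy as np

H = np.array([            # hypothetical image-plane -> ground-plane homography
    [0.02,   0.001,  -3.0],
    [0.0005, 0.025,  -1.5],
    [0.0,    0.0001,  1.0],
])

def to_ground_plane(h, pixel_xy):
    """Map an image-plane point (u, v) to global (x, y) via homography h."""
    u, v = pixel_xy
    x, y, w = h @ np.array([u, v, 1.0])
    return x / w, y / w   # divide out the homogeneous scale

# Hypothetical bottom-of-bounding-box pixel of a tracked object.
print(to_ground_plane(H, (320, 240)))
```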
Abstract:
Scaffolding is an essential issue in tissue engineering, and scaffolds should meet certain essential criteria: biocompatibility, high porosity, and a high degree of pore interconnectivity to facilitate cell migration and fluid diffusion. In this work, a modified solvent casting/particulate leaching out method is presented to produce scaffolds with spherical and interconnected pores. Sugar particles (200–300 µm and 300–500 µm) were poured through a horizontal Meker burner flame and collected below the flame. While crossing the high temperature zone, the particles melted and adopted a spherical shape. The spherical particles were compressed in a plastic mold. Then, poly-L-lactic acid solution was cast in the sugar assembly. After solvent evaporation, the sugar was removed by immersing the structure in distilled water for 3 days. The obtained scaffolds presented highly spherical interconnected pores, with interconnection pathways from 10 to 100 µm. Pore interconnection was obtained without any additional step. Compression tests were carried out to evaluate the scaffolds' mechanical performance. Moreover, rabbit bone marrow mesenchymal stem cells were found to adhere and proliferate in vitro in the scaffold over 21 days. This technique produced scaffolds with highly spherical and interconnected pores without the use of additional organic solvents to leach out the porogen.
Abstract:
Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on their cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.
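A small sketch of the dashboard idea described above: rather than exposing nodes and arcs, group the nodes into "causes" (which the end-user sets as evidence) and "effects" (which are queried). The two-node network, its probabilities and the grouping are hypothetical, and this Python sketch does not use the actual QT/SMILE APIs of the tool.

```python
# Toy two-node network: P(cause) and P(effect | cause).
p_cause = {"high": 0.3, "low": 0.7}
p_effect_given_cause = {
    "high": {"fail": 0.6, "ok": 0.4},
    "low":  {"fail": 0.1, "ok": 0.9},
}

CAUSES = ["cause"]      # dashboard layer: nodes the end-user can set (illustrative)
EFFECTS = ["effect"]    # dashboard layer: nodes the end-user can query (illustrative)


def query_effect(evidence=None):
    """Return the distribution of the effect node, optionally fixing the cause."""
    if evidence and "cause" in evidence:
        return dict(p_effect_given_cause[evidence["cause"]])
    dist = {"fail": 0.0, "ok": 0.0}
    for c, pc in p_cause.items():           # marginalise over the cause
        for e, pe in p_effect_given_cause[c].items():
            dist[e] += pc * pe
    return dist


print(query_effect())                      # prior: {'fail': 0.25, 'ok': 0.75}
print(query_effect({"cause": "high"}))     # after the user sets a cause node
```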
Abstract:
The use of mobile devices and social media technologies is becoming all-pervasive in society: they are both transformative and constant. The high levels of mobile device ownership and increased access to social media technologies enable the potential for ‘anytime, anywhere’ cooperation and collaboration in education. While recent reports into emerging technologies in higher education predict an increase in the use of mobile devices and social media technologies (Horizon Report, 2013), there is a lack of theory-based research to indicate how these technologies can be most effectively harnessed to support and enhance student learning, and what the impacts of these technologies are on both students and educators. In response to the need to understand how these technologies can be better embraced within higher education, this study investigated how first year education students used mobile devices and social media technologies. More specifically, the study identified how students spent most of their time when connected online with mobile devices and social media technologies, and whether this online connected time engaged them in their learning or distracted them from it.