797 results for Data-communication
Abstract:
A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the possibility of the units performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A 'best-practice' design methodology is adopted within the project to ensure the final system operates according to its requirements.
Initially, requirements are generated from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond.
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
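The synchronisation approach described above is built on the IEEE-1588 two-way timestamp exchange. A minimal sketch of the offset and delay arithmetic, assuming a symmetric path delay; the timestamp names t1..t4 follow PTP convention and this function is an illustration, not code from the thesis:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Estimate slave clock offset and one-way path delay from a
    PTP-style two-way timestamp exchange.

    t1: master sends Sync          (master clock)
    t2: slave receives Sync        (slave clock)
    t3: slave sends Delay_Req      (slave clock)
    t4: master receives Delay_Req  (master clock)

    Assumes the path delay is the same in both directions.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way propagation delay
    return offset, delay
```

With hardware-assisted timestamping, t1..t4 are captured close to the physical layer rather than in software, which is what makes sub-microsecond offset estimates achievable over non-deterministic wireless links.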
Abstract:
Our paper approaches Twitter through the lens of "platform politics" (Gillespie, 2010), focusing in particular on controversies around user data access, ownership, and control. We characterise different actors in the Twitter data ecosystem: private and institutional end users of Twitter, commercial data resellers such as Gnip and DataSift, data scientists, and finally Twitter, Inc. itself; and describe their conflicting interests. We furthermore study Twitter's Terms of Service and application programming interface (API) as material instantiations of regulatory instruments used by the platform provider, and argue for stronger promotion of data rights and literacy to strengthen the position of end users.
Abstract:
The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Vehicles are able to communicate the local traffic state in real time, which could result in an automatic and therefore better reaction to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within radio range and hence the traffic stability. The impact of a cooperative law on traffic congestion appearance is investigated, analytically and through simulation. NGSIM field data is used to calibrate the Optimal Velocity with Relative Velocity (OVRV) car-following model, and the MOBIL lane-changing model is implemented. Assuming that congestion can be triggered either by a perturbation in the instability domain or by a critical lane-changing behaviour, the calibrated car-following behaviour is used to assess the impact of a microscopic cooperative law on abnormal lane-changing behaviour. The cooperative law helps reduce and delay traffic congestion as it increases traffic flow stability.
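The OVRV car-following law named above combines relaxation toward an optimal velocity with a relative-velocity term. A minimal sketch, assuming an illustrative tanh-shaped optimal velocity function and made-up gain values; the calibrated NGSIM parameters are not reproduced here:

```python
import math

def optimal_velocity(gap, v_max=30.0, s0=2.0, scale=10.0):
    # Illustrative tanh-shaped optimal velocity function: velocity (m/s)
    # rises smoothly with the gap (m) and saturates near v_max.
    return v_max * (math.tanh((gap - s0) / scale) + math.tanh(s0 / scale)) \
           / (1.0 + math.tanh(s0 / scale))

def ovrv_acceleration(gap, v, v_lead, k1=0.6, k2=0.5):
    # OVRV form: relax toward the optimal velocity for the current gap,
    # plus a term proportional to the relative velocity to the leader.
    return k1 * (optimal_velocity(gap) - v) + k2 * (v_lead - v)
```

At equilibrium (driving at the optimal velocity with no closing speed) the acceleration vanishes, which is the fixed point around which the stability analysis in papers of this kind is usually carried out.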
Abstract:
QUT’s new metadata repository (data registry), Research Data Finder, has been designed to promote the visibility and discoverability of QUT research datasets. Funded by the Australian National Data Service (ANDS), it will provide a qualitative snapshot of research data outputs created or collected by members of the QUT research community that are available via open or mediated access. As a fully integrated metadata repository, Research Data Finder aligns with institutional sources of truth, such as QUT’s research administrative system, ResearchMaster, as well as QUT’s Academic Profiles system, to provide high-quality data descriptions that increase awareness of, and access to, shareable research data. In addition, the repository and its workflows are designed to foster smoother data management practices, enhance opportunities for collaboration and research, promote cross-disciplinary research and maximise the reuse of existing research datasets. The metadata schema used in Research Data Finder is the Registry Interchange Format - Collections and Services (RIF-CS), developed by ANDS in 2009. This comprehensive schema is potentially complex for researchers; unlike metadata for publications, which are often made publicly available with the official publication, metadata for datasets are not typically available and need to be created. Research Data Finder uses a hybrid self-deposit and mediated deposit system. In addition to automated ingests from ResearchMaster (research project information) and the Academic Profiles system (researcher information), shareable data is identified at a number of key “trigger points” in the research cycle. These include: research grant proposals; ethics applications; Data Management Plans; Liaison Librarian data interviews; and thesis submissions. These ingested records can be supplemented with related metadata, including links to related publications, such as those in QUT ePrints.
Records deposited in Research Data Finder are harvested by ANDS and made available to a national and international audience via Research Data Australia, ANDS’ discovery service for Australian research data. Researcher and research group metadata records are also harvested by the National Library of Australia (NLA) and these records are then published in Trove (the NLA’s digital information portal). By contributing records to the national infrastructure, QUT data will become more visible. Within Australia and internationally, many funding bodies have already mandated the open access of publications produced from publicly funded research projects, such as those supported by the Australian Research Council (ARC), or the National Health and Medical Research Council (NHMRC). QUT will be well placed to respond to the rapidly evolving climate of research data management. This project is supported by the Australian National Data Service (ANDS). ANDS is supported by the Australian Government through the National Collaborative Research Infrastructure Strategy Program and the Education Investment Fund (EIF) Super Science Initiative.
Abstract:
Kuwait is an oil-rich country planning for a future that is not dependent on exploiting natural resources. A major policy initiative has been the introduction of Information and Communication Technology (ICT) to schools. However, contextual issues and teacher capabilities in the use of ICT have limited the success of this initiative. The study examines the leadership strategies of two secondary school principals whose schools have achieved this goal. The case study draws on intensive data collected through interviews of the principals and teachers, supported by document analysis and observations. Analysis was guided by theoretical perspectives drawn from the literature, which identified a range of strategies used by the principals to manage change. The principals of Schools A and B employed three key strategies to maximise the impact on teaching staff of incorporating ICT into their teaching and learning practices. These strategies were: (a) encouragement for teaching staff to implement ICT in their teaching; (b) support to meet the material and human needs of teaching staff using ICT; and (c) provision of instructions and guidance for teaching staff in how and why such behaviours and practices should be performed. The outcome of this study proposes an innovative change leadership model that informs emerging countries, which are also undergoing major change related to ICT. However, the study also revealed limitations in the implementation of ICT in the classroom and provides insights into further strategies that principals need to adopt.
Abstract:
Big data is big news in almost every sector, including crisis communication. However, not everyone has access to big data, and even with access, we often lack the tools needed to analyse and cross-reference such large data sets. This paper therefore looks at patterns in small data sets that we are able to collect with our current tools, to understand whether we can find actionable information in what we already have. We analysed 164,390 tweets collected during the 2011 earthquake to find out what type of location-specific information people mention in their tweets and when they mention it. Based on our analysis, we find that even a small data set, with far less data than a big data set, can be useful for quickly identifying priority disaster-specific areas.
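Extracting location mentions over time from a small tweet set can be sketched in a few lines. The location keywords, timestamps, and tweet texts below are hypothetical placeholders, not the study's data or method:

```python
from collections import Counter
from datetime import datetime

# Hypothetical gazetteer of location keywords to look for in tweet text.
LOCATIONS = {"sendai", "tokyo", "fukushima"}

def location_mentions_by_hour(tweets):
    """tweets: iterable of (iso_timestamp, text) pairs.
    Returns a Counter keyed by (hour_bucket, location) counting tweets
    that mention a known location keyword."""
    counts = Counter()
    for ts, text in tweets:
        hour = datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
        words = set(text.lower().split())
        for loc in LOCATIONS & words:  # keywords present in this tweet
            counts[(hour, loc)] += 1
    return counts
```

Grouping by hour and place like this is enough to see which areas are being talked about first and most, which is the kind of prioritisation signal the paper argues a small data set can provide.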
Abstract:
This paper describes the implementation of the first portable, embedded data acquisition unit (BabelFuse) that is able to acquire and timestamp generic sensor data and trigger General Purpose I/O (GPIO) events against a microsecond-accurate wirelessly-distributed ‘global’ clock. A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment especially if non-deterministic communication hardware (such as IEEE-802.11-based wireless) and inaccurate clock synchronisation protocols are used. The issue of differing timebases makes correlation of data difficult and prevents the units from reliably performing synchronised operations or manoeuvres. By utilising hardware-assisted timestamping, clock synchronisation protocols based on industry standards and firmware designed to minimise indeterminism, an embedded data acquisition unit capable of microsecond-level clock synchronisation is presented.
Abstract:
Due to the chronic shortages of GPs in Australian rural and remote regions, considerable numbers of international medical graduates (IMG) have been recruited. IMG experience many difficulties when relocating to Australia, one of the most significant being effective GP-patient communication. Given that this is essential for effective consultation, it can have a substantial impact on health care. A purposive sample of seven practising GPs (five IMG and two Australian-trained doctors (ATD)) was interviewed using a semi-structured face-to-face interviewing technique. GPs from Nigeria, Egypt, the United Kingdom, India, Singapore and Australia participated. Interviews were transcribed and then coded. The authors used qualitative thematic analysis of interview transcripts to identify common themes. IMG-patient communication barriers were considered significant in the Wheatbelt region, as identified by both IMG and ATD. ATD indicated they were aware of IMG-patient communication issues resulting in subsequent consults with patients to explain results and diagnoses. Significantly, a lack of communication between ATD and IMG also emerged, creating a further barrier to effective communication. Analysis of the data generated several important findings that rural GP networks should consider when integrating new IMG into the community. Addressing the challenges related to cross-cultural differences should be a priority, in order to enable effective communication. More open communication between ATD and IMG about GP-patient communication barriers, along with education programs around GP-patient communication, would help both GP and patient satisfaction.
Abstract:
This project researched the performance of emerging digital technology for high voltage electricity substations that significantly improves safety for staff and reduces the potential impact on the environment of equipment failure. The experimental evaluation used a scale model of a substation control system that incorporated real substation control and networking equipment with real-time simulation of the power system. The outcomes confirm that it is possible to implement Ethernet networks in high voltage substations that meet the needs of utilities; however component-level testing of devices is necessary to achieve this. The assessment results have been used to further develop international standards for substation communication and precision timing.
Abstract:
The contemporary working environment is being rapidly reshaped by technological, industrial and political forces. Increased global competitiveness and an emphasis on productivity have led to the appearance of alternative methods of employment, such as part-time, casual and itinerant work, allowing greater flexibility. This allows for the development of a core permanent staff and the simultaneous utilisation of casual staff according to business needs. Flexible workers across industries are generally referred to as the non-standard workforce, and full-time permanent workers as the standard workforce. Even though labour flexibility favours the employer, increased opportunity for flexible work has been embraced by women for many reasons, including the gender struggle for greater economic independence and social equality. Consequently, the largely female nursing industry, both nationally and internationally, has been caught up in this wave of change. This ageing workforce has been at the forefront of the push for flexibility, with recent figures showing almost half the nursing workforce is employed in a non-standard capacity. In part, this has allowed women to fulfil caring roles outside their work, to ease off nearing retirement and to supplement the family income. More significantly, however, flexibility has developed as an economic management initiative, as a strategy for cost constraint. The result has been the development of a dual workforce and, as suggested by Pocock, Buchanan and Campbell (2004), associated deep-seated resentment and the marginalisation of part-time and casual workers by their full-time colleagues and managers. Additionally, as nursing currently faces serious recruitment and retention problems, there is an urgent need to understand the factors underlying present discontent in the nursing profession. There is an identified gap in nursing knowledge surrounding the issues relating to recruitment and retention.
Communication involves speaking, listening, reading and writing and is an interactive process which is central to the lives of humans. Workplace communication refers to human interaction, information technology, and multimedia and print. It is the means to relationship building between workers, management, and their external environment and is critical to organisational effectiveness. Communication and language are integral to nursing performance (Hall, 2005); in a twenty-four-hour service, however, increasing fragmentation due to part-time and casual work in the nursing industry means that effective communication management has become increasingly difficult. More broadly, it is known that disruption to communication systems impacts negatively on consumer outcomes. Because of this gap in understanding how nurses view their contemporary nursing world, an interpretative ethnographic study, which progressed to a critical ethnographic study based on the conceptual frameworks of constructionism and interpretivism, was used. The study site was a division within an acute health care facility, and the relationship between increasing casualisation of the nursing workforce and the experiences of communication of standard and non-standard nurses was explored. For this study, full-time standard nurses were those employed to work in a specific unit for forty hours per week. Non-standard nurses were those employed part-time in specific units or those nurses employed to work as relief pool nurses for shift shortfalls where needed. Nurses employed by external agencies, but required to fill in for shifts at the facility, were excluded from this research. This study involved an analysis of observational, interview and focus group data of standard and non-standard nurses within this facility.
Three analytical findings, namely the organisation of nursing work, constructing the casual nurse as other, and the function of space, situate communication within a broader discussion about non-standard work and organisational culture. The study results suggest that a significant culture of marginalisation exists for nurses who work in a non-standard capacity, and that this affects communication for nurses and has implications for the quality of patient care. The discussion draws on the seven elements of marginalisation described by Hall, Stephen and Melius (1994). The arguments propose that these elements underpin a culture which supports remnants of the historically gendered stereotype "the good nurse", and that these cultural values contribute to practices and behaviour which marginalise all nurses, particularly those who work less than full-time. Gender inequality is argued to be at the heart of marginalising practices because of the long-standing subordination of nurses by the powerful medical profession, paralleling the historical subordination of women in society. This has denied nurses adequate representation and voice in decision making. The new knowledge emanating from this study extends current knowledge of factors surrounding recruitment and retention and as such contributes to an understanding of the current and complex nursing environment.
Abstract:
These proceedings focus on the various aspects of advances in future information communication technology and its applications, present the latest issues and progress in the area, and are applicable to both researchers and professionals. They are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), to be held in Shenyang, China, from June 24-26, 2013. The conference is open to participants from all over the world, and participation from the Asia-Pacific region is particularly encouraged. The focus of this conference is on all technical aspects of electronics, information, and communications. ICFICE-13 will provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference will publish high-quality papers which are closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."
Abstract:
Electricity cost has become a major expense for running data centers, and server consolidation using virtualization technology has been used as an important technique to improve the energy efficiency of data centers. In this research, a genetic algorithm and a simulated-annealing algorithm are proposed for the static virtual machine placement problem that considers the energy consumption in both the servers and the communication network, and a trading algorithm is proposed for dynamic virtual machine placement. Experimental results have shown that the proposed methods are more energy efficient than existing solutions.
Abstract:
Background: Multiple sclerosis (MS) is the most common cause of chronic neurologic disability beginning in early to middle adult life. Results from recent genome-wide association studies (GWAS) have substantially lengthened the list of disease loci and provide convincing evidence supporting a multifactorial and polygenic model of inheritance. Nevertheless, the knowledge of MS genetics remains incomplete, with many risk alleles still to be revealed. Methods: We used a discovery GWAS dataset (8,844 samples, 2,124 cases and 6,720 controls) and a multi-step logistic regression protocol to identify novel genetic associations. The emerging genetic profile included 350 independent markers and was used to calculate and estimate the cumulative genetic risk in an independent validation dataset (3,606 samples). Analysis of covariance (ANCOVA) was implemented to compare clinical characteristics of individuals with various degrees of genetic risk. Gene ontology and pathway enrichment analysis was done using the DAVID functional annotation tool, the GO Tree Machine, and the Pathway-Express profiling tool. Results: In the discovery dataset, the median cumulative genetic risk (P-Hat) was 0.903 and 0.007 in the case and control groups, respectively, together with 79.9% classification sensitivity and 95.8% specificity. The identified profile shows a significant enrichment of genes involved in the immune response, cell adhesion, cell communication/signaling, nervous system development, and neuronal signaling, including ionotropic glutamate receptors, which have been implicated in the pathological mechanism driving neurodegeneration. In the validation dataset, the median cumulative genetic risk was 0.59 and 0.32 in the case and control groups, respectively, with classification sensitivity 62.3% and specificity 75.9%. No differences in disease progression or T2-lesion volumes were observed among four levels of predicted genetic risk groups (high, medium, low, misclassified).
On the other hand, a significant difference (F = 2.75, P = 0.04) was detected for age of disease onset between the affected individuals misclassified as controls (mean = 36 years) and the other three groups (high, 33.5 years; medium, 33.4 years; low, 33.1 years). Conclusions: The results are consistent with the polygenic model of inheritance. The cumulative genetic risk established using currently available genome-wide association data provides important insights into disease heterogeneity and the completeness of current knowledge in MS genetics.
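The cumulative genetic risk (P-Hat) reported above is, in essence, a logistic function of a weighted risk-allele count. A minimal sketch with hypothetical marker names and effect sizes; the 350 actual markers and their fitted weights are not reproduced here:

```python
import math

def cumulative_risk(genotype, log_odds, intercept=0.0):
    """genotype: dict marker -> risk-allele count (0, 1 or 2).
    log_odds: dict marker -> per-allele log odds ratio, as would come
    from a discovery-stage logistic regression.
    Returns P-Hat, the predicted probability of being a case."""
    score = intercept + sum(log_odds[m] * genotype.get(m, 0)
                            for m in log_odds)
    return 1.0 / (1.0 + math.exp(-score))  # logistic link
```

Thresholding P-Hat at some cut-off yields the case/control classification whose sensitivity and specificity the abstract reports; individuals on the wrong side of the threshold form the "misclassified" group analysed for age of onset.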
Abstract:
The literature is limited in its knowledge of the Bluetooth-protocol-based data acquisition process and of the accuracy and reliability of analyses performed using the data. This paper extends the body of knowledge surrounding the use of data from the Bluetooth Media Access Control Scanner (BMS) as a complementary traffic data source. A multi-layer simulation model named Traffic and Communication Simulation (TCS) is developed. TCS is utilised to model the theoretical properties of the BMS data and analyse the accuracy and reliability of travel time estimation using the BMS data.
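The core of BMS-based travel time estimation is matching MAC addresses detected at two scanner sites and differencing the detection times. A minimal sketch, with made-up MAC keys and an assumed plausibility filter on the travel time:

```python
def travel_times(upstream, downstream, max_tt=3600.0):
    """upstream, downstream: dict of MAC address -> detection time (s)
    at the upstream and downstream scanner sites.
    Returns travel times for devices seen at both sites, discarding
    non-positive or implausibly long values (> max_tt)."""
    tts = []
    for mac, t_up in upstream.items():
        t_down = downstream.get(mac)
        if t_down is not None and 0.0 < t_down - t_up <= max_tt:
            tts.append(t_down - t_up)
    return tts
```

In practice a device may be detected several times per site and scanners have a finite inquiry zone, which is exactly the kind of noise a simulation model such as TCS is built to characterise; this sketch assumes one clean detection per site.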
Abstract:
A significant amount of speech data is required to develop a robust speaker verification system, but it is difficult to find enough development speech to match all expected conditions. In this paper we introduce a new approach to Gaussian probabilistic linear discriminant analysis (GPLDA) to estimate reliable model parameters as a linearly weighted model taking more input from the large volume of available telephone data and smaller proportional input from limited microphone data. In comparison to a traditional pooled training approach, where the GPLDA model is trained over both telephone and microphone speech, this linear-weighted GPLDA approach is shown to provide better EER and DCF performance in microphone and mixed conditions in both the NIST 2008 and NIST 2010 evaluation corpora. Based upon these results, we believe that linear-weighted GPLDA will provide a better approach than pooled GPLDA, allowing for the further improvement of GPLDA speaker verification in conditions with limited development data.
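The linear weighting described above amounts to combining two sets of parameter estimates with a mixing weight. A toy sketch over flattened parameter vectors, with an assumed weight favouring the larger telephone set; the parameter names are illustrative, not the paper's notation:

```python
def linear_weighted_params(tel_params, mic_params, alpha=0.7):
    """Combine GPLDA parameter estimates: weight alpha on estimates from
    plentiful telephone speech, (1 - alpha) on estimates from limited
    microphone speech. Parameters are represented as flat lists here;
    in a real system they would be mean vectors and covariance matrices."""
    return {name: [alpha * t + (1.0 - alpha) * m
                   for t, m in zip(tel, mic_params[name])]
            for name, tel in tel_params.items()}
```

Compared with simply pooling both corpora into one training set, an explicit weight lets the scarce microphone data pull the model toward its condition without being swamped by the telephone data, which is the intuition behind the EER and DCF gains the paper reports.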