999 results for election monitoring
Abstract:
This paper examines patterns of political activity and campaigning on Twitter in the context of the 2012 election in the Australian state of Queensland. Social media have been a visible component of political campaigning in Australia at least since the 2007 federal election, with Twitter in particular rising to greater prominence in the 2010 federal election. At state level, however, they have remained comparatively less important thus far. In this paper, we track uses of Twitter in the Queensland campaign from its unofficial start in February through to the election day of 24 March 2012. We both examine the overall patterns of activity in the hashtag #qldvotes and track specific interactions between politicians and other users by following some 80 Twitter accounts of sitting members of parliament and alternative candidates. Such analysis provides new insights into the different approaches to social media campaigning embraced by specific candidates and party organisations, as well as an indication of the relative importance of social media activities, at present, for state-level election campaigns.
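As a minimal sketch of the kind of tallying such a study involves (counting per-account activity in the hashtag, and @-reply interactions directed at tracked candidate accounts), the snippet below processes a handful of hypothetical tweets; the handles and tweet texts are invented for illustration and are not data from the study.

```python
from collections import Counter

# Hypothetical sample of collected tweets as (author, text) pairs.
tweets = [
    ("candidate_a", "Great turnout at the rally today #qldvotes"),
    ("voter_1", "@candidate_a what is your policy on transport? #qldvotes"),
    ("candidate_a", "@voter_1 details are on our website #qldvotes"),
    ("voter_2", "Counting down to 24 March #qldvotes"),
]

# Accounts of sitting MPs and candidates being followed (hypothetical).
tracked_accounts = {"candidate_a"}

# Overall activity per author within the hashtag.
activity = Counter(author for author, _ in tweets)

# @-reply interactions directed at the tracked accounts.
replies_to_tracked = Counter()
for author, text in tweets:
    for word in text.split():
        if word.startswith("@"):
            handle = word.lstrip("@").rstrip("?,.!")
            if handle in tracked_accounts:
                replies_to_tracked[handle] += 1

print(activity.most_common(1))
print(dict(replies_to_tracked))
```

A real pipeline would pull such records from the Twitter API rather than a hard-coded list, but the same two tallies (volume per account, and interactions per candidate) underpin the kind of analysis described above.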
Abstract:
When organizational scandals occur, the common refrain among commentators is: 'Where was the board in all this?', 'How could the directors not have known what was going on?' and 'Why didn't the board intervene?' The scandals demonstrate that board monitoring of organizational performance is a matter of great importance. By monitoring, we mean the act of keeping the organization under review. In many English-speaking countries, directors have a legal duty of care, which includes duties to monitor the performance of their organizations (Hopt and von Hippel 2010). However, statutory law typically merely states the duty, while providing little guidance on how that duty can be met.
Abstract:
Advances in mobile telephone technology and available dermoscopic attachments for mobile telephones have created a unique opportunity for consumer-initiated mobile teledermoscopy. At least two companies market a dermoscope attachment for an iPhone (Apple), forming a mobile teledermoscope. These devices and the corresponding software applications (apps) enable (1) lesion magnification (at least ×20) and visualization with polarized light; (2) photographic documentation using the telephone camera; (3) lesion measurement (ruler); (4) adding of image and lesion details; and (5) e-mailing of data to a teledermatologist for review. For lesion assessment, the asymmetry-color (AC) rule has 94% sensitivity and 62% specificity for melanoma identification by consumers [1]. Thus, consumers can be educated to recognize asymmetry and color patterns in suspect lesions. However, we know little about consumers' use of mobile teledermoscopy for lesion assessment.
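The sensitivity and specificity quoted above can be combined with an assumed disease prevalence, via Bayes' rule, to estimate the positive predictive value of a consumer assessment. The 5% prevalence used below is purely hypothetical, chosen only to make the arithmetic concrete.

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# AC rule figures from the abstract; the 5% prevalence is an assumption.
print(round(ppv(0.94, 0.62, 0.05), 3))
```

Even with 94% sensitivity, the modest specificity means that at low prevalence most positive consumer assessments would be false alarms, which is worth bearing in mind when interpreting such screening figures.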
Abstract:
Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise their results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions used for surveillance of heart failure patients were included. Seven (41%) systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Five (29%) focused on telemonitoring. Four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In the high quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI 0.29–0.96) to 0.88 (95% CI 0.76–1.01). High quality reviews also reported that remote monitoring reduced the relative risk of all-cause hospitalizations (RR 0.52; 95% CI 0.28–0.96 to RR 0.96; 95% CI 0.90–1.03) and heart failure-related hospitalizations (RR 0.72; 95% CI 0.64–0.81 to RR 0.79; 95% CI 0.67–0.94) and, as a consequence, healthcare costs. As the high quality reviews reported that remote monitoring reduced hospitalizations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.
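Pooled relative risks of the kind quoted above are typically obtained by inverse-variance weighting of log risk ratios across studies. The sketch below shows the fixed-effect version with invented study results; it is not the reviews' data, and real meta-analyses often need a random-effects model to handle heterogeneity.

```python
import math

# Hypothetical per-study (rr, ci_low, ci_high) triples, invented for illustration.
studies = [(0.60, 0.40, 0.90), (0.85, 0.70, 1.03), (0.75, 0.55, 1.02)]

weights, log_rrs = [], []
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log RR from the 95% CI
    weights.append(1 / se ** 2)                       # inverse-variance weight
    log_rrs.append(math.log(rr))

pooled_log = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se), math.exp(pooled_log + 1.96 * pooled_se))
print(round(pooled_rr, 2), tuple(round(c, 2) for c in ci))
```

Because the pooled interval here sits entirely below 1, the (invented) pooled estimate would be read as a significant risk reduction, which is the logic behind statements like those in the abstract.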
Abstract:
Background: Procedural sedation and analgesia (PSA) administered by nurses in the cardiac catheterisation laboratory (CCL) is unlikely to yield serious complications. However, the safety of this practice depends on timely identification and treatment of depressed respiratory function. Aim: To describe respiratory monitoring in the CCL. Methods: Retrospective medical record audit of adult patients who underwent a procedure in the CCLs of one private hospital in Brisbane during May and June 2010. An electronic database was used to identify subjects and an audit tool ensured data collection was standardised. Results: Nurses administered PSA during 172/473 (37%) procedures, including coronary angiographies, percutaneous coronary interventions, electrophysiology studies, radiofrequency ablations, cardiac pacemakers, implantable cardioverter defibrillators, temporary pacing leads and peripheral vascular interventions. Oxygen saturations were recorded during 160/172 (93%) procedures, respiration rate was recorded during 17/172 (10%) procedures, use of oxygen supplementation was recorded during 40/172 (23%) procedures, and 13/172 (7.5%; 95% CI 3.59–11.41%) patients experienced oxygen desaturation. Conclusion: Although oxygen saturation was routinely documented, nurses did not regularly record respiration observations. It is likely that surgical draping and the requirement to minimise radiation exposure interfered with nurses' ability to observe respiration. Capnography could overcome these barriers to respiration assessment: its accurate measurement of exhaled carbon dioxide, coupled with the easily interpretable waveform output it produces, which displays a breath-by-breath account of ventilation, enables identification of respiratory depression in real time. Results of this audit emphasise the need to ascertain the clinical benefits associated with using capnography to assess ventilation during PSA in the CCL.
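The quoted desaturation figure (7.5%; 95% CI 3.59–11.41%) looks like a standard normal-approximation (Wald) interval for the proportion 13/172, which can be reproduced in a few lines; small differences from the published interval come down to rounding.

```python
import math

events, n = 13, 172
p = events / n
se = math.sqrt(p * (1 - p) / n)           # standard error of the proportion
low, high = p - 1.96 * se, p + 1.96 * se  # 95% Wald interval
print(f"{p:.1%} (95% CI {low:.2%} to {high:.2%})")
```

For small event counts like this, an exact or Wilson interval is often preferred over the Wald form, but the Wald version matches the shape of the figures reported here.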
Abstract:
Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials, but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications were identified from 3333 citations. Seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies. Four (24%) focused specifically on telemonitoring. Four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure that have been published since the meta-analyses (TIM-HF and Tele-HF). Nevertheless, the high quality reviews demonstrated reductions in mortality, hospitalisations and healthcare costs, and improved quality of life.
Abstract:
The invention relates to a method for monitoring user activity on a mobile device comprising an input unit and an output unit, the method comprising the following steps, preferably in the following order: detecting and/or logging user activity on said input unit, identifying a foreground running application, hashing a user-interface-element management list of the foreground running application, and creating a screenshot comprising items displayed on said input unit. The invention also relates to a method for analyzing user activity at a server, comprising the following step: obtaining, from a mobile device, at least one of information about detected and/or logged user activity, information about a foreground running application, a hashed user-interface-element management list, and a screenshot. Further, a computer program product is provided, comprising one or more computer-readable media having computer-executable instructions for performing the steps of at least one of the aforementioned methods.
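The "hashing of a user-interface-element management list" step can be pictured as serialising the descriptors of the on-screen elements and taking a digest, so that identical screens map to the same fingerprint regardless of the order in which elements are enumerated. The element structure below is hypothetical, and the patent does not specify a hash function, so SHA-256 is only an illustrative choice.

```python
import hashlib
import json

def fingerprint_ui(elements):
    """Digest a list of UI-element descriptors into a stable screen fingerprint."""
    # Sort elements and dict keys so enumeration order does not change the hash.
    ordered = sorted(elements, key=lambda e: (e["type"], e["id"]))
    canonical = json.dumps(ordered, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical foreground-app element list.
screen = [
    {"type": "Button", "id": "send", "bounds": [0, 0, 100, 40]},
    {"type": "EditText", "id": "message", "bounds": [0, 50, 300, 90]},
]

print(fingerprint_ui(screen) == fingerprint_ui(list(reversed(screen))))
```

On the server side, comparing such fingerprints would let repeated visits to the same screen be recognised without transmitting or storing the full element list.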
Abstract:
Increases in the functionality, power and intelligence of modern engineered systems have led to complex systems with large numbers of interconnected dynamic subsystems. In such machines, faults in one subsystem can cascade and affect the behavior of numerous other subsystems. This complicates traditional fault monitoring procedures because of the need to train models of the faults that the monitoring system is meant to detect and recognize. Unavoidable design defects, quality variations and different usage patterns make it infeasible to foresee all possible faults, resulting in limited diagnostic coverage that can only deal with previously anticipated and modeled failures. This leads to missed detections and costly blind swapping of acceptable components because of one's inability to accurately isolate the source of previously unseen anomalies. To circumvent these difficulties, a new paradigm for diagnostic systems is proposed and discussed in this paper. Its feasibility is demonstrated through application examples in automotive engine diagnostics.
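One way to picture the shift away from pre-trained fault models is plain novelty detection: characterise healthy operation only, and flag anything that departs from it, so that previously unseen faults can still be caught. The baseline data, single feature and 3-sigma threshold below are illustrative simplifications, not the paper's actual method.

```python
import statistics

# Hypothetical sensor readings collected during known-healthy operation.
healthy = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7, 10.1, 10.0]
mu = statistics.mean(healthy)
sigma = statistics.stdev(healthy)

def is_anomalous(reading, k=3.0):
    """Flag readings far from the healthy baseline; no fault models required."""
    return abs(reading - mu) > k * sigma

print(is_anomalous(10.1), is_anomalous(13.5))
```

The attraction of this framing is exactly the one the abstract identifies: nothing about specific failure modes has to be anticipated or modelled in advance.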
Abstract:
Operational modal analysis (OMA) is prevalent in the modal identification of civil structures. It requires response measurements of the underlying structure under ambient loads. A valid OMA method requires that the excitation be white noise in time and space. Although there are numerous applications of OMA in the literature, few have investigated the statistical distribution of a measurement and the influence of such randomness on modal identification. This research applies a modified kurtosis index to evaluate the statistical distribution of raw measurement data, and proposes a windowing strategy employing this index to select quality datasets. To demonstrate how the data selection strategy works, ambient vibration measurements of a laboratory bridge model and of a real cable-stayed bridge have been considered. The analysis incorporated frequency domain decomposition (FDD) as the target OMA approach for modal identification, and the modal identification results obtained using data segments with different degrees of randomness have been compared. The discrepancy in the resulting FDD spectra indicates that, in order to fulfil the assumptions of an OMA method, special care should be taken in processing long vibration measurement records. The proposed data selection strategy is easy to apply and has been verified as effective for modal analysis.
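The windowing idea can be sketched as scoring each segment of a long record by how far its sample distribution departs from Gaussian, then keeping only near-Gaussian segments. The plain excess-kurtosis score and the fixed threshold below are simplifications of the modified index used in the research.

```python
import random

def excess_kurtosis(x):
    """Sample excess kurtosis; approximately 0 for Gaussian data."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n
    m4 = sum((v - mu) ** 4 for v in x) / n
    return m4 / m2 ** 2 - 3.0

def select_windows(signal, width, threshold=1.0):
    """Keep non-overlapping windows whose distribution is close to Gaussian."""
    windows = [signal[i:i + width] for i in range(0, len(signal) - width + 1, width)]
    return [w for w in windows if abs(excess_kurtosis(w)) < threshold]

random.seed(1)
ambient = [random.gauss(0, 1) for _ in range(2000)]  # broadband ambient response
# Second half contaminated by intermittent bursts (mixed scales -> heavy tails).
spiky = ambient[:1000] + [v * random.choice([0.1, 5.0]) for v in ambient[1000:]]

print(len(select_windows(ambient, 500)), len(select_windows(spiky, 500)))
```

Segments rejected by such a screen would be excluded before running FDD, in the spirit of the data selection strategy described above.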
Abstract:
This thesis explored the development of statistical methods to support the monitoring and improvement of the quality of treatment delivered to patients undergoing coronary angioplasty procedures. To achieve this goal, a suite of outcome measures was identified to characterise performance of the service, statistical tools were developed to monitor the various indicators, and measures to strengthen governance processes were implemented and validated. Although this work focused on the pursuit of these aims in the context of an angioplasty service located at a single clinical site, development of the tools and techniques was undertaken mindful of their potential application to other clinical specialties and a wider, potentially national, scope.
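A standard statistical tool for monitoring binary procedural outcomes of this kind is the Bernoulli CUSUM chart, which accumulates log-likelihood-ratio evidence that the adverse-event rate has drifted from an acceptable to an unacceptable level. The abstract does not name the specific tools developed, so this chart, and all of the rates and the threshold below, are illustrative assumptions.

```python
import math

def bernoulli_cusum(outcomes, p0=0.02, p1=0.05, h=3.5):
    """Signal when the adverse-event rate appears to drift from p0 to p1.

    Returns the index of the procedure at which the chart signals, or None.
    """
    w_event = math.log(p1 / p0)             # weight added for an adverse event
    w_none = math.log((1 - p1) / (1 - p0))  # (negative) weight for a clean case
    s = 0.0
    for i, event in enumerate(outcomes):
        s = max(0.0, s + (w_event if event else w_none))
        if s > h:
            return i
    return None

# Illustrative outcome sequence: an in-control run, then a cluster of events.
outcomes = [0] * 60 + [1, 0, 1, 1, 0, 1]
print(bernoulli_cusum(outcomes))
```

In clinical practice such charts are usually risk-adjusted for patient case mix, which is the kind of refinement the governance work described above would require.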
Abstract:
This thesis represents a major step forward in understanding the link between the development of combustion-related faults in diesel engines and the generation of acoustic emissions. The findings presented throughout the thesis provide a foundation for future diesel engine monitoring systems to detect and monitor developing faults more effectively. In undertaking this research, knowledge concerning engine function and relevant failure mechanisms was combined with different modelling methods to generate a framework that was used to identify fault-related activity within acoustic emissions recorded from different engines.
Abstract:
Airborne particles have been shown to be associated with a wide range of adverse health effects, which has led to a recent increase in medical research aimed at gaining better insight into these effects. However, accurate evaluation of the exposure-dose-response relationship is highly dependent on the ability to track the actual exposure levels of people to airborne particles. This is quite a complex task, particularly in relation to submicrometer and ultrafine particles, whose surface area and number concentrations can vary quite significantly. Therefore, suitable wearable monitors for measuring personal exposure to these particles are needed. This paper presents an evaluation of the metrological performance of six diffusion charger sensors, NanoTracer (Philips Aerasense) monitors, when measuring particle number and surface area concentrations, as well as the mean of the particle number distribution, compared with reference instruments. Tests in the laboratory (generating monodisperse and polydisperse aerosols) and in the field (using natural ambient particles) were designed to evaluate the response of these devices under both steady-state and dynamic conditions. Results showed that the NanoTracers performed well when measuring steady-state aerosols; however, they strongly underestimated actual concentrations during dynamic response testing. The field experiments also showed that, when the majority of the particles were smaller than 20 nm, as occurs during particle formation events in the atmosphere, the NanoTracer underestimated number concentration quite significantly. Even though the NanoTracer can be used for personal monitoring of exposure to ultrafine particles, it has limitations which need to be considered in order to obtain meaningful results.
Abstract:
Robotic systems are increasingly being utilised as fundamental data-gathering tools by scientists, allowing new perspectives and a greater understanding of the planet and its environmental processes. Today's robots are already exploring our deep oceans, tracking harmful algal blooms and pollution spread, monitoring climate variables, and even studying remote volcanoes. This article collates and discusses the significant advancements and applications of marine, terrestrial, and airborne robotic systems developed for environmental monitoring during the last two decades. Emerging research trends for achieving large-scale environmental monitoring are also reviewed, including cooperative robotic teams, robot and wireless sensor network (WSN) interaction, adaptive sampling and model-aided path planning. These trends offer efficient and precise measurement of environmental processes at unprecedented scales that will push the frontiers of robotic and natural sciences.
Abstract:
The Lake Wivenhoe Integrated Wireless Sensor Network is conceptually similar to traditional SCADA monitoring and control approaches. However, it is applied in an open system, using wireless devices to monitor processes that affect water quality at high spatial and temporal frequency. This monitoring helps scientists better understand the drivers of key processes that influence water quality, and provides operators with an early warning system should below-standard water enter the reservoir. Both of these aspects improve the safety and efficiency of drinking water delivery to end users.
Abstract:
Voltammetric techniques have been introduced to monitor the formation of gold nanoparticles produced via the reaction of the amino acid glycyl-L-tyrosine with Au(III) (bromoaurate) in 0.05 M KOH conditions. The alkaline conditions facilitate amino acid binding to Au(III), inhibit the rate of reduction to Au(0), and provide an excellent supporting electrolyte for voltammetric studies. Data obtained revealed that a range of time-dependent gold solution species are involved in gold nanoparticle formation and that the order in which reagents are mixed is critical to the outcome. Concomitantly with voltammetric measurements, the properties of gold nanoparticles formed are probed by examination of electronic spectra in order to understand how the solution environment present during nanoparticle growth affects the final distribution of the nanoparticles. Images obtained by the ex situ transmission electron microscopy (TEM) technique enable the physical properties of the nanoparticles isolated in the solid state to be assessed. Use of this combination of in situ and ex situ techniques provides a versatile framework for elucidating the details of nanoparticle formation.