884 results for "Fully automated"
Abstract:
Objectives This prospective study investigated the effects of caffeine ingestion on the extent of adenosine-induced perfusion abnormalities during myocardial perfusion imaging (MPI). Methods Thirty patients with inducible perfusion abnormalities on standard (caffeine-abstinent) adenosine MPI underwent repeat testing with supplementary coffee intake. Baseline and test MPIs were assessed for stress percent defect, rest percent defect, and percent defect reversibility. Plasma levels of caffeine and metabolites were assessed on both occasions and correlated with MPI findings. Results Despite significant increases in caffeine [mean difference 3,106 μg/L (95% CI 2,460 to 3,752 μg/L; P < .001)] and metabolite concentrations over a wide range, there was no statistically significant change in stress percent defect and percent defect reversibility between the baseline and test scans. The increase in caffeine concentration between the baseline and the test phases did not affect percent defect reversibility (average change −0.003 for every 100 μg/L increase; 95% CI −0.17 to 0.16; P = .97). Conclusion There was no significant relationship between the extent of adenosine-induced coronary flow heterogeneity and the serum concentration of caffeine or its principal metabolites. Hence, the stringent requirements for prolonged abstinence from caffeine before adenosine MPI—based on limited studies—appear ill-founded.
Abstract:
Most software analysis tools available for measuring fluorescence images handle two-dimensional (2D) data, relying on manual settings for the inclusion and exclusion of data points and on computer-aided pattern recognition to support interpretation of the findings. It has become increasingly important to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximations and assumptions of the original model-based stereology (1), even in complex tissue sections (2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of complex changes in cell morphology, protein localization and receptor trafficking. Current tools available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the MeasurementPro feature, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements, for measuring a line distance between two objects, or for creating a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures.
Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module was developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (a tree-like structure). The module has been ingeniously utilized to make morphological measurements of non-neuronal cells (3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be used to analyze fluorescence data that are not continuous, because it ideally builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and achieves cellular spatial information for an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but limited familiarity with computer applications, to quantify morphological changes in cell dynamics.
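The platform described above is built on Imaris XT driving MATLAB and is not reproduced here. As a purely illustrative analogue (the function name, the thresholding scheme and the use of `scipy.ndimage` are assumptions for this sketch, not the authors' implementation), shape-free 3D quantification can be reduced to labelling every connected cluster of above-threshold voxels and reporting its volume and centroid:

```python
import numpy as np
from scipy import ndimage

def quantify_3d_objects(volume, threshold):
    """Label connected fluorescent regions in a 3D intensity stack and
    return their count, voxel volumes, and centroids.

    `volume` is a (z, y, x) ndarray; `threshold` separates signal from
    background. No cell shape is assumed: any connected cluster of
    above-threshold voxels counts as an object.
    """
    mask = volume > threshold                     # binarise the stack
    labels, n_objects = ndimage.label(mask)       # 6-connected components in 3D
    index = range(1, n_objects + 1)
    voxel_counts = ndimage.sum(mask, labels, index)        # object volumes (voxels)
    centroids = ndimage.center_of_mass(mask, labels, index)
    return n_objects, list(voxel_counts), centroids
```

Because no template shape is imposed, the same call works for amorphous cells and discontinuous network fragments alike; morphometric measures beyond volume and centroid would be layered on top of the label image.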
Abstract:
Exceeding the speed limit and driving too fast for the conditions are regularly cited as significant contributing factors in traffic crashes, particularly fatal and serious injury crashes. Despite an extensive body of research highlighting the relationship between increased vehicle speeds and crash risk and severity, speeding remains a pervasive behaviour on Australian roads. The development of effective countermeasures designed to reduce the prevalence of speeding behaviour requires that this behaviour is well understood. The primary aim of this program of research was to develop a better understanding of the influence of drivers’ perceptions and attitudes toward police speed enforcement on speeding behaviour. Study 1 employed focus group discussions with 39 licensed drivers to explore the influence of perceptions relating to specific characteristics of speed enforcement policies and practices on drivers’ attitudes towards speed enforcement. Three primary factors were identified as being most influential: site selection; visibility; and automaticity (i.e., whether the enforcement approach is automated/camera-based or manually operated). Perceptions regarding these enforcement characteristics were found to influence attitudes regarding the perceived legitimacy and transparency of speed enforcement. Moreover, misperceptions regarding speed enforcement policies and practices appeared to also have a substantial impact on attitudes toward speed enforcement, typically in a negative direction. These findings have important implications for road safety given that prior research has suggested that the effectiveness of speed enforcement approaches may be reduced if efforts are perceived by drivers as being illegitimate, such that they do little to encourage voluntary compliance. Study 1 also examined the impact of speed enforcement approaches varying in the degree of visibility and automaticity on self-reported willingness to comply with speed limits. 
These discussions suggested that all of the examined speed enforcement approaches (see Section 1.5 for more details) generally showed potential to reduce vehicle speeds and encourage compliance with posted speed limits. Nonetheless, participant responses suggested a greater willingness to comply with approaches operated in a highly visible manner, irrespective of the corresponding level of automaticity of the approach. While less visible approaches were typically associated with poorer rates of driver acceptance (e.g., perceived as “sneaky” and “unfair”), participants reported that such approaches would likely encourage long-term and network-wide impacts on their own speeding behaviour, as a function of the increased unpredictability of operations and increased direct (specific deterrence) and vicarious (general deterrence) experiences with punishment. Participants in Study 1 suggested that automated approaches, particularly when operated in a highly visible manner, do little to encourage compliance with speed limits except in the immediate vicinity of the enforcement location. While speed cameras have been criticised on such grounds in the past, such approaches can still have substantial road safety benefits if implemented in high-risk settings. Moreover, site-learning effects associated with automated approaches can also be argued to be a beneficial by-product of enforcement, such that behavioural modifications are achieved even in the absence of actual enforcement. Conversely, manually operated approaches were reported to be associated with more network-wide impacts on behaviour. In addition, the reported acceptance of such methods was high, due to the increased swiftness of punishment, ability for additional illegal driving behaviours to be policed and the salutary influence associated with increased face-to-face contact with authority. Study 2 involved a quantitative survey conducted with 718 licensed Queensland drivers from metropolitan and regional areas. 
The survey sought to further examine the influence of the visibility and automaticity of operations on self-reported likelihood and duration of compliance. Overall, the results from Study 2 corroborated those of Study 1. All examined approaches were again found to encourage compliance with speed limits, such that all approaches could be considered "effective". Nonetheless, significantly greater self-reported likelihood and duration of compliance was associated with visibly operated approaches, irrespective of the corresponding automaticity of the approach. In addition, the impact of automaticity was influenced by visibility, such that significantly greater self-reported likelihood of compliance was associated with manually operated approaches, but only when they were operated in a less visible fashion. Conversely, manually operated approaches were associated with significantly greater durations of self-reported compliance, but only when they were operated in a highly visible manner. Taken together, the findings from Studies 1 and 2 suggest that enforcement efforts, irrespective of their visibility or automaticity, generally encourage compliance with speed limits. However, the duration of these effects on behaviour upon removal of the enforcement efforts remains questionable and represents an area where current speed enforcement practices could be improved. Overall, it appears that identifying the optimal mix of enforcement operations, implementing them at a sufficient intensity and increasing the unpredictability of enforcement efforts (e.g., greater use of less visible approaches, random scheduling) are critical elements of success. Hierarchical multiple regression analyses were also performed in Study 2 to investigate the punishment-related and attitudinal constructs that influence self-reported frequency of speeding behaviour.
The research was based on the theoretical framework of expanded deterrence theory, augmented with three particular attitudinal constructs. Specifically, previous research examining the influence of attitudes on speeding behaviour has typically focussed on attitudes toward speeding behaviour in general only. This research sought to more comprehensively explore the influence of attitudes by also individually measuring and analysing attitudes toward speed enforcement and attitudes toward the appropriateness of speed limits on speeding behaviour. Consistent with previous research, a number of classical and expanded deterrence theory variables were found to significantly predict self-reported frequency of speeding behaviour. Significantly greater speeding behaviour was typically reported by those participants who perceived punishment associated with speeding to be less certain, who reported more frequent use of punishment avoidance strategies and who reported greater direct experiences with punishment. A number of interesting differences in the significant predictors among males and females, as well as younger and older drivers, were reported. Specifically, classical deterrence theory variables appeared most influential on the speeding behaviour of males and younger drivers, while expanded deterrence theory constructs appeared more influential for females. These findings have important implications for the development and implementation of speeding countermeasures. Of the attitudinal factors, significantly greater self-reported frequency of speeding behaviour was reported among participants who held more favourable attitudes toward speeding and who perceived speed limits to be set inappropriately low. Disappointingly, attitudes toward speed enforcement were found to have little influence on reported speeding behaviour, over and above the other deterrence theory and attitudinal constructs. 
Indeed, the relationship between attitudes toward speed enforcement and self-reported speeding behaviour was completely accounted for by attitudes toward speeding. Nonetheless, the complexity of attitudes toward speed enforcement is not yet fully understood, and future research should more comprehensively explore the measurement of this construct. Finally, given the wealth of evidence (both in general and emerging from this program of research) highlighting the association between punishment avoidance and speeding behaviour, Study 2 also sought to investigate the factors that influence the self-reported propensity to use punishment avoidance strategies. A standard multiple regression analysis was conducted for exploratory purposes only. The results revealed that punishment-related and attitudinal factors significantly predicted approximately one fifth of the variance in the dependent variable. The perceived ability to avoid punishment, vicarious punishment experience, vicarious punishment avoidance and attitudes toward speeding were all significant predictors. Future research should examine these relationships more thoroughly and identify additional influential factors. In summary, the current program of research has a number of implications for road safety and speed enforcement policy and practice decision-making. The research highlights a number of potential avenues for the improvement of public education regarding enforcement efforts and provides a number of insights into punishment avoidance behaviours. In addition, the research adds strength to the argument that enforcement approaches should not only demonstrate effectiveness in achieving key road safety objectives, such as reduced vehicle speeds and associated crashes, but also strive to be transparent and legitimate, such that voluntary compliance is encouraged. A number of potential strategies are discussed (e.g., point-to-point speed cameras, intelligent speed adaptation).
The correct mix and intensity of enforcement approaches appears critical for achieving optimum effectiveness from enforcement efforts, as well as enhancements in the unpredictability of operations and swiftness of punishment. Achievement of these goals should increase both the general and specific deterrent effects associated with enforcement through an increased perceived risk of detection and a more balanced exposure to punishment and punishment avoidance experiences.
Abstract:
Proving security of cryptographic schemes, which normally are short algorithms, is known to be time-consuming and easy to get wrong. Using computers to analyse their security can help to solve the problem. This thesis focuses on methods of using computers to verify the security of such schemes in cryptographic models. The contributions of this thesis to automated security proofs of cryptographic schemes can be divided into two groups: indirect and direct techniques. Regarding indirect techniques, we propose a technique to verify the security of public-key-based key exchange protocols. Security of such protocols could already be proved automatically using an existing tool, but only in a non-cryptographic model. We show that under some conditions, security in that non-cryptographic model implies security in a common cryptographic one, the Bellare-Rogaway model [11]. The implication enables one to use that existing tool, which was designed to work with a different type of model, to achieve security proofs of public-key-based key exchange protocols in a cryptographic model. For direct techniques, we have two contributions. The first is a tool to verify Diffie-Hellman-based key exchange protocols. In that work, we design a simple programming language for specifying Diffie-Hellman-based key exchange algorithms. The language has a semantics based on a cryptographic model, the Bellare-Rogaway model [11]. From the semantics, we build a Hoare-style logic which allows us to reason about the security of a key exchange algorithm, specified as a pair of initiator and responder programs. The other contribution in the direct line is on automated proofs of computational indistinguishability. Unlike the two other contributions, this one does not treat a fixed class of protocols. We construct a generic formalism which allows one to model the security problem of a variety of classes of cryptographic schemes as the indistinguishability between two pieces of information.
We also design and implement an algorithm for solving indistinguishability problems. Compared to the two other works, this one covers significantly more types of schemes, but consequently, it can verify only weaker forms of security.
Abstract:
Background: Effective self-management of diabetes is essential for the reduction of diabetes-related complications, as global rates of diabetes escalate. Methods: Randomised controlled trial. Adults with type 2 diabetes (n = 120) and HbA1c greater than or equal to 7.5 % were randomly allocated (4 × 4 randomised block design) to receive an automated, interactive telephone-delivered management intervention or usual routine care. Baseline sociodemographic, behavioural and medical history data were collected by self-administered questionnaires, and biological data were obtained during hospital appointments. Health-related quality of life (HRQL) was measured using the SF-36. Results: The mean age of participants was 57.4 years (SD 8.3), and 63 % were male. There were no differences in demographic, socioeconomic and behavioural variables between the study arms at baseline. Over the six-month period from baseline, participants receiving the Australian TLC (Telephone-Linked Care) Diabetes program showed a 0.8 % decrease in geometric mean HbA1c from 8.7 % to 7.9 %, compared with a 0.2 % HbA1c reduction (8.9 % to 8.7 %) in the usual care arm (p = 0.002). There was also a significant improvement in mental HRQL, with a mean increase of 1.9 in the intervention arm, while the usual care arm decreased by 0.8 (p = 0.007). No significant improvements in physical HRQL were observed. Conclusions: These analyses indicate the efficacy of the Australian TLC Diabetes program, with clinically significant post-intervention improvements in both glycaemic control and mental HRQL. These observed improvements, if supported and maintained by an ongoing program such as this, could significantly reduce diabetes-related complications in the longer term. Given the accessibility and feasibility of this kind of program, it has strong potential for providing effective, ongoing support to many individuals with diabetes in the future.
Abstract:
In Australia, speeding remains a substantial contributor to road trauma. The National Road Safety Strategy (2011-2020) highlighted the need to harness community support for current and future speed management strategies. Australia is known for intensive speed camera programs which are both automated and manual, employing covert and overt methods. Recent developments in the area of automated speed enforcement in Australia help to illustrate the important link between community attitudes to speed enforcement and subsequent speed camera policy developments. A perceived lack of community confidence in camera programs prompted reviews in New South Wales and Victoria in 2011 by the jurisdictional Auditor-General. This paper explores automated speed camera enforcement in Australia with particular reference to the findings of these two reports as they relate to the level of public support for and community attitudes towards automated speed enforcement. It also provides comment on the evolving nature of automated speed enforcement according to previously identified controversies and dilemmas associated with speed camera programs.
Abstract:
A comprehensive one-dimensional meanline design approach for radial inflow turbines is described in the present work. An original code was developed in Python that takes a novel approach to the automatic selection of feasible machines based on pre-defined performance or geometry characteristics for a given application. It comprises a brute-force search algorithm that traverses the entire search space based on key non-dimensional parameters and rotational speed. In this study, an in-depth analysis and subsequent implementation of relevant loss models, as well as selection criteria for radial inflow turbines, is addressed. Sample (real and theoretical) test cases were trialled, and the results showed good agreement with previously published designs as well as with other available codes. The presented approach was found to be valid, and the model proved a useful tool for the preliminary design and performance estimation of radial inflow turbines, enabling its integration with other thermodynamic cycle analysis and three-dimensional blade design codes.
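As a rough illustration of such a brute-force traversal (not the authors' code: the velocity-ratio sweep, the specific-speed definition ns = ω√Q/Δh₀^0.75 and the 0.3-0.9 feasibility band are assumptions chosen for this sketch), a minimal Python fragment might look like:

```python
import math
from itertools import product

def sweep_designs(vol_flow, delta_h0, nu_values, rpm_values,
                  ns_bounds=(0.3, 0.9)):
    """Brute-force sweep over velocity ratio (nu) and shaft speed (rpm);
    keep candidates whose specific speed ns falls inside the assumed
    radial-inflow feasibility band."""
    candidates = []
    for nu, rpm in product(nu_values, rpm_values):
        omega = rpm * 2.0 * math.pi / 60.0        # shaft speed, rad/s
        c0 = math.sqrt(2.0 * delta_h0)            # spouting velocity, m/s
        u4 = nu * c0                              # rotor inlet tip speed, m/s
        ns = omega * math.sqrt(vol_flow) / delta_h0 ** 0.75   # specific speed
        if ns_bounds[0] <= ns <= ns_bounds[1]:
            candidates.append({"nu": nu, "rpm": rpm, "ns": round(ns, 3),
                               "d4_m": 2.0 * u4 / omega})     # rotor inlet diameter
    return candidates
```

A real meanline code would additionally evaluate loss models and efficiency at each candidate point; the exhaustive grid structure shown here is what makes the traversal of the entire search space tractable to reason about.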
Abstract:
This paper describes in detail our Security-Critical Program Analyser (SCPA). SCPA is used to assess the security of a given program based on its design or source code with regard to data flow-based metrics. Furthermore, it allows software developers to generate a UML-like class diagram of their program and annotate its confidential classes, methods and attributes. SCPA is also capable of producing Java source code for the generated design of a given program. This source code can then be compiled and the resulting Java bytecode program can be used by the tool to assess the program's overall security based on our security metrics.
Abstract:
Due to the increased complexity, scale, and functionality of information and telecommunication (IT) infrastructures, new exploits and vulnerabilities are discovered every day. These vulnerabilities are most of the time used by malicious people to penetrate IT infrastructures, mainly to disrupt business or steal intellectual property. Current incidents prove that it is no longer sufficient to perform manual security tests of the IT infrastructure based on sporadic security audits. Instead, networks should be continuously tested against possible attacks. In this paper we present current results and challenges towards realizing automated and scalable solutions to identify possible attack scenarios in an IT infrastructure. Namely, we define an extensible framework which uses public vulnerability databases to identify probable multi-step attacks in an IT infrastructure, and provide recommendations in the form of patching strategies, topology changes, and configuration updates.
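A minimal sketch of the multi-step attack identification idea (the host/vulnerability data model here is invented for illustration; the real framework consumes public vulnerability databases and a richer topology description) is a breadth-first enumeration of host chains in which every hop lands on a host exposing at least one known vulnerability:

```python
from collections import deque

def attack_paths(reachability, vulns, start, target):
    """Breadth-first enumeration of simple attack paths: from `start`,
    an attacker may hop to any reachable host that exposes at least one
    known vulnerability, until `target` is compromised.

    `reachability` maps host -> reachable hosts; `vulns` maps host -> list
    of known vulnerability identifiers on that host."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        host = path[-1]
        if host == target:
            paths.append(path)                       # a complete multi-step attack
            continue
        for nxt in reachability.get(host, ()):
            if nxt not in path and vulns.get(nxt):   # exploitable hop only
                queue.append(path + [nxt])
    return paths
```

Each returned path is one candidate attack scenario; recommendations (patching a host, changing the topology) then amount to breaking every enumerated path at some hop.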
Abstract:
QUT’s new metadata repository (data registry), Research Data Finder, has been designed to promote the visibility and discoverability of QUT research datasets. Funded by the Australian National Data Service (ANDS), it will provide a qualitative snapshot of research data outputs created or collected by members of the QUT research community that are available via open or mediated access. As a fully integrated metadata repository Research Data Finder aligns with institutional sources of truth, such as QUT’s research administrative system, ResearchMaster, as well as QUT’s Academic Profiles system to provide high quality data descriptions that increase awareness of, and access to, shareable research data. In addition, the repository and its workflows are designed to foster smoother data management practices, enhance opportunities for collaboration and research, promote cross-disciplinary research and maximize existing research datasets. The metadata schema used in Research Data Finder is the Registry Interchange Format - Collections and Services (RIF-CS), developed by ANDS in 2009. This comprehensive schema is potentially complex for researchers; unlike metadata for publications, which are often made publicly available with the official publication, metadata for datasets are not typically available and need to be created. Research Data Finder uses a hybrid self-deposit and mediated deposit system. In addition to automated ingests from ResearchMaster (research project information) and Academic Profiles system (researcher information), shareable data is identified at a number of key “trigger points” in the research cycle. These include: research grant proposals; ethics applications; Data Management Plans; Liaison Librarian data interviews; and thesis submissions. These ingested records can be supplemented with related metadata including links to related publications, such as those in QUT ePrints. 
Records deposited in Research Data Finder are harvested by ANDS and made available to a national and international audience via Research Data Australia, ANDS’ discovery service for Australian research data. Researcher and research group metadata records are also harvested by the National Library of Australia (NLA) and these records are then published in Trove (the NLA’s digital information portal). By contributing records to the national infrastructure, QUT data will become more visible. Within Australia and internationally, many funding bodies have already mandated the open access of publications produced from publicly funded research projects, such as those supported by the Australian Research Council (ARC), or the National Health and Medical Research Council (NHMRC). QUT will be well placed to respond to the rapidly evolving climate of research data management. This project is supported by the Australian National Data Service (ANDS). ANDS is supported by the Australian Government through the National Collaborative Research Infrastructure Strategy Program and the Education Investment Fund (EIF) Super Science Initiative.
Abstract:
This paper proposes a concrete approach for the automatic mitigation of risks that are detected during process enactment. Given a process model exposed to risks, e.g. a financial process exposed to the risk of approval fraud, we enact this process and as soon as the likelihood of the associated risk(s) is no longer tolerable, we generate a set of possible mitigation actions to reduce the risks' likelihood, ideally annulling the risks altogether. A mitigation action is a sequence of controlled changes applied to the running process instance, taking into account a snapshot of the process resources and data, and the current status of the system in which the process is executed. These actions are proposed as recommendations to help process administrators mitigate process-related risks as soon as they arise. The approach has been implemented in the YAWL environment and its performance evaluated. The results show that it is possible to mitigate process-related risks within a few minutes.
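A highly simplified sketch of the recommendation step (the risk estimators, action records and tolerance value are invented for illustration; the actual implementation operates on running YAWL process instances) might look like:

```python
def propose_mitigations(instance, risks, tolerance=0.5):
    """For every monitored risk whose estimated likelihood exceeds the
    tolerance, collect the mitigation actions applicable to the current
    snapshot of the process instance, best (lowest residual risk) first.

    Each risk is a dict with a `name`, an `estimate` function over the
    instance snapshot, and candidate `actions`, each carrying an
    `applicable` predicate and a `residual` likelihood after mitigation."""
    recommendations = []
    for risk in risks:
        likelihood = risk["estimate"](instance)   # risk estimator on the snapshot
        if likelihood > tolerance:                # no longer tolerable
            actions = [a for a in risk["actions"] if a["applicable"](instance)]
            actions.sort(key=lambda a: a["residual"])
            recommendations.append((risk["name"], actions))
    return recommendations
```

The ordering by residual likelihood mirrors the paper's goal of ideally annulling the risk altogether: the first recommended action is the one expected to leave the least risk behind.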
Abstract:
This study assessed the workday step counts of lower active (<10,000 daily steps) university employees using an automated, web-based walking intervention (Walk@Work). METHODS: Academic and administrative staff (n = 390; 45.6 ± 10.8 years; BMI 27.2 ± 5.5 kg/m²; 290 women) at five campuses (Australia [×2], Canada, Northern Ireland and the United States) were given a pedometer, access to the website program (2010-11) and tasked with increasing workday walking by 1000 daily steps above baseline, every two weeks, over a six-week period. Step count changes at four weeks post-intervention were evaluated relative to campus and baseline walking. RESULTS: Across the sample, step counts significantly increased from baseline to post-intervention (1477 daily steps; p = 0.001). Variations in increases were evident between campuses (largest difference of 870 daily steps; p = 0.04) and for baseline activity status. Those least active at baseline (<5000 daily steps; n = 125) increased step counts the most (1837 daily steps; p = 0.001), whereas those most active (7500-9999 daily steps; n = 79) increased the least (929 daily steps; p = 0.001). CONCLUSIONS: Walk@Work increased workday walking by 25% in this sample overall. Increases occurred through an automated program, at campuses in different countries, and were most evident for those most in need of intervention.
Abstract:
Automated process discovery techniques aim at extracting models from information system logs in order to shed light into the business processes supported by these systems. Existing techniques in this space are effective when applied to relatively small or regular logs, but otherwise generate large and spaghetti-like models. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. The result is a collection of process models -- each one representing a variant of the business process -- as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically by means of subprocess extraction. The proposed technique allows users to set a desired bound for the complexity of the produced models. Experiments on real-life logs show that the technique produces collections of models that are up to 64% smaller than those extracted under the same complexity bounds by applying existing trace clustering techniques.
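The split-by-variants idea can be illustrated with a toy sketch (greedy activity-set clustering and a directly-follows "model" stand in for the real trace clustering and discovery algorithms, which the abstract does not specify):

```python
def jaccard(a, b):
    """Activity-set similarity between two traces."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_traces(log, threshold=0.5):
    """Greedy variant split: a trace joins the first cluster whose first
    (representative) trace shares enough activities, else starts a new one."""
    clusters = []
    for trace in log:
        for cluster in clusters:
            if jaccard(trace, cluster[0]) >= threshold:
                cluster.append(trace)
                break
        else:
            clusters.append([trace])
    return clusters

def directly_follows(cluster):
    """A minimal per-cluster 'model': the set of directly-follows pairs."""
    return {(t[i], t[i + 1]) for t in cluster for i in range(len(t) - 1)}
```

Discovering one small model per cluster, rather than one all-encompassing model for the whole log, is the first axis of the paper's divide-and-conquer; the second axis, hierarchical subprocess extraction, would further split each cluster's model.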
Abstract:
This article reports on the design and implementation of a Computer-Aided Die Design System (CADDS) for sheet-metal blanks. The system is designed by considering several factors, such as the complexity of blank geometry, reduction in scrap material, production requirements, availability of press equipment and standard parts, punch profile complexity, and tool elements manufacturing method. The interaction among these parameters and how they affect designers' decision patterns is described. The system is implemented by interfacing AutoCAD with the higher level languages FORTRAN 77 and AutoLISP. A database of standard die elements is created by parametric programming, which is an enhanced feature of AutoCAD. The greatest advantage achieved by the system is the rapid generation of the most efficient strip and die layouts, including information about the tool configuration.
Abstract:
Introduction: Participants may respond to phases of a workplace walking program at different rates. This study evaluated the factors that contribute to the number of steps taken through the phases of the program. The intervention was automated through a web-based program designed to increase workday walking. Methods: The study reviewed independent variable influences throughout phases I-III. A convenience sample of university workers (n = 56; 43.6 ± 1.7 years; BMI 27.44 ± 2.15 kg/m²; 48 female) was recruited at worksites in Australia. These workers were given a pedometer (Yamax SW 200) and access to the website program. For the analyses, step counts entered by workers into the website were downloaded, and mean workday steps were compared using a seemingly unrelated regression. This model was employed to capture the contemporaneous correlation within individuals across the observed time periods. Results: The model predicts that the 36 subjects with complete information took an average of 7460 steps in the baseline two-week period. After phase I, statistically significant increases in steps (from baseline) were explained by age, working status (full or part time), occupation (academic or professional), and self-reported public transport (PT) use (marginally significant). Full-time workers walked about 440 steps more than part-time workers, professionals walked about 300 steps more than academics, and PT users walked about 400 steps more than non-PT users. The ability to differentiate steps among participants after two weeks suggests a differential effect of the program after only two weeks. On average, participants increased steps from week two to week four by about 525 steps, but regular auto users took nearly 750 fewer steps than non-auto users at week four. The effect of age was diminished in the 4th week of observation and accounted for 34 steps per year of age.
In phase III, discriminating between participants became more difficult, with only age effects differentiating their increase over baseline. The marginal effect of age by phase III, compared to phase I, increased from 36 to 50 steps, suggesting a 14-step-per-year increase from the 2nd to the 6th week. Discussion: The findings suggest that participants responded to the program at different rates, with uniformity of effect achieved by the 6th week. Participants increased steps, but a tapering off occurred over time. Age played the most consistent role in predicting steps over the program. PT use was associated with increased step counts, while auto use was associated with decreased step counts.