349 results for Gallardo Clark


Relevance: 10.00%

Abstract:

Objective: To determine whether primary care management of chronic heart failure (CHF) differed between rural and urban areas in Australia. Design: A cross-sectional survey stratified by Rural, Remote and Metropolitan Areas (RRMA) classification. The primary source of data was the Cardiac Awareness Survey and Evaluation (CASE) study. Setting: Secondary analysis of data obtained from 341 Australian general practitioners and 23 845 adults aged 60 years or more in 1998. Main outcome measures: CHF determined by criteria recommended by the World Health Organization, diagnostic practices, use of pharmacotherapy, and CHF-related hospital admissions in the 12 months before the study. Results: There was a significantly higher prevalence of CHF among general practice patients in large and small rural towns (16.1%) compared with capital city and metropolitan areas (12.4%) (P < 0.001). Echocardiography was used less often for diagnosis in rural towns compared with metropolitan areas (52.0% v 67.3%, P < 0.001). Rates of specialist referral were also significantly lower in rural towns than in metropolitan areas (59.1% v 69.6%, P < 0.001), as were prescribing rates of angiotensin-converting enzyme inhibitors (51.4% v 60.1%, P < 0.001). There was no geographical variation in prescribing rates of β-blockers (12.6% [rural] v 11.8% [metropolitan], P = 0.32). Overall, few survey participants received recommended “evidence-based practice” diagnosis and management for CHF (metropolitan, 4.6%; rural, 3.9%; and remote areas, 3.7%). Conclusions: This study found a higher prevalence of CHF, and significantly lower use of recommended diagnostic methods and pharmacological treatment among patients in rural areas.
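
As a minimal illustration of how such prevalence comparisons are tested, the Python sketch below runs a two-proportion z-test; the counts are hypothetical placeholders, not the CASE study data.

    # Two-proportion z-test of the kind used to compare CHF prevalence between strata.
    # The counts below are hypothetical placeholders, not the CASE study data.
    import math

    def two_proportion_z_test(x1, n1, x2, n2):
        """Return (z, two-sided p) for H0: the two proportions are equal."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
        return z, p_value

    # Hypothetical example: 16.1% of 5000 rural patients vs 12.4% of 10000 metropolitan patients.
    z, p = two_proportion_z_test(805, 5000, 1240, 10000)
    print(f"z = {z:.2f}, p = {p:.3g}")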

Relevance: 10.00%

Abstract:

Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.
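
As a rough Python sketch of the header-based, time-windowed statistics described above; the field names, window size and threshold are assumptions for illustration, not details taken from the reviewed systems.

    # Derive a simple per-source, per-time-window feature from packet-header records:
    # the number of distinct destination/port pairs contacted, a classic scan indicator.
    from collections import defaultdict

    def window_features(packets, window=1.0):
        buckets = defaultdict(set)  # (src ip, window index) -> {(dst ip, dst port)}
        for p in packets:
            key = (p["src"], int(p["time"] // window))
            buckets[key].add((p["dst"], p["dport"]))
        return {k: len(v) for k, v in buckets.items()}

    packets = [  # toy header records, not real traffic
        {"time": 0.10, "src": "10.0.0.5", "dst": "10.0.0.9", "dport": 22},
        {"time": 0.15, "src": "10.0.0.5", "dst": "10.0.0.9", "dport": 23},
        {"time": 0.20, "src": "10.0.0.5", "dst": "10.0.0.9", "dport": 80},
    ]
    features = window_features(packets)
    print({k: n for k, n in features.items() if n > 2})  # flag scan-like sources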

Relevance: 10.00%

Abstract:

The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques which can be employed to deal with such temporal inconsistencies.
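
A minimal Python sketch of one kind of temporal inconsistency such a tool can flag (an illustration only, not CAT Detect itself): a file that appears to have been modified before it was created.

    def find_inconsistencies(events):
        """events: (timestamp, action, path) tuples drawn from logs or filesystem metadata."""
        created = {}
        for ts, action, path in events:
            if action == "created":
                created[path] = ts
        return [(path, ts, created[path])
                for ts, action, path in events
                if action == "modified" and path in created and ts < created[path]]

    events = [
        (1000, "created",  "/tmp/report.doc"),
        (900,  "modified", "/tmp/report.doc"),  # contradicts the creation time
    ]
    print(find_inconsistencies(events))  # [('/tmp/report.doc', 900, 1000)]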

Relevance: 10.00%

Abstract:

In late 2009, Health Libraries Australia (HLA) received a small grant to undertake a national research project to determine the future requirements for health librarians in the Australian workforce and to develop a structured, modular education framework (post-graduate qualification and continuing professional development structure) to meet these requirements. The main objective was to consider the education and professional development framework that would ensure that health librarians have a clearly defined scope of practice and the specific competency-based knowledge and skills that enable them to contribute to the design and delivery of high-quality health services in this country. The final report presents a detailed discussion of the changing Australian healthcare environment and the resulting impact on the health library sector, as well as an overview of international trends in health libraries and the implications for Australian health librarianship education. The research methodology is outlined, followed by an analysis of the findings from the two surveys of health librarians and health library managers and the semi-structured interviews conducted with employers. The Medical Library Association (MLA) in the United States had developed a policy document detailing the competencies required by health librarians. It was found that the MLA competencies represented an accepted professional framework of skills which could be used objectively in the survey instrument to measure the areas of professional knowledge and responsibilities that were relevant in the current workplace, and to identify how these requirements might change in the next three to five years. The research results underscore the imperative for health librarians to engage in regular, relevant professional development activities that will enable them to stay abreast of the rapid contextual changes impacting on their practice. It is strongly believed that, in order to be accepted as key members of the multi-disciplinary health professional team, health librarians should commit to establishing the mechanisms for specialist certification, maintained through compulsory CPD in an ongoing three-year cycle of revalidation. This development would align ALIA and health librarians with other health sector professional associations which are responsible for the self-regulation of entry to and continuation in their profession.

Relevance: 10.00%

Abstract:

Continuous user authentication with keystroke dynamics uses character sequences as features. Since users can type characters in any order, it is imperative to find character sequences (n-graphs) that are representative of user typing behavior. Contemporary feature selection approaches do not guarantee the selection of frequently typed features, which can lead to less accurate statistical representations of users. Furthermore, the selected features do not inherently reflect user typing behavior. We propose four statistics-based feature selection techniques that mitigate the limitations of existing approaches. The first technique selects the most frequently occurring features. The other three consider different user typing behaviors by selecting: n-graphs that are typed quickly; n-graphs that are typed with consistent time; and n-graphs that have large time variance among users. We use Gunetti’s keystroke dataset and the k-means clustering algorithm for our experiments. The results show that among the proposed techniques, the most-frequent feature selection technique can effectively find user-representative features. We further substantiate our results by comparing the most-frequent feature selection technique with three existing approaches (popular Italian words, common n-graphs, and least frequent n-graphs). We find that it performs better than the existing approaches after selecting a certain number of most-frequent n-graphs.
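
A small Python sketch of the most-frequent selection idea, using digraphs over made-up input; it ignores the three timing-based variants and is purely illustrative.

    from collections import Counter

    def most_frequent_ngraphs(typed_text, n=2, top_k=5):
        """Return the top_k most frequently occurring n-graphs in a typing sample."""
        counts = Counter(typed_text[i:i + n] for i in range(len(typed_text) - n + 1))
        return [graph for graph, _ in counts.most_common(top_k)]

    sample = "the theory of the thing"      # stand-in for a user's typing log
    print(most_frequent_ngraphs(sample))    # e.g. ['th', 'he', ' t', ...]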

Relevance: 10.00%

Abstract:

Goldin (2003) and McDonald, Yanchar, and Osguthorpe (2005) have called for mathematics learning theory that bridges the chasm between ideologies and that may advance mathematics teaching and learning practice. This paper discusses the theoretical underpinnings of a recently completed PhD study that draws upon Popper’s (1978) three-world model of knowledge as a lens through which to reconsider a variety of learning theories, including Piaget’s reflective abstraction. Based upon this consideration of theories, an alternative theoretical framework and a complementary operational model were synthesised, the viability of which was demonstrated by their use to analyse the domain of early-number counting, addition and subtraction.

Relevance: 10.00%

Abstract:

This paper reports an investigation of primary school children’s understandings of "square". Twelve students participated in a small-group teaching experiment session in which they were interviewed and guided to construct a square in a 3D virtual reality learning environment (VRLE). Main findings include mixed levels of "quasi" geometrical understandings, misconceptions about length and angles, and ambiguous uses of geometrical language for location, direction, and movement. These findings have implications for future teaching and learning about 2D shapes, with particular reference to VRLEs.

Relevance: 10.00%

Abstract:

We read the excellent review of telemonitoring in chronic heart failure (CHF)1 with interest and commend the authors on the proposed classification of telemedical remote management systems according to the type of data transfer, decision ability and level of integration. However, several points require clarification in relation to our Cochrane review of telemonitoring and structured telephone support2. We included a study by Kielblock3. We corresponded directly with this study team specifically to find out whether or not this was a randomised study, and were informed that it was a randomised trial, albeit randomised by date of birth. We note in our review2 that this randomisation method carries a high risk of bias. Post-hoc meta-analyses without these data demonstrate no substantial change to the effect estimates for all-cause mortality (original risk ratio (RR) 0·66 [95% CI 0·54, 0·81], p<0·0001; revised RR 0·72 [95% CI 0·57, 0·92], p=0·008), all-cause hospitalisation (original RR 0·91 [95% CI 0·84, 0·99], p=0·02; revised RR 0·92 [95% CI 0·84, 1·02], p=0·10) or CHF-related hospitalisation (original RR 0·79 [95% CI 0·67, 0·94], p=0·008; revised RR 0·75 [95% CI 0·60, 0·94], p=0·01). Secondly, we would classify the Tele-HF study4, 5 as structured telephone support, rather than telemonitoring. Again, inclusion of these data alters the point estimate but not the overall result of the meta-analyses4. Finally, our review2 does not include invasive telemonitoring, as the search strategy was not designed to capture these studies. Therefore, direct comparison of our review findings with recent studies of these interventions is not recommended.
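
For readers unfamiliar with the effect measure, the sketch below shows the standard calculation of a risk ratio and its 95% CI from a 2×2 table; the event counts are invented and are not data from the review.

    import math

    def risk_ratio(events_a, n_a, events_b, n_b):
        """Risk ratio and 95% CI for events_a/n_a (intervention) vs events_b/n_b (control)."""
        rr = (events_a / n_a) / (events_b / n_b)
        se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
        low, high = (math.exp(math.log(rr) + s * 1.96 * se_log) for s in (-1, 1))
        return rr, low, high

    rr, low, high = risk_ratio(120, 1000, 165, 1000)   # hypothetical counts
    print(f"RR = {rr:.2f}, 95% CI {low:.2f} to {high:.2f}")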

Relevance: 10.00%

Abstract:

Letter to the Editor of the New England Journal of Medicine on behalf of the Cochrane Systematic Review team.

Relevance: 10.00%

Abstract:

Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
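
A toy Python illustration of extracting dataflow edges from simple C-style assignments; this is not SIFA or the authors' tool, and the parsing is deliberately simplistic.

    import re

    def dataflow_edges(source):
        """Extract 'source variable -> destination variable' edges from assignments like 'dst = a + b;'."""
        edges = set()
        for dst, rhs in re.findall(r"(\w+)\s*=\s*([^;]+);", source):
            for src in re.findall(r"[A-Za-z_]\w*", rhs):
                edges.add((src, dst))
        return edges

    c_snippet = """
        key = input_port;
        buffer = key + padding;
        output_port = buffer;
    """
    for src, dst in sorted(dataflow_edges(c_snippet)):
        print(f"{src} -> {dst}")   # reachability over these edges links input_port to output_port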

Relevance: 10.00%

Abstract:

Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security-critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach, we have implemented and tested it in an existing data flow analysis toolkit.
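
A simplified Python illustration of the bit-level idea: an AND with a constant mask lets only the unmasked bits of a classified value flow onward, so expressions such as x & 0x00 or x ^ x block the data entirely and can be treated as safe downgrades. The operator models here are assumptions for illustration, not the paper's library.

    def taint_and_const(taint_bits, mask):
        """Bitwise AND with a constant: only bits kept by the mask remain tainted."""
        return taint_bits & mask

    def taint_xor_self(taint_bits):
        """x ^ x is always zero, so no classified bits survive the expression."""
        return 0

    classified = 0xFF                               # all 8 bits of the input are classified
    print(hex(taint_and_const(classified, 0x0F)))   # 0xf -> low nibble still flows
    print(hex(taint_and_const(classified, 0x00)))   # 0x0 -> expression blocks the data
    print(hex(taint_xor_self(classified)))          # 0x0 -> false-positive path eliminated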

Relevance: 10.00%

Abstract:

This article presents a novel approach to confidentiality violation detection based on taint marking. Information flows are dynamically tracked between applications and objects of the operating system such as files, processes and sockets. A confidentiality policy is defined by labelling sensitive information and defining which information may leave the local system through network exchanges. Furthermore, per-application profiles can be defined to restrict the sets of information each application may access and/or send through the network. In previous work, we focused on the use of mandatory access control mechanisms for information flow tracking. In the current work, we have extended the previous information flow model to track network exchanges, and we are able to define a policy attached to network sockets. We show an example application of this extension in the context of a compromised web browser: our implementation detects a confidentiality violation when the browser attempts to leak private information to a remote host over the network.
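
A bare-bones Python sketch of the idea (not the authors' implementation): data read from a labelled file carries its labels, and a send on a network socket is checked against the labels permitted to leave the host. The paths, labels and policy are invented for illustration.

    SENSITIVE = {"/home/user/wallet.dat": {"private"}}   # labelled sensitive information
    ALLOWED_TO_LEAVE = {"public"}                        # policy attached to network sockets

    def read_file(path):
        return {"data": f"<contents of {path}>", "labels": SENSITIVE.get(path, {"public"})}

    def socket_send(remote_host, message):
        leaked = message["labels"] - ALLOWED_TO_LEAVE
        if leaked:
            raise PermissionError(f"confidentiality violation: {leaked} -> {remote_host}")

    try:
        socket_send("remote.example.com", read_file("/home/user/wallet.dat"))
    except PermissionError as err:
        print(err)   # private data was about to leave the local system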

Relevance: 10.00%

Abstract:

This paper reports a 2-year longitudinal study of the effectiveness of the Pattern and Structure Mathematical Awareness Program (PASMAP) on students’ mathematical development. The study involved 316 Kindergarten students in 17 classes from four schools in Sydney and Brisbane. The development of the PASA assessment interview and scale is presented. The intervention program provided explicit instruction in mathematical pattern and structure that enhanced the development of students’ spatial structuring, multiplicative reasoning, and emergent generalisations. This paper presents initial findings on the impact of PASMAP and illustrates students’ structural development.

Relevance: 10.00%

Abstract:

The Pattern and Structure Mathematical Awareness Program (PASMAP) stems from a 2-year longitudinal study of students’ early mathematical development. The paper outlines the interview-based assessment, the Pattern and Structure Assessment (PASA), designed to describe students’ awareness of mathematical pattern and structure across a range of concepts. An overview of students’ performance across items is provided, along with descriptions of their structural development.

Relevance: 10.00%

Abstract:

Objective: To determine whether remote monitoring (structured telephone support or telemonitoring) without regular clinic or home visits improves outcomes for patients with chronic heart failure. Data sources: 15 electronic databases, hand searches of previous studies, and contact with authors and experts. Data extraction: Two investigators independently screened the results. Review methods: Published randomised controlled trials comparing remote monitoring programmes with usual care in patients with chronic heart failure managed within the community. Results: 14 randomised controlled trials (4264 patients) of remote monitoring met the inclusion criteria: four evaluated telemonitoring, nine evaluated structured telephone support, and one evaluated both. Remote monitoring programmes reduced the rates of admission to hospital for chronic heart failure by 21% (95% confidence interval 11% to 31%) and all-cause mortality by 20% (8% to 31%). Of the six trials evaluating health-related quality of life, three reported significant benefits with remote monitoring; of the four studies examining healthcare costs with structured telephone support, three reported reduced costs and one reported no effect. Conclusion: Programmes for chronic heart failure that include remote monitoring have a positive effect on clinical outcomes in community-dwelling patients with chronic heart failure.
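
The arithmetic behind the reported effect sizes is straightforward if the reductions are expressed as one minus the risk ratio: a 21% relative reduction corresponds to a pooled risk ratio of 0.79. A minimal sketch, assuming that convention:

    def relative_reduction(rr, ci_low, ci_high):
        """Convert a risk ratio and its CI into a percentage relative risk reduction."""
        return tuple(round((1 - v) * 100, 1) for v in (rr, ci_high, ci_low))

    print(relative_reduction(0.79, 0.69, 0.89))   # (21.0, 11.0, 31.0) -> 21% reduction, 95% CI 11% to 31%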