923 results for Management of Computing and Information Systems


Relevance:

100.00%

Publisher:

Abstract:

Tedd, L. (2006). Program: a record of the first 40 years of electronic library and information systems. Program: electronic library and information systems, 40(1), 11-26.

Abstract:

BACKGROUND: Several trials have demonstrated the efficacy of nurse telephone case management for diabetes (DM) and hypertension (HTN) in academic or vertically integrated systems. Little is known about the real-world potency of these interventions. OBJECTIVE: To assess the effectiveness of nurse behavioral management of DM and HTN in community practices among patients with both diseases. DESIGN: The study was designed as a patient-level randomized controlled trial. PARTICIPANTS: Participants included adult patients with both type 2 DM and HTN who were receiving care at one of nine community fee-for-service practices. Subjects were required to have inadequately controlled DM (hemoglobin A1c [A1c] ≥ 7.5%) but could have well-controlled HTN. INTERVENTIONS: All patients received a call from a nurse experienced in DM and HTN management once every two months over a period of two years, for a total of 12 calls. Intervention patients received tailored DM- and HTN-focused behavioral content; control patients received non-tailored, non-interactive information regarding health issues unrelated to DM and HTN (e.g., skin cancer prevention). MAIN OUTCOMES AND MEASURES: Systolic blood pressure (SBP) and A1c were co-primary outcomes, measured at 6, 12, and 24 months; 24 months was the primary time point. RESULTS: Three hundred seventy-seven subjects were enrolled; 193 were randomized to intervention and 184 to control. Subjects were 55% female and 50% white; the mean baseline A1c was 9.1% (SD = 1%) and mean SBP was 142 mmHg (SD = 20 mmHg). Eighty-two percent of scheduled interviews were conducted; 69% of intervention patients and 70% of control patients reached the 24-month time point. Expressing model-estimated differences as (intervention minus control), at 24 months intervention patients had similar A1c [diff = 0.1%, 95% CI (-0.3, 0.5), p = 0.51] and SBP [diff = -0.9 mmHg, 95% CI (-5.4, 3.5), p = 0.68] values compared to control patients.
Likewise, DBP (diff = 0.4 mmHg, p = 0.76), weight (diff = 0.3 kg, p = 0.80), and physical activity levels (diff = 153 MET-min/week, p = 0.41) were similar between control and intervention patients. Results were also similar at the 6- and 12-month time points. CONCLUSIONS: In nine community fee-for-service practices, telephonic nurse case management did not lead to improvement in A1c or SBP. Gains seen in telephonic behavioral self-management interventions in optimal settings may not translate to the wider range of primary care settings.

Abstract:

Embedded electronic systems in vehicles are of rapidly increasing commercial importance for the automotive industry. While current vehicular embedded systems are extremely limited and static, a more dynamic, configurable system would greatly simplify integration work and increase the quality of vehicular systems. It would bring in features such as separation of concerns, customised software configuration for individual vehicles, seamless connectivity, and plug-and-play capability. Furthermore, such a system could also contribute to increased dependability and resource optimization through its inherent ability to adjust itself dynamically to changes in software, hardware resources, and environmental conditions. This paper describes the architectural approach taken by the EU research project DySCAS to achieving the goals of dynamically self-configuring automotive embedded electronic systems. The architecture solution outlined in this paper captures the application and operational contexts, expected features, middleware services, functions and behaviours, as well as the basic mechanisms and technologies. The paper also covers the architecture conceptualization by presenting the rationale concerning the architecture structuring, control principles, and deployment concept. We also present the adopted architecture V&V strategy and discuss some open issues regarding industrial acceptance.
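The plug-and-play reconfiguration described above can be sketched as a small registry that re-selects service variants whenever resource availability changes. This is an illustrative sketch, not the DySCAS middleware API; the `ServiceVariant` and `Registry` names and the quality-ranking rule are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ServiceVariant:
    name: str
    required: set   # resource names this variant needs to run
    quality: int    # higher = preferred when deployable

@dataclass
class Registry:
    variants: Dict[str, List[ServiceVariant]] = field(default_factory=dict)
    active: Dict[str, str] = field(default_factory=dict)

    def register(self, service: str, variant: ServiceVariant) -> None:
        self.variants.setdefault(service, []).append(variant)

    def reconfigure(self, available: set) -> Dict[str, str]:
        """Per service, activate the highest-quality variant whose resource
        requirements are satisfied; services with no deployable variant are dropped."""
        self.active = {}
        for service, vs in self.variants.items():
            deployable = [v for v in vs if v.required <= available]
            if deployable:
                best = max(deployable, key=lambda v: v.quality)
                self.active[service] = best.name
        return self.active
```

A driver-assist service, for instance, could fall back from a camera-based variant to a basic one when the camera resource disappears, and be dropped entirely when no variant is deployable.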

Abstract:

Aim: The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems and acute care training in practice.

Background: To date there is limited evidence that acute care initiatives (early warning systems, acute care training, outreach) reduce the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme, without considering the social factors (the organisation, the people, external influences) that may have affected the process of implementation and hence the measured end-points. Further research that considers these social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4].

Method: The design is a multiple case study of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships, and institutional balance and infrastructures, in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period, to ascertain whether ALERT principles were followed.

Discussion and progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction of strategic and service planning of acute care services, to improve the level of care provided to acutely ill patients in hospital.

References:
1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A, et al. Investigating the effectiveness of Critical Care Outreach Services: a systematic review. Intensive Care Medicine 2006; 32: 1713-1721.
2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com
3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: a systematic review. Critical Care Medicine 2007; 35(5): 1238-1243.
4. Pawson R, Tilley N. Realistic Evaluation. London: Sage; 1997.
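The EWS charts audited in this kind of study aggregate banded vital-sign scores into a single warning score. The idea can be sketched in a few lines; the bands below are simplified illustrations invented for the example, not any validated clinical scoring system.

```python
# Illustrative vital-sign bands: (low inclusive, high exclusive, score).
# These thresholds are made up for the sketch; real systems use validated bands.
RESP_BANDS = [(0, 9, 2), (9, 21, 0), (21, 30, 2), (30, 999, 3)]
HR_BANDS = [(0, 40, 2), (40, 51, 1), (51, 91, 0), (91, 111, 1), (111, 131, 2), (131, 999, 3)]
SBP_BANDS = [(0, 91, 3), (91, 101, 2), (101, 111, 1), (111, 999, 0)]

def score_param(value, bands):
    """Score one vital sign against its bands."""
    for low, high, score in bands:
        if low <= value < high:
            return score
    raise ValueError(f"value {value} outside banded range")

def early_warning_score(resp_rate, heart_rate, systolic_bp):
    """Sum the per-parameter scores; higher totals trigger escalation per local policy."""
    return (score_param(resp_rate, RESP_BANDS)
            + score_param(heart_rate, HR_BANDS)
            + score_param(systolic_bp, SBP_BANDS))
```

With these bands, normal observations score 0, while a tachypnoeic, tachycardic, hypotensive patient accumulates points across all three parameters.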

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussion of the need for information recalled through OLAP systems to maintain the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used in that processing. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk creating gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP system design depends critically on the capture of facts with associated context; the encoding of facts with context into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated context.

The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business-rule stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
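The central idea, recalling data together with the business rules under which they were captured, can be sketched in a few lines. The `FactStore` class and its methods are hypothetical illustrations of the co-design principle, not an API taken from the paper.

```python
from dataclasses import dataclass
from typing import Any, Dict, Tuple

@dataclass(frozen=True)
class Fact:
    value: Any
    rule_set: str   # id of the business-rule set in force at capture time

class FactStore:
    """Hypothetical co-store of facts and the business rules used to capture them."""

    def __init__(self) -> None:
        self.rules: Dict[str, Dict[str, Any]] = {}
        self.facts: Dict[str, Fact] = {}

    def register_rules(self, rule_id: str, rules: Dict[str, Any]) -> None:
        self.rules[rule_id] = rules

    def capture(self, key: str, value: Any, rule_id: str) -> None:
        # OLTP side: record the value together with the rule set that produced it.
        self.facts[key] = Fact(value, rule_id)

    def recall(self, key: str) -> Tuple[Any, Dict[str, Any]]:
        # OLAP side: return the value with the *same* rules used at capture time,
        # preserving the transaction's original context.
        f = self.facts[key]
        return f.value, self.rules[f.rule_set]
```

For example, order totals captured under different VAT-rate rule sets remain interpretable later, because recall returns each total alongside the rate that was actually applied.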

Abstract:

This paper identifies novel approaches to future small and medium enterprise (SME) research from a review of articles, and then introduces the papers in this AJIS special section which evidence these approaches. More specifically, the paper makes an important contribution by reviewing 61 articles in highly ranked IS journals (2000-2014) and introducing three new facets used to analyse research on SME adoption/use of IS (units of analysis, SME sizes and SME types) not considered in previous literature review studies. These facets provide the basis for proposing various future research opportunities. The editorial then introduces the four papers in this special section covering the research theme on SMEs, and highlights the contributions they make using the three facets.

Abstract:

Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show under several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
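Fisher's analysis of variance itself reduces to a short computation: the F statistic is the ratio of the between-group and within-group mean squares. A minimal sketch from first principles (the sample data in the usage note are invented for illustration):

```python
def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of numeric samples."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of the group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within
```

For the invented samples `[[1, 2, 3], [2, 3, 4], [3, 4, 5]]` this yields F = 3.0 on (2, 6) degrees of freedom; in Fisher's day the same arithmetic was carried out on mechanical calculators against printed tables of the F distribution.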

Abstract:

Runtime management of distributed information systems is a complex and costly activity. One of the main challenges that must be addressed is obtaining a complete and updated view of all the managed runtime resources. This article presents a monitoring architecture for heterogeneous and distributed information systems. It is composed of two elements: an information model and an agent infrastructure. The model tames the complexity and variability of these systems and enables abstraction over non-relevant details. The infrastructure uses this information model to monitor and manage the modelled environment, performing and detecting changes at execution time. The agent infrastructure is further detailed, and its components and the relationships between them are explained. Moreover, the proposal is validated through a set of agents that instrument the JEE Glassfish application server, paying special attention to support for distributed configuration scenarios.
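The agent-and-model division of labour described above can be sketched as follows; `ResourceAgent`, `Monitor` and the probe callback are hypothetical names for the example, not the article's implementation.

```python
# Sketch: each agent observes one runtime resource and normalises it into a
# shared information-model record; the monitor compares successive snapshots
# and reports only the resources whose state changed.

class ResourceAgent:
    def __init__(self, name, probe):
        self.name = name
        self.probe = probe   # callable returning a dict of resource attributes

    def snapshot(self):
        return {self.name: self.probe()}

class Monitor:
    def __init__(self, agents):
        self.agents = agents
        self.model = {}      # last known state, keyed per the information model

    def poll(self):
        """Collect snapshots and return the resources whose state changed."""
        changed = {}
        for agent in self.agents:
            for key, state in agent.snapshot().items():
                if self.model.get(key) != state:
                    changed[key] = state
                    self.model[key] = state
        return changed
```

An agent instrumenting an application server would supply a probe that reads, say, thread-pool sizes; repeated polls then surface only deltas, which keeps the managed view complete without flooding it with unchanged data.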