959 results for Hospitals--Ontario--Administration--Data processing--Case


Relevance:

100.00%

Publisher:

Abstract:

This case study examines the impact of a computer information system as it was being implemented in one Ontario hospital. The attitudes of a cross-section of the hospital staff acted as a barometer of their perceptions of the implementation process. With The Mississauga Hospital in the early stages of an extensive computer implementation project, the opportunity existed to identify staff attitudes toward the computer system and their overall knowledge of it, and to compare the findings with the literature. The goal of the study was to build a stronger knowledge base about the affective domain in the relationship between people and the computer system. Eight exploratory questions shaped the focus of the investigation. Data were collected from three sources: a survey questionnaire, focused interviews, and internal hospital documents. Both quantitative and qualitative data were analyzed. Instrumentation in the study consisted of a survey distributed at two points in time to randomly selected hospital employees representing all staff levels. Other sources of data included hospital documents and twenty-five focused interviews with staff who replied to both surveys. Leavitt's socio-technical system, with its four subsystems (task, structure, technology, and people), was used to classify staff responses to the research questions. The study findings revealed that the majority of respondents felt positive about using the computer as part of their jobs. No apparent correlations were found between sex, age, or staff group and feelings about using the computer. Differences in attitudes, and changes in attitudes, appeared to be related to the element of time. Attitudes also differed by staff group with respect to the perception of being involved in the decision-making process. These findings, together with other evidence about the role of change agents in this change process, underline that planning change is one thing; managing the transition is another.
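
The abstract does not specify which statistical tests were used to look for associations between demographics and attitudes. As a purely illustrative sketch, assuming categorical survey responses, an association between staff group and attitude could be checked with a chi-square test of independence; all column names and data below are hypothetical, not taken from the study.

```python
# Hypothetical sketch: testing whether attitude toward the computer
# system is independent of staff group. Data are invented.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.DataFrame({
    "staff_group": ["nursing", "admin", "lab", "nursing", "admin", "lab"],
    "attitude":    ["positive", "positive", "neutral",
                    "negative", "positive", "positive"],
})

# Cross-tabulate staff group against attitude and run a chi-square
# test of independence; a large p-value is consistent with the study's
# finding of no apparent correlation.
table = pd.crosstab(responses["staff_group"], responses["attitude"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```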

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

In their safety evaluations of bisphenol A (BPA), the U.S. Food and Drug Administration (FDA) and its counterpart in Europe, the European Food Safety Authority (EFSA), have given special prominence to two industry-funded studies that adhered to standards defined by Good Laboratory Practices (GLP). These same agencies have given much less weight in risk assessments to a large number of independently replicated non-GLP studies conducted with government funding by leading experts in various fields of science from around the world. OBJECTIVES: We reviewed differences between industry-funded GLP studies of BPA, conducted by commercial laboratories for regulatory purposes, and non-GLP studies conducted in academic and government laboratories to identify hazards and the molecular mechanisms mediating adverse effects. We examined the methods and results of the GLP studies that were pivotal in the U.S. FDA's draft decision declaring BPA safe, in relation to findings from studies that competed successfully for U.S. National Institutes of Health (NIH) funding, were peer-reviewed for publication in leading journals, and were subject to independent replication, yet were rejected by the U.S. FDA for regulatory purposes. DISCUSSION: Although the U.S. FDA and EFSA have deemed two industry-funded GLP studies of BPA to be superior to hundreds of studies funded by the U.S. NIH and NIH counterparts in other countries, the GLP studies on which the agencies based their decisions have serious conceptual and methodologic flaws. In addition, the U.S. FDA and EFSA have mistakenly assumed that GLP yields valid and reliable scientific findings (i.e., "good science"). Their rationale for favoring GLP studies over hundreds of publicly funded studies ignores the central factors in determining the reliability and validity of scientific findings, namely independent replication and the use of the most appropriate and sensitive state-of-the-art assays, neither of which is an expectation of industry-funded GLP research. CONCLUSIONS: Public health decisions should be based on studies using appropriate protocols with appropriate controls and the most sensitive assays, not on GLP. Relevant NIH-funded research using state-of-the-art techniques should play a prominent role in safety evaluations of chemicals.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we focus on providing coordinated visual strategies to assist users in performing tasks driven by the presence of temporal and spatial attributes. We introduce temporal visualization techniques targeted at such tasks and illustrate their use with an application involving a climate classification process. The climate classification requires extensive processing of a database containing daily rain precipitation values collected over more than fifty years at several spatial locations in the state of São Paulo, Brazil. We identify user exploration tasks typically conducted as part of the data preparation required in this process, and then describe how such tasks may be assisted by the multiple visual techniques provided. Issues related to the use of the multiple techniques by an end user are also discussed.
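
As a rough illustration of the kind of temporal visualization technique the paper targets, the sketch below builds a year-by-month heatmap of rainfall totals from a daily precipitation series. The synthetic data merely stands in for the São Paulo state records; none of this code is from the paper.

```python
# Illustrative sketch: a calendar-style heatmap (years x months) of
# monthly rainfall totals derived from a daily precipitation series.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Synthetic fifty-year daily rainfall record (gamma-distributed values
# as a crude stand-in for real precipitation data).
rng = np.random.default_rng(0)
days = pd.date_range("1960-01-01", "2009-12-31", freq="D")
daily_mm = pd.Series(rng.gamma(shape=0.4, scale=12.0, size=len(days)),
                     index=days, name="precip_mm")

# Aggregate daily values into a year-by-month matrix of monthly totals.
matrix = daily_mm.groupby([daily_mm.index.year,
                           daily_mm.index.month]).sum().unstack()

fig, ax = plt.subplots(figsize=(8, 10))
im = ax.imshow(matrix.values, aspect="auto", cmap="Blues")
ax.set_xlabel("month")
ax.set_ylabel("year")
ax.set_yticks(range(0, matrix.shape[0], 5))
ax.set_yticklabels(matrix.index[::5])
fig.colorbar(im, ax=ax, label="monthly precipitation (mm)")
plt.show()
```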

Relevance:

100.00%

Publisher:

Abstract:

The CMS Collaboration conducted a month-long data-taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analyzing the data. © 2010 IOP Publishing Ltd and SISSA.

Relevance:

100.00%

Publisher:

Abstract:

The PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ system is an integrated embedded system, based on ultrasonic guided waves, consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing needed to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. In addition to performing tests and transmitting the collected data to the controller, PAMELA devices can therefore perform local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and reduces the CPU load on the external computer. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows developers to download their own algorithm code and add new data processing algorithms to the device. SMA development is done in a virtual machine with an Ubuntu Linux distribution that includes all the software tools needed for the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the steps necessary to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using a delay-and-sum algorithm is provided.
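
The paper's example algorithm, delay-and-sum damage index estimation, can be sketched compactly. The following is a minimal illustration, not PAMELA code: for each image pixel, the baseline-subtracted signal of every transmit-receive transducer pair is sampled at the transmitter-pixel-receiver time of flight, and the contributions are summed. The geometry, wave speed, sampling rate, and signals are all assumed values.

```python
# Illustrative delay-and-sum damage imaging for guided-wave SHM.
import numpy as np

FS = 1.0e6           # sampling rate, Hz (assumed)
WAVE_SPEED = 5000.0  # group velocity of the guided mode, m/s (assumed)

# Four transducers on the corners of a 0.5 m x 0.5 m plate (assumed).
transducers = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5], [0.0, 0.5]])
n = len(transducers)

# Residual (baseline-subtracted) signals for every transmit-receive
# pair; synthetic noise here, real scatter signals in practice.
rng = np.random.default_rng(1)
signals = rng.normal(size=(n, n, 2000))
envelopes = np.abs(signals)  # crude stand-in for a Hilbert envelope

# Image grid over the plate.
xs = np.linspace(0.0, 0.5, 50)
ys = np.linspace(0.0, 0.5, 50)
image = np.zeros((len(ys), len(xs)))

for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        pixel = np.array([x, y])
        dist = np.linalg.norm(transducers - pixel, axis=1)
        for tx in range(n):
            for rx in range(n):
                if tx == rx:
                    continue
                # Transmitter -> pixel -> receiver time of flight.
                delay = (dist[tx] + dist[rx]) / WAVE_SPEED
                sample = int(delay * FS)
                if sample < envelopes.shape[2]:
                    image[iy, ix] += envelopes[tx, rx, sample]

# 'image' is the damage index map: peaks mark likely scatterer locations.
```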

Relevance:

100.00%

Publisher:

Abstract:

A new methodology is proposed to produce subsidence activity maps based on the geostatistical analysis of persistent scatterer interferometry (PSI) data. PSI displacement measurements are interpolated using conditional Sequential Gaussian Simulation (SGS) to calculate multiple equiprobable realizations of subsidence. The result of this process is a series of interpolated subsidence values, with an estimate of the spatial variability and a confidence level on the interpolation. These maps complement the PSI displacement map, improving the identification of extensive subsiding areas at a regional scale. At a local scale, they can be used to identify buildings susceptible to subsidence-related damage. To do so, it is necessary to calculate the maximum differential settlement and the maximum angular distortion for each building in the study area. Based on these PSI-derived parameters, buildings in which the serviceability limit state has been exceeded, and where in situ forensic analysis should be carried out, can be identified automatically. The methodology has been tested in the city of Orihuela (SE Spain) for the study of historical buildings damaged over the last two decades by subsidence due to aquifer overexploitation. A qualitative evaluation of the methodology's results for buildings where damage has been reported shows a success rate of 100%.
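
As a minimal sketch of the per-building parameters the methodology relies on, the code below computes the maximum differential settlement and the maximum angular distortion from a handful of PSI points assumed to fall on one building. The coordinates, settlement values, and the 1/500 serviceability threshold are illustrative assumptions, not values from the paper.

```python
# Hypothetical per-building settlement parameters from PSI points.
import itertools
import numpy as np

# PSI points on one building: (x m, y m, cumulative settlement m).
points = np.array([
    [0.0,  0.0, -0.012],
    [12.0, 0.0, -0.021],
    [12.0, 8.0, -0.030],
    [0.0,  8.0, -0.015],
])

max_diff_settlement = 0.0
max_angular_distortion = 0.0
for p, q in itertools.combinations(points, 2):
    diff = abs(p[2] - q[2])                    # differential settlement
    span = np.hypot(p[0] - q[0], p[1] - q[1])  # horizontal distance
    max_diff_settlement = max(max_diff_settlement, diff)
    max_angular_distortion = max(max_angular_distortion, diff / span)

LIMIT = 1 / 500  # assumed serviceability limit on angular distortion
print(f"max differential settlement: {max_diff_settlement * 1000:.1f} mm")
print(f"max angular distortion: 1/{1 / max_angular_distortion:.0f}")
print("serviceability limit exceeded" if max_angular_distortion > LIMIT
      else "within serviceability limit")
```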

Relevance:

100.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance:

100.00%

Publisher:

Abstract:

Bibliography: p. 34.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06