930 results for database management system
Abstract:
Objective. The aim of this study was to survey GPs and community pharmacists (CPs) in Ireland regarding current practices of medication management, specifically medication reconciliation, communication between health care providers and medication errors as patients transition in care.
Methods. A national cross-sectional survey was distributed electronically to 2364 GPs, 311 GP Registrars and 2382 CPs. Multivariable associations comparing GPs to CPs were generated and content analysis of free text responses was undertaken.
Results. There was an overall response rate of 17.7% (897 respondents: 554 GPs/Registrars and 343 CPs). More than 90% of GPs and CPs were positive about the effects of medication reconciliation on medication safety and adherence. Sixty per cent of GPs reported having no formal system of medication reconciliation. Communication between GPs and CPs was rated as good/very good by >90% of GPs and CPs. The majority (>80%) of both groups could clearly recall prescribing errors, following a transition of care, that they had witnessed in the previous 6 months. Free-text content analysis corroborated the positive relationship between GPs and CPs and the frustration with secondary care communication, with many examples of prescribing errors given.
Conclusions. While there is enthusiasm for the benefits of medication reconciliation, there are limited formal structures in primary care to support it. Challenges with systems that support inter-professional communication and reduce medication errors are features of the primary/secondary care transition. There is a need for an improved medication management system. Future research should focus on the identified barriers to implementing medication reconciliation and on systems that can improve it.
Abstract:
The benefits of a pavement management system, when fully implemented, are well known, and the history of successful implementation is rich. Implementation occurs, for the purposes of this paper, when the pavement management system is the critical component for making pavement decisions. This paper addresses the issues that act as barriers to full implementation of pavement management systems. Institutional barriers, rather than technical and financial barriers, are more commonly responsible for a pavement management system falling short of full implementation. The paper groups these institutional issues into a general taxonomy. In general, more effort needs to be put forth by highway agencies to overcome institutional issues. Most agencies approach pavement management as a technical process, yet institutional issues more commonly prove problematic and therefore require more attention. The paper concludes by summarizing the implementation process being taken by the Iowa Department of Transportation, a process designed to overcome institutional barriers and facilitate the complete implementation of its pavement management system.
Abstract:
Purpose – The purpose of this paper is to present a case study regarding the deployment of a previously developed model for the integration of management systems (MSs). The case study is developed at a manufacturing site of an international enterprise. The implementation of this model in a real business environment is aimed at assessing its feasibility. Design/methodology/approach – The presented case study takes into account different management system standards (MSSs) implemented progressively and independently over the years. The implementation of the model was supported by the results of an investigation conducted according to a structured diagnosis to collect information related to the organizational situation of the enterprise. Findings – The main findings are as follows: a robust integrated management system (IMS), objectively leaner, more structured and more manageable, was found to be feasible; the study provided a holistic view of the enterprise's global management; clarification of job descriptions and of the boundaries of action and responsibility was achieved; greater efficiency in the use of resources was attained; and a more coordinated management of the three pillars of sustainability – environmental, economic and social – as well as of risks was achieved, providing confidence and added value to the company and interested parties. Originality/value – This case study is pioneering in Portugal with respect to the implementation, at the level of an industrial organization, of the model previously developed for the integration of individualized MSs. The case study provides new insights into the implementation of IMSs, including the rationalization of several resources and the elimination of several types of organizational waste, leveraging gains in efficiency. Due to its intrinsic characteristics, the model is able to support, progressively, new or revised MSSs according to the principles of Annex SL (normative) – proposals for MSSs – of the International Organization for Standardization and the International Electrotechnical Commission, which the industrial organization may adopt beyond the current ones.
Abstract:
This document briefly summarizes the pavement management activities under the existing Iowa Department of Transportation (DOT) Pavement Management System. The second part of the document provides the projected increase in use resulting from the implementation of the Iowa DOT Pavement Management Optimization System. All estimates of existing time devoted to the Pavement Management System and projected increases in time requirements were made by the appropriate Iowa DOT office director or function manager. Included is the new Pavement Management Optimization structure for the three main offices that will work most closely with the Pavement Management Optimization System (Materials, Design, and Program Management).
Abstract:
Indigenous communities have actively managed their environments for millennia using a diversity of resource use and conservation strategies. Clam gardens, ancient rock-walled intertidal beach terraces, represent one example of an early mariculture technology that may have been used to improve food security and confer resilience to coupled human-ocean systems. We surveyed a coastal landscape for evidence of past resource use and management to gain insight into ancient resource stewardship practices on the central coast of British Columbia, Canada. We found that clam gardens are embedded within a diverse portfolio of resource use and management strategies and were likely one component of a larger, complex resource management system. We compared clam diversity, density, recruitment, and biomass in three clam gardens and three unmodified nonwalled beaches. Evidence suggests that butter clams (Saxidomus gigantea) had 1.96 times the biomass and 2.44 times the density in clam gardens relative to unmodified beaches. This was due to a reduction in beach slope and thus an increase in the optimal tidal range where clams grow and survive best. The most pronounced differences in butter clam density between nonwalled beaches and clam gardens were found at high tidal elevations at the top of the beach. Finally, densities of clam recruits (0.5-2 mm in length) tended to be greater in clam gardens than on nonwalled beaches, which may be attributed to the addition of shell hash by ancient people that remains on the landscape today. As part of a broader social-ecological system, clam garden sites were among several modifications made by humans that collectively may have conferred resilience to past communities by providing reliable and diverse access to food resources.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
With the increasing significance of information technology, there is an urgent need for adequate measures of information security. Systematic information security management is one of the most important initiatives in IT management. At least since reports of privacy and security breaches, fraudulent accounting practices, and attacks on IT systems became public, organizations have recognized their responsibility to safeguard physical and information assets. Security standards can be used as a guideline or framework to develop and maintain an adequate information security management system (ISMS). The standards ISO/IEC 27000, 27001 and 27002 are international standards that are receiving growing recognition and adoption. They are referred to as the “common language of organizations around the world” for information security. With ISO/IEC 27001, companies can have their ISMS certified by a third-party organization and thus show their customers evidence of their security measures.
Abstract:
The SD card (Secure Digital Memory Card) is a widely used portable storage medium. Current research on SD cards mainly concerns SD card controllers based on FPGAs (Field Programmable Gate Arrays). Most designs rely on an API (Application Programming Interface), the AHB bus (Advanced High-performance Bus), etc., and are dedicated to achieving ultra-high-speed communication between the SD card and upper-level systems. Studies of SD card controllers play a vital role in the field of high-speed cameras and other specialized areas. The FPGA-based file system and SD2.0 IP (Intellectual Property core) presented here not only achieves a good transmission rate but also provides systematic management of files, while retaining strong portability and practicality. The design and implementation of the file system on an SD card covers three main innovation points. First, the combination and integration of the file system and the SD card controller makes the overall system highly integrated and practical. The popular SD2.0 protocol is implemented for the communication channels, and a pure digital logic design in VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller at the hardware layer with the FAT32 file system for the entire system. Second, the file management mechanism makes file processing more convenient, especially for small files handled in batches, easing the pressure on the upper-level system to frequently access and process them and thereby enhancing overall efficiency. Third, the digital design ensures good performance: for transmission security, a CRC (Cyclic Redundancy Check) algorithm protects data transfers; each module is designed as a platform-independent macro cell, which improves portability; and custom integrated instructions and interfaces make the IP easy to use. The design was tested on both Xilinx and Altera FPGA development platforms, with timing simulation and debugging of each module. Test results show that the FPGA-based file system IP supports SD, TF and Micro SD cards using the 2.0 protocol over the SD bus mode, implements systematic management of stored files, and achieves read and write rates on a Kingston Class 10 card of approximately 24.27 MB/s and 16.94 MB/s, respectively.
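The thesis abstract does not specify which CRC variant protects the data transfers. Purely as an illustration, the short Python sketch below computes the CRC-16-CCITT checksum (polynomial 0x1021, zero initial value) that is commonly appended to SD data blocks; the function name and the test pattern are ours, not part of the described design.

def crc16_ccitt(data: bytes, crc: int = 0x0000) -> int:
    """Bit-by-bit CRC-16-CCITT (polynomial 0x1021), commonly used for SD data blocks."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF          # keep the register 16 bits wide
    return crc

if __name__ == "__main__":
    block = bytes([0xFF] * 512)    # one 512-byte data block of 0xFF
    print(hex(crc16_ccitt(block))) # the 16-bit checksum appended to the block

A hardware implementation in VHDL would typically realize the same polynomial as a linear feedback shift register; the software version here only illustrates the arithmetic.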
Abstract:
The academic activities carried out at the School of Chemistry make it indispensable to develop actions oriented toward the consolidation of a reagent and residue management system, especially in the teaching laboratories. The project “Management of reagents and residues in the teaching laboratories of the School of Chemistry” follows the values of Green Chemistry, which designs products and chemical processes that reduce or eliminate the use and production of dangerous substances, to the benefit of the environment. With a preventive vision, laboratory practices are reviewed in order to select those with less environmental impact. Additionally, residues are quantified and management protocols are developed for each practice. The project has several stages: diagnosis; implementation of actions; training of students, teachers and administrative personnel; and evaluation both during the process and at its end. The article describes methodological aspects of the project's operation, emphasizing reagent and residue quantification through flow diagrams.
Abstract:
Fisheries play a significant and important part in the economy of the country, contributing to foreign exchange, food security and employment creation. Lake Victoria contributes over 50% of the total annual fish catch. The purpose of fisheries management is to ensure conservation, protection, proper use, economic efficiency and equitable distribution of fisheries resources for both present and future generations through sustainable utilization. The earliest fisheries were mainly at the subsistence level. Fishing gear consisted of locally made basket traps, hooks and seine nets of papyrus. Fishing effort began to increase with the introduction of more efficient flax gillnets in 1905. Fisheries management in Uganda started in 1914. Before then, the fishery was under some form of traditional management based on dos and don'ts. History shows that the Baganda had strong spiritual beliefs in respect of "god Mukasa" (god of the Lake), and these indirectly contributed to sustainable management of the lake. If a fisherman neglected to comply with any of the ceremonies related to fishing, he was expected to encounter a bad omen (Rev. Roscoe, 1965). However, with the introduction of nylon gill nets, which could catch more fish, the traditional management regime broke down. By 1955 indigenous fish species like Oreochromis variabilis and Oreochromis esculentus had greatly declined in catches. The decline in catches led to the introduction of poor fishing methods because of competition for fish. The government, in an attempt to regulate the fishing industry, enacted the first Fisheries Ordinance in 1951 and recruited Fisheries Officers to enforce it. The government put in place minimum net mesh sizes, and Fisheries Officers arrested fishermen without explaining the reason, which led to continued poor fishing practices. The development of government-centred management systems led to increased alienation of resource users and to wilful disregard of specific regulations. The realisation of the problems faced by the central management system led to the recognition that user groups need to be actively involved in fisheries management if the systems are to be consistent with sustainable fisheries and be legitimate. Community participation in fisheries management under the co-management approach has been adopted in Lake Victoria and other water bodies.
Abstract:
The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses was used in the analysis. Finally, the conclusions were assessed by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance to implementation may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further, and more R&D work is needed before advancing to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify it and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research Design and Methods was used to develop the research approach, design, data collection and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems in the implementation of web-based project management systems for the dredging industry. Keen's (1981) incremental changes and facilitative approach tactics were used as a basis for classifying solutions and for overcoming resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.
Abstract:
Fault tolerance allows a system to remain operational to some degree when some of its components fail. One of the most common fault tolerance mechanisms consists of logging the system state periodically and recovering the system to a consistent state in the event of a failure. This paper describes a general logging-based fault tolerance mechanism that can be layered over deterministic systems. Our proposal describes how a logging mechanism can recover the underlying system to a consistent state even if an action or set of actions was interrupted midway due to a server crash. We also propose different methods of storing the logging information and describe how to deploy a fault-tolerant master-slave cluster for information replication. We adapt our model to a previously proposed framework that provides common relational features, like transactions with atomic, consistent, isolated and durable properties, to NoSQL database management systems.
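The abstract gives no implementation details, so the following is only a rough Python illustration of the logging-and-replay idea it describes, not the authors' actual design: each operation is appended to a write-ahead log before being applied to an in-memory key-value store, and on restart the log is replayed, discarding a half-written final entry left by a crash so that the recovered state is consistent.

import json, os

class LoggedStore:
    """Minimal write-ahead-log sketch: log before apply, replay on restart."""
    def __init__(self, log_path="store.log"):
        self.log_path = log_path
        self.data = {}
        self._replay()                         # recover a consistent state

    def _replay(self):
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as log:
            for line in log:
                try:
                    entry = json.loads(line)
                except json.JSONDecodeError:
                    break                      # half-written entry from a crash: ignore
                self._apply(entry)

    def _apply(self, entry):
        if entry["op"] == "put":
            self.data[entry["key"]] = entry["value"]
        elif entry["op"] == "delete":
            self.data.pop(entry["key"], None)

    def put(self, key, value):
        entry = {"op": "put", "key": key, "value": value}
        with open(self.log_path, "a") as log:
            log.write(json.dumps(entry) + "\n")
            log.flush()
            os.fsync(log.fileno())             # durable before the in-memory apply
        self._apply(entry)

store = LoggedStore()
store.put("balance", 100)

A real system along the lines of the paper would also replicate this log to slave nodes; here the single local file only illustrates the recovery step.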
Abstract:
In database applications, access control security layers are mostly developed with tools provided by vendors of database management systems and deployed in the same servers that contain the data to be protected. This solution conveys several drawbacks. Among them we emphasize: 1) if policies are complex, their enforcement can lead to performance decay of database servers; 2) when modifications in the established policies imply modifications in the business logic (usually deployed at the client side), there is no alternative but to modify the business logic in advance; and, finally, 3) malicious users can issue CRUD expressions systematically against the DBMS in the hope of identifying a security gap. In order to overcome these drawbacks, in this paper we propose an access control stack characterized as follows: most of the mechanisms are deployed at the client side; whenever security policies evolve, the security mechanisms are automatically updated at runtime; and, finally, client-side applications do not handle CRUD expressions directly. We also present an implementation of the proposed stack to prove its feasibility. This paper presents a new approach to enforcing access control in database applications and is thereby expected to contribute positively to the state of the art in the field.
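The proposed stack cannot be reconstructed from the abstract alone; the Python sketch below is only a minimal illustration of the principle that client-side applications do not handle CRUD expressions directly. Callers invoke typed methods on a gateway, which consults a role-based policy on the client side before any SQL is built. The names, the policy format and the SQLite back end are our assumptions, not the authors' design.

import sqlite3

# Hypothetical policy: which roles may perform which operations on which tables.
POLICY = {
    ("clerk", "select", "orders"): True,
    ("clerk", "update", "orders"): False,
    ("manager", "select", "orders"): True,
    ("manager", "update", "orders"): True,
}

class OrdersGateway:
    """Client-side access layer: callers never write CRUD expressions themselves."""
    def __init__(self, conn, role):
        self.conn, self.role = conn, role

    def _check(self, op, table):
        if not POLICY.get((self.role, op, table), False):
            raise PermissionError(f"{self.role} may not {op} {table}")

    def order_total(self, order_id):
        self._check("select", "orders")
        row = self.conn.execute(
            "SELECT total FROM orders WHERE id = ?", (order_id,)).fetchone()
        return row[0] if row else None

    def set_order_total(self, order_id, total):
        self._check("update", "orders")
        self.conn.execute(
            "UPDATE orders SET total = ? WHERE id = ?", (total, order_id))
        self.conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 40.0)")
gateway = OrdersGateway(conn, role="manager")
gateway.set_order_total(1, 55.0)
print(gateway.order_total(1))

In the paper's stack the policy is updated automatically at runtime; in this sketch it is a static dictionary purely for brevity.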
Abstract:
To store, update and retrieve data from database management systems (DBMS), software architects use tools, such as call-level interfaces (CLI), which provide standard functionalities for interacting with a DBMS. However, the emergence of the NoSQL paradigm, and particularly of new NoSQL DBMS providers, leads to situations where some of the standard functionalities provided by CLI are not supported, very often because of their distance from the relational model or because of design constraints. As such, when a system architect needs to evolve, namely from a relational DBMS to a NoSQL DBMS, he must overcome the difficulties caused by the features that the NoSQL DBMS does not provide. Choosing the wrong NoSQL DBMS risks major issues with components requesting non-supported features. This paper focuses on how to deploy features that are not so commonly supported by NoSQL DBMS (like Stored Procedures, Transactions, Save Points and interactions with local memory structures) by implementing them in a standard CLI.
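How the paper layers these features onto a standard CLI is not described in the abstract. Purely as an illustration of the general idea, the Python sketch below emulates Transactions and Save Points on the client side for a store that natively supports only simple puts (an in-memory dict standing in for a NoSQL DBMS): writes are buffered locally and flushed only on commit, and a save point records a position in the buffer. All names are hypothetical.

class EmulatedSession:
    """Client-side emulation of transactions and save points over a store
    that natively supports only put/get (a stand-in for a NoSQL DBMS)."""
    def __init__(self, store):
        self.store = store      # e.g. a dict or a thin NoSQL CLI wrapper
        self.buffer = []        # pending (key, value) writes
        self.savepoints = {}    # name -> buffer length at that point

    def put(self, key, value):
        self.buffer.append((key, value))

    def savepoint(self, name):
        self.savepoints[name] = len(self.buffer)

    def rollback_to(self, name):
        del self.buffer[self.savepoints[name]:]   # discard writes after the save point

    def commit(self):
        for key, value in self.buffer:            # flush buffered writes to the store
            self.store[key] = value
        self.buffer.clear()
        self.savepoints.clear()

store = {}
session = EmulatedSession(store)
session.put("a", 1)
session.savepoint("sp1")
session.put("b", 2)
session.rollback_to("sp1")   # "b" is discarded
session.commit()
print(store)                 # {'a': 1}

A production-grade version would also need isolation between concurrent sessions and durable buffering, which this sketch deliberately omits.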