975 results for 280108 Database Management
Abstract:
The Chinese welding industry is growing every year due to the rapid development of the Chinese economy. Increasingly, companies around the world are looking to use Chinese enterprises as their cooperation partners. However, the Chinese welding industry also has its weaknesses, such as relatively low quality and weak management. A modern, advanced welding management system appropriate for local socio-economic conditions is required to enable Chinese enterprises to further enhance their business development. The thesis researches the design and implementation of a new welding quality management system for China. This new system is called the 'welding production quality control management model in China' (WQMC). Constructed on the basis of analysis of a survey and in-company interviews, the welding management system comprises the following elements and perspectives: a 'Localized congenital existing problem resolution strategies' (LCEPRS) database, a 'human factor designed training system' (HFDT) training strategy, the theory of modular design, ISO 3834 requirements, total welding management (TWM), and lean manufacturing (LEAN) theory. The methods used in the research are literature review, questionnaires, interviews, and the author's model design experiences and observations, i.e. the approach is primarily qualitative and phenomenological. The thesis describes the design and implementation of an HFDT strategy in Chinese welding companies. Such training is an effective way to increase employees' awareness of quality and issues associated with quality assurance. The study identified widely existing problems in the Chinese welding industry and constructed a LCEPRS database that can be used in efforts to mitigate and avoid common problems. The work uses the theory of modular design, TWM and LEAN as tools for the implementation of the WQMC system.
Advances in therapeutic risk management through signal detection and risk minimisation tool analyses
Abstract:
The four main activities of therapeutic risk management are risk identification, assessment, minimisation, and communication. This thesis addresses problems related to risk identification and risk minimisation through two studies whose objectives are to: 1) develop and validate a data-mining tool for signal detection from Quebec healthcare claims databases; 2) conduct a systematic review to characterise the risk minimisation interventions (RMIs) that have been implemented. The signal detection tool is based on the maximized sequential probability ratio test (MaxSPRT), applied to dispensed-drug and medical-services data from a retrospective cohort of 87,389 community-dwelling elderly members of the Quebec health insurance plan between 2000 and 2009. Four known drug-adverse event (AE) pairs and two negative controls were used. The systematic review drew on the literature and on the websites of six major regulatory agencies. The nature of the RMIs was described and gaps in their implementation were identified. The analytical method detected a signal in one of the four drug-AE combinations. The main contributions are: a) the first signal detection tool based on Canadian administrative databases; b) methodological contributions through accounting for the depletion-of-susceptibles effect and controlling for patient health status. The review identified 119 RMIs in the literature and 1,112 RMIs on the websites of regulatory agencies. The review showed that RMIs have increased since the introduction of regulatory guidance in 2005, but their effectiveness remains largely undemonstrated.
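As a rough illustration of the MaxSPRT approach used for signal detection above, a minimal Python sketch of the Poisson-based statistic follows. This is not the study's implementation: the event counts, expected rates and critical value below are invented, and a real surveillance system would derive the critical value from the chosen alpha and the planned surveillance length.

```python
import math

def max_sprt_llr(observed: int, expected: float) -> float:
    """Poisson MaxSPRT log-likelihood ratio statistic.

    Returns 0 when observed <= expected (no excess-risk evidence)."""
    if expected <= 0 or observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

def scan_for_signal(counts, expected_rates, critical_value=3.0):
    """Flag the first monitoring period whose cumulative LLR crosses
    the critical value (the default 3.0 is a placeholder, not a
    calibrated threshold). Returns None if no signal is detected."""
    cum_obs, cum_exp = 0, 0.0
    for period, (c, u) in enumerate(zip(counts, expected_rates), start=1):
        cum_obs += c
        cum_exp += u
        if max_sprt_llr(cum_obs, cum_exp) >= critical_value:
            return period
    return None
```

With invented quarterly counts, `scan_for_signal([1, 4, 9, 15], [1.0, 2.0, 3.0, 4.0])` flags period 3, the first point where cumulative observed events sufficiently exceed expectation.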
Abstract:
Introduction: With the abundance of freely available online information, the task of finding, filtering and fitting relevant information to the appropriate audience is daunting. In December 2010 the Canadian Virtual Health Library / Bibliothèque virtuelle canadienne de santé (CVHL) formed an expert committee to identify, evaluate, select and organize resources relevant to health professionals.
Methods: This poster will identify the key technical decisions of the expert committee, including the content management system used to manage the data, the use of Dublin Core elements and Medical Subject Headings (MeSH) to describe the resources, and the development and adaptation of taxonomies from the MeSH classification to catalog resources. The translation of MeSH terms into French using the CISMeF portal will also be discussed. Results: In May 2011, the committee launched the CVHL database of free web-based health resources. Content ranged from online articles and reports to videos, interactive databases and clinical practice tools, and included more than 1,600 websites and resources. Discussion: The benefits and challenges of a virtual, pan-Canadian collaboration, and the critical inclusion of a Francophone member to address the bilingual nature of the database, will be presented. In keeping with the nature of the project, the poster will be presented in French and English.
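A resource record along the lines described, Dublin Core elements carrying MeSH subject headings, could be sketched as follows. The field names follow the Dublin Core element set, but the helper function and the sample resource are hypothetical, not drawn from the CVHL database.

```python
def make_dc_record(title, url, mesh_terms, language="en", resource_type="Text"):
    """Return a minimal Dublin Core record as a dict of element -> value.

    mesh_terms: MeSH descriptors used as dc:subject values."""
    return {
        "dc:title": title,
        "dc:identifier": url,
        "dc:subject": list(mesh_terms),
        "dc:language": language,
        "dc:type": resource_type,
    }

# Invented sample resource for illustration only.
record = make_dc_record(
    "Example clinical practice guideline",
    "http://example.org/guideline",
    ["Practice Guidelines as Topic", "Evidence-Based Medicine"],
)
```

Storing subjects as controlled MeSH descriptors, rather than free-text keywords, is what makes the later taxonomy-building and French translation via CISMeF tractable.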
Abstract:
Monitoring a distribution network implies working with a huge amount of data coming from the different elements that interact in the network. This paper presents a visualization tool that simplifies the task of searching the database for useful information applicable to fault management or preventive maintenance of the network.
Abstract:
Our ability to identify, acquire, store, enquire on and analyse data is increasing as never before, especially in the GIS field. Technologies are becoming available to manage a wider variety of data and to make intelligent inferences on that data. The mainstream arrival of large-scale database engines is not far away. The experience of using the first such products tells us that they will radically change data management in the GIS field.
Abstract:
This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for the maintenance of the built-estate owned by the university. The department needs a clear definition of the type of work undertaken and of the administration that enables any maintenance work to be carried out. These include the management of resources, budget, cash flow and the workflow of reactive, preventative and planned maintenance of the campus. In order to be more efficient in supporting the business process, the FMD decided to move from a paper-based information system to an electronic system, WREN. Some of the main advantages of WREN are that it is tailor-made to fit the purpose of the users; it is cost-effective when it comes to modifications to the system; and the database can also be used as a knowledge management tool. There is a trade-off: as WREN is tailored to the specific requirements of the FMD, it may not be easy to implement within a different institution without extensive modifications. However, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the built-estate of the university, but has also achieved its aim of minimising costs and maximising efficiency.
Abstract:
Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice. Objectives: We sought to: • Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice. • Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS). • Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and actions completed in the practices. • Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects, and inform decisions on the future roll-out of the pharmacist-led intervention. • Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database. Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38, 0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58, 0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the last 15 months (OR 0.51, 95% CI 0.34, 0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than the intervention. Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
Abstract:
The prospect of a European Supergrid calls for research on aggregate electricity peak demand and Europe-wide Demand Side Management. No attempt has yet been made to represent a time-related demand curve of residential electricity consumption at the European level. This article assesses how active occupancy levels of single-person households vary across 15 European countries. It makes use of occupancy time-series data from the Harmonised European Time Use Survey database to build European occupancy curves, identify peak occupancy periods, construct time-related electricity demand curves for TV and video watching activities, and assess occupancy variances of single-person households.
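The step from occupancy diaries to a time-related demand curve can be sketched as follows. This is an illustrative Python sketch, not the article's method: the 0/1 diaries and the 100 W appliance load are invented for demonstration.

```python
def occupancy_curve(diaries):
    """diaries: list of equal-length 0/1 lists (1 = actively occupied).

    Returns the share of households active in each time slot,
    i.e. the aggregate occupancy curve."""
    n = len(diaries)
    return [sum(slot) / n for slot in zip(*diaries)]

def demand_curve(diaries, appliance_load_w=100.0):
    """Scale the occupancy curve by an assumed per-household
    appliance load (watts) to get a time-related demand curve."""
    return [share * appliance_load_w for share in occupancy_curve(diaries)]
```

For two invented diaries `[[0, 1, 1], [1, 1, 0]]` the occupancy curve is `[0.5, 1.0, 0.5]`; the middle slot, where both households are active, is the peak occupancy period that the article's analysis would pick out.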
Abstract:
Usually, a Petri net is applied as an RFID modelling tool. This paper presents a different approach to Petri nets for RFID systems. The approach, called elementary Petri net inside an RFID distributed database (PNRD), is a first step towards improved integration of RFID and control systems, based on a formal data structure that identifies and updates the product state during real-time process execution, allowing automatic discovery of unexpected events during tag data capture. The approach has two main features: RFID tags act as a distributed database of the expected process and of the product's last state; and Petri net analysis automatically updates the last-state registry during reader data capture. In Petri net terms, an RFID reader data capture can be viewed as a direct analysis of the locality of a specific transition within a specific workflow; accordingly, each reader stores the Petri net control vector list associated with the tag ids it is expected to perceive. The paper presents the PNRD cornerstones and an implementation example in software called DEMIS (Distributed Environment in Manufacturing Information Systems).
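The state update at the heart of the PNRD idea, firing the expected Petri net transition on the tag's marking and treating a disabled transition as an unexpected event, can be sketched as follows. This is an illustrative Python sketch; the place names and the one-transition net are invented, not taken from DEMIS.

```python
def fire(marking, pre, post):
    """Fire one Petri net transition on a marking.

    marking, pre, post: dicts mapping place -> token count, where
    `pre` gives the transition's input weights and `post` its outputs.
    Returns the new marking, or None when the transition is not
    enabled, which PNRD treats as an unexpected event."""
    if any(marking.get(p, 0) < w for p, w in pre.items()):
        return None  # disabled transition: unexpected event detected
    new = dict(marking)
    for p, w in pre.items():
        new[p] -= w
    for p, w in post.items():
        new[p] = new.get(p, 0) + w
    return new
```

A tag carrying the marking `{"raw": 1}` passes a reader whose expected transition consumes a `raw` token and produces a `machined` token; firing it yields `{"raw": 0, "machined": 1}`, and attempting to fire the same transition again returns `None`, flagging the duplicate read as unexpected.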
Abstract:
GCM outputs such as CMIP3 are available via network access to the PCMDI website. Meteorological researchers are familiar with the use of GCM data, but most researchers in other fields, such as agriculture and civil engineering, as well as the general public, are not. There are some difficulties in using GCMs: 1) downloading the enormous quantity of data; 2) understanding the GCM methodology, parameters and grids. In order to provide quick access to GCM outputs, the Climate Change Information Database has been developed. The purpose of the database is to bridge users and meteorological specialists and to facilitate understanding of climate change. The resolution of the data is unified, and climate change amounts or factors for each meteorological element are provided by the database. All data in the database are interpolated onto the same 80 km mesh. Available data are the present-future projections of 27 GCMs, 16 meteorological elements (precipitation, temperature, etc.) and 3 emission scenarios (A1B, A2, B1). We presented a summary of this database to residents of Toyama prefecture and, using an Internet questionnaire survey, measured the effect of the presentation and assessed their image of climate change. People who feel a climate change at present tend to expect additional changes in the future. It is important to show citizens the monitoring results of climate change and to promote understanding of the climate change that has already occurred. The survey showed that general images of climate change promote understanding of the need for mitigation, and that, in order to have people widely recognise the need for adaptation, it is important to explain climate changes that might occur in the future even if they have not occurred at present.
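The unification of heterogeneous GCM grids onto a common mesh can be sketched as follows. This is a deliberately simplified nearest-neighbour sketch, not the database's actual regridding; real processing onto the 80 km mesh would involve proper map projections and more careful interpolation.

```python
def nearest_index(value, grid):
    """Index of the grid coordinate closest to `value`."""
    return min(range(len(grid)), key=lambda i: abs(grid[i] - value))

def regrid(field, src_lats, src_lons, dst_lats, dst_lons):
    """Resample field[i][j], defined on (src_lats, src_lons), onto the
    destination mesh by nearest-neighbour lookup."""
    return [[field[nearest_index(la, src_lats)][nearest_index(lo, src_lons)]
             for lo in dst_lons]
            for la in dst_lats]
```

Running 27 models through one such resampling step is what lets the database serve every element on the same mesh, sparing non-specialist users the model-by-model grid handling described above.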
Abstract:
Driven by Web 2.0 technology and the almost ubiquitous presence of mobile devices, Volunteered Geographic Information (VGI) is experiencing unprecedented growth. These notable technological advancements have opened fruitful perspectives also in the field of water management and protection, raising the demand for a reconsideration of policies that also takes into account the emerging trend of VGI. This research investigates the opportunity of leveraging such technology to involve citizens equipped with common mobile devices (e.g. tablets and smartphones) in a campaign of reporting water-related phenomena. The work is carried out in collaboration with ADBPO - Autorità di bacino del fiume Po (Po river basin Authority), i.e. the entity responsible for the environmental planning and protection of the basin of the river Po. This is the longest Italian river, spreading over eight of the twenty Italian Regions and characterized by complex environmental issues. To enrich the ADBPO official database with user-generated contents, a FOSS (Free and Open Source Software) architecture was designed which allows not only user field-data collection, but also data publication on the Web through standard protocols. The Open Data Kit suite allows users to collect georeferenced multimedia information using mobile devices equipped with location sensors (e.g. GPS). Users can report a number of environmental emergencies, problems or simple points of interest related to the Po river basin, taking pictures of them and providing other contextual information. Field-registered data are sent to a server and stored in a PostgreSQL database with the PostGIS spatial extension. GeoServer then provides data dissemination on the Web, while specific OpenLayers-based viewers were built to optimize data access on both desktop computers and mobile devices. Besides proving the suitability of FOSS in the frame of VGI, the system represents a successful prototype for the exploitation of users' local, real-time information aimed at managing and protecting water resources.
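The path from a citizen's field report to the PostGIS store could be sketched as follows. This is a hedged illustration: the table and column names are hypothetical, and in the actual system Open Data Kit handles submission while PostgreSQL/PostGIS handles storage.

```python
def report_to_sql(lat, lon, category, note):
    """Build a parameterised INSERT for a georeferenced citizen report.

    The point geometry is expressed as WKT in EPSG:4326 (lon/lat order,
    as WKT expects) and converted server-side by ST_GeomFromText.
    Table and column names are invented for illustration."""
    wkt = f"POINT({lon} {lat})"
    sql = ("INSERT INTO po_reports (category, note, geom) "
           "VALUES (%s, %s, ST_GeomFromText(%s, 4326))")
    return sql, (category, note, wkt)

# Invented sample report near the Po river.
sql, params = report_to_sql(45.05, 7.68, "pollution", "foam on the water")
```

Keeping the geometry in a PostGIS column is what lets GeoServer publish the reports through standard OGC protocols without any extra conversion step.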
Abstract:
The goal of this study was to analyze international scientific production in the area of constant-production green supply chain management from 2001 to 2012 using the Business Source Complete database (EBSCO Host). The database was checked for cooperation between authors and institutions, author entrants, production and continuity categories, regularity of publication and distribution of publications over time. Ninety articles were included in the sample, and the results showed a reduction in the number of publishing authors, concentrated in the one-timers category (68.90%). The highest yield for a single author was 10 articles, and the most prolific periodical was the Journal of Cleaner Production, with 12 articles published on the subject. Clark University (USA) stood out in terms of output, with 12 affiliated authors. It was concluded that the subject had experienced a significant rise in published literature over that time period.
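The author-continuity tally described above, separating one-time authors from continuing ones, can be sketched as follows. This is an illustrative Python sketch, not the study's method; the sample author lists are invented.

```python
from collections import Counter

def continuity(author_lists):
    """author_lists: one list of author names per article.

    Returns (one_timers, total_authors), where one-timers are
    authors appearing on exactly one article in the sample."""
    counts = Counter(a for authors in author_lists for a in authors)
    one_timers = sum(1 for c in counts.values() if c == 1)
    return one_timers, len(counts)
```

For two invented articles authored by `["A", "B"]` and `["A", "C"]`, the tally is 2 one-timers out of 3 distinct authors; the study's 68.90% figure is this share computed over its 90-article sample.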
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Green Supply Chain Management (GSCM) is gaining prominence in academia and business as an approach that aims to promote economic and environmental gains. GSCM operates through environmental management system tools and is treated as an Environmental Management System (EMS), involving Reverse Logistics, Green Purchasing, Green Sourcing, Green Design, Green Packaging, Green Operation, Green Manufacturing, Green Innovation and Customer Awareness. The objective of this study is to map GSCM tools and identify their practice in a consumer goods industry in the Vale do Paraiba. Data were collected from the database of the company chosen as the object of study, as well as through on-site visits and interviews. The results showed that the Green Operation, Green Manufacturing, Green Innovation and Green Sourcing tools are applied in the company, and only the Customer Awareness tool showed no practice at all. For the other tools, an ideology or interest of the company in applying them was identified.
Abstract:
Objective—To identify major environmental and farm management factors associated with the occurrence of tuberculosis (TB) on cattle farms in northeastern Michigan. Design—Case-control study. Sample Population—17 cattle farms with infected cattle and 51 control farms. Procedure—Each case farm (laboratory confirmed diagnosis of Mycobacterium bovis infection) was matched with 2 to 4 control farms (negative whole-herd test results within previous 12 months) on the basis of type of farm (dairy or beef) and location. Cattle farm data were collected from in-person interviews and mailed questionnaires. Wildlife TB data were gathered through state wildlife surveillance. Environmental data were gathered from a satellite image-based geographic information system. Multivariable conditional logistic regression for matched analysis was performed. Results—Major factors associated with increased farm risk of TB were higher TB prevalence among wild deer and cattle farms in the area, herd size, and ponds or creeks in cattle housing areas. Factors associated with reduced farm risk of TB were greater amounts of natural open lands in the surrounding area and reducing deer access to cattle housing areas by housing cattle in barns, barnyards, or feedlots and use of electrified wire or barbed wire for livestock fencing. Conclusions and Clinical Relevance—Results suggest that certain environmental and management factors may be associated with risk of TB on cattle farms.