971 results for Electronic document identification systems


Relevance: 40.00%

Abstract:

Introduction: There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful, and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis: Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest, and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological Abstracts. Studies will be independently screened for eligibility by two reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by two reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and the Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A post-synthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method. Ethics and dissemination: The study does not require ethical approval as primary data will not be collected. The results of the study will be published in a peer-reviewed journal and presented at relevant conferences.

Relevance: 40.00%

Abstract:

The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. To that end, all the elements and components of the energy conversion system are modeled numerically and combined to form a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are circuit models that capture only the parasitic inner parameters of the power devices and the connections between components. This dissertation aims to obtain physics-based models of power conversion systems that not only represent the steady-state behavior of the components but also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition, and it enables the accurate design of components such as effective EMI filters, switching algorithms, and circuit topologies [7]. One application of the developed modeling technique is the design of new topologies for high-frequency, high-efficiency converters for variable speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications that can overcome the blocking-voltage limitations of available power semiconductor devices. Another advantage is the selection of the best-matching topology, with an inherent reduction of switching losses that can be exploited to improve overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints, including physical characteristics such as reduced package size and weight, optimized interactions with neighboring components, and higher power density. In addition, electromagnetic behaviors and signatures can be evaluated, including conducted and radiated EMI interactions, as well as the design of attenuation measures and enclosures.
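As a minimal illustration of why the high-frequency characteristics matter, consider the ringing a switching node exhibits because of loop parasitics, which a circuit-only model would miss if the parasitics are not extracted. The component values below are hypothetical, not taken from the dissertation:

```python
import math

# Hypothetical switching-node parasitics: loop inductance from package
# leads and interconnects, plus the device's effective output capacitance.
L_par = 20e-9    # 20 nH loop inductance
C_par = 500e-12  # 500 pF output capacitance

# Undamped ringing frequency and characteristic impedance of the loop;
# these set the EMI spectrum an effective filter must attenuate.
f_ring = 1 / (2 * math.pi * math.sqrt(L_par * C_par))
z0 = math.sqrt(L_par / C_par)
print(f"ringing ~ {f_ring / 1e6:.1f} MHz, Z0 ~ {z0:.1f} ohm")  # ~50.3 MHz, ~6.3 ohm
```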

Relevance: 40.00%

Abstract:

Background: As the use of electronic health records (EHRs) becomes more widespread, so does the need to search them and provide effective information discovery within them. Querying by keyword has emerged as one of the most effective paradigms for searching. Most work in this area is based on traditional Information Retrieval (IR) techniques, where each document is compared individually against the query. We compare the effectiveness of two fundamentally different techniques for keyword search of EHRs. Methods: We built two ranking systems. The traditional BM25 system exploits the EHRs' content without regard to associations among the entities within them. The Clinical ObjectRank (CO) system exploits the entities' associations in EHRs, using an authority-flow algorithm to discover the most relevant entities. BM25 and CO were deployed on an EHR dataset from the cardiovascular division of Miami Children's Hospital. Using sequences of keywords as queries, sensitivity and specificity were measured by two physicians for a set of 11 queries related to congenital cardiac disease. Results: Our pilot evaluation showed that CO outperforms BM25 in sensitivity (65% vs. 38%, a 71% relative improvement on average) while maintaining comparable specificity (64% vs. 61%). Conclusions: Authority-flow techniques can greatly improve the detection of relevant information in EHRs and hence deserve further study.
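ObjectRank-style authority flow belongs to the personalized-PageRank family: authority originates at the entities matching the query keywords and propagates along the links between entities. A minimal sketch of that idea, using a hypothetical three-entity graph (the actual CO system and its edge weights are not reproduced here):

```python
import numpy as np

def authority_flow_rank(adj, seed, d=0.85, iters=50):
    """Rank entities by authority flowing from query matches.
    adj: row-stochastic adjacency matrix over EHR entities;
    seed: 1.0 where an entity matches a query keyword, else 0.0."""
    seed = seed / seed.sum()
    r = seed.copy()
    for _ in range(iters):
        # Authority flows along links, restarting at the query matches.
        r = (1 - d) * seed + d * adj.T @ r
    return r

# Tiny hypothetical graph: 3 entities, entity 0 matches the query.
adj = np.array([[0.0, 0.5, 0.5],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
print(authority_flow_rank(adj, np.array([1.0, 0.0, 0.0])))
```

In contrast, BM25 would score each record in isolation from term statistics; the authority-flow score lets strongly connected but keyword-free entities surface.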

Relevance: 40.00%

Abstract:

In broad terms, including a thief's use of existing credit card, bank, or other accounts, the number of identity fraud victims in the United States ranges from 9 to 10 million per year, roughly 4% of the US adult population. The average annual theft per stolen identity was estimated at $6,383 in 2006, up approximately 22% from $5,248 in 2003; estimated total theft rose from $53.2 billion in 2003 to $56.6 billion in 2006. About three million Americans each year fall victim to the worst kind of identity fraud: new account fraud. Names, Social Security numbers, dates of birth, and other data are acquired fraudulently from the issuing organization or from the victim, and these data are then used to create fraudulent identity documents. In turn, these are presented to other organizations as evidence of identity and used to open new lines of credit, secure loans, "flip" property, or otherwise turn a profit in a victim's name. This is much more time-consuming, and typically more costly, to repair than fraudulent use of existing accounts. This research borrows from well-established theoretical backgrounds in an effort to answer the question: what is it that makes identity documents credible? Most importantly, identification of the components of credibility draws upon personal construct psychology, the underpinning of the repertory grid technique, a form of structured interviewing that arrives at a description of the interviewee's constructs on a given topic, such as the credibility of identity documents. This represents a substantial contribution to theory, being the first research to use the repertory grid technique to elicit from experts the mental constructs they use to evaluate the credibility of different types of identity documents reviewed in the course of opening new accounts. The research identified twenty-one characteristics, different subsets of which are present on different types of identity documents. Expert evaluations of these documents in different scenarios suggest that visual characteristics are most important for a physical document, while authenticated personal data are most important for a digital document.
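To make the repertory grid concrete: it is an elements-by-constructs ratings matrix, and a standard analysis step correlates constructs across elements to see which credibility cues experts treat alike. The documents, construct labels, and ratings below are hypothetical illustrations, not the thesis's elicited data:

```python
import numpy as np

# Elements (identity documents) x constructs (credibility cues), rated 1-5.
documents = ["passport", "driver licence", "utility bill", "digital ID"]
constructs = ["visible security features", "photo present",
              "issuer verifiable", "data authenticated"]
ratings = np.array([[5, 5, 5, 4],
                    [4, 5, 3, 3],
                    [1, 1, 2, 1],
                    [2, 1, 5, 5]], dtype=float)

# Correlating constructs across elements reveals which cues co-vary
# in the experts' judgements of credibility.
corr = np.corrcoef(ratings.T)
for name, row in zip(constructs, np.round(corr, 2)):
    print(f"{name}: {row}")
```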

Relevance: 40.00%

Abstract:

Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants, and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants, and the post-burn and post-blast residues of these propellants were analyzed. The ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues. Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
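A worked example of the exact-mass matching that the QToFMS enables: identification typically requires the measured m/z to fall within a few parts per million of the theoretical value. The ascorbate mass below is the standard monoisotopic value; the measured value and the 5 ppm tolerance are illustrative assumptions:

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass accuracy in parts per million, the usual exact-mass criterion."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Deprotonated ascorbic acid [C6H7O6]- has a theoretical m/z of 175.0248.
# A measured 175.0253 gives ~2.9 ppm, inside a typical 5 ppm tolerance.
print(ppm_error(175.0253, 175.0248))
```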

Relevance: 40.00%

Abstract:

We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities, Beijing Nova Programme.

Relevance: 40.00%

Abstract:

Automating the detection and identification of animals is a task of interest in several areas of biological research, as well as in the development of electronic surveillance systems. The author presents a detection and identification system based on stereo computer vision. Several criteria are used to identify the animals, but the emphasis is on the harmonic analysis of a real-time reconstruction of the animals' 3D shape. The result of the analysis is compared against others stored in an evolving knowledge base.
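As a rough 2D analogue of the harmonic shape analysis described (the thesis works on full 3D reconstructions), Fourier descriptors of a silhouette contour give a compact, comparable shape signature. Everything here is an illustrative sketch, not the thesis's method:

```python
import numpy as np

def fourier_descriptors(contour_xy, k=10):
    """contour_xy: (n, 2) array of ordered boundary points.
    Returns k harmonic magnitudes, normalized for scale invariance."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]  # encode points as complex numbers
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1:k + 1])                # drop DC term (removes translation)
    return mags / mags[0]                         # normalize out scale

def closest_class(descriptor, prototypes):
    """Nearest-neighbour match against a dictionary of stored signatures,
    standing in for the evolving knowledge base."""
    return min(prototypes, key=lambda name: np.linalg.norm(descriptor - prototypes[name]))
```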

Relevance: 40.00%

Abstract:

Cognitive deficits are central to psychosis and are observable several years before the first psychotic episode. Episodic memory impairment is frequently identified as one of the most severe, both in patients and, before the onset of the pathology, in at-risk populations. In psychotic patients, the neuropsychological study of memory processes has helped clarify the origin of this impairment. An alteration of source memory processes, which link a memory to its origin, has been identified and associated with the positive symptoms of psychosis, mainly hallucinations. However, source memory and the presence of subclinical symptoms have never been investigated before illness onset in a population at high genetic risk of psychosis (HRG). Studying them would show whether source memory deficits and hallucinatory experiences are associated with the onset of psychosis or precede its emergence, in which case they would constitute early indicators of pathology. To address this question, this thesis pursued three main objectives: 1) to characterize source memory functioning in an HRG population, in order to observe whether impairment of this process precedes illness onset; 2) to assess whether subclinical manifestations of psychotic symptoms, namely hallucinatory experiences, are identifiable in an at-risk population; and 3) to investigate whether source memory functioning and subclinical symptomatology are linked in an at-risk population, as has been documented in patients. The results of the thesis demonstrate that HRG individuals show a source memory impairment specific to attributing the temporal context of memories, as well as memory distortions manifested by fragmented memories and impaired metacognition in memory. Subclinical hallucinatory experiences were also found to be more frequent in the HRG group. Associations were documented between certain memory distortions and the propensity to hallucinate. These results identify new clinical and cognitive indicators of the risk of developing psychosis and support hypotheses linking internal-external source attribution of information to the development of the illness. The empirical, theoretical, methodological and clinical implications of the thesis are discussed.

Relevance: 40.00%

Abstract:

The present paper introduces a technology-enhanced teaching method that promotes deep learning. Four stages that correspond to four different student cohorts were used for its development and to analyse its effectiveness. The effectiveness of the method has been assessed in terms of examination results as well as results obtained from class response system software statistics. The evidence gathered indicates that the method developed is very effective and its implementation is straightforward. Furthermore, its success in achieving results seems to be independent of the skills and/or experience of the lecturer.

Relevance: 40.00%

Abstract:

Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of voltages and currents but also the underlying oscillations in a power system. Such access to dynamic data provides strong motivation and a useful tool for exploring dynamic-data-driven applications in power systems. To that end, this dissertation focuses on three areas: developing accurate dynamic load models and updating variable parameters based on measurement data; applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models; and addressing computational issues by implementing the balanced truncation method. By obtaining more realistic system models, together with timely updated parameters and consideration of stochastic influences, we can form an accurate portrait of the ongoing phenomena in an electric power system, and thereby further improve state estimation, stability analysis and real-time operation.
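Since the dissertation names balanced truncation specifically, a compact square-root implementation of that standard method for a stable linear system (A, B, C) can be sketched with SciPy. The three-state example system is hypothetical, chosen only to be stable, controllable and observable:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r (square-root method)."""
    P = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
    Lp = cholesky(P, lower=True)                  # P = Lp Lp^T
    Lq = cholesky(Q, lower=True)                  # Q = Lq Lq^T
    U, s, Vt = svd(Lq.T @ Lp)                     # s: Hankel singular values
    S = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt.T[:, :r] @ S                      # reduced -> full coordinates
    Ti = S @ U[:, :r].T @ Lq.T                    # full -> reduced coordinates
    return Ti @ A @ T, Ti @ B, C @ T, s

# Hypothetical 3-state stable system reduced to 2 states.
A = np.array([[-1.0, 0.5, 0.0], [0.0, -2.0, 1.0], [0.0, 0.0, -3.0]])
B = np.array([[1.0], [0.0], [1.0]])
C = np.array([[1.0, 1.0, 0.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, 2)
print(hsv)  # discarded states correspond to small Hankel singular values
```

The Hankel singular values returned alongside the reduced model indicate how much input-output behavior the truncation discards, which is what makes the method attractive for large power system models.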

Relevance: 40.00%

Abstract:

With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and the time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings further technical challenges to quantifying design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: 1) rigid-to-rigid connections with a thin flexible polymeric layer in between, or 2) a thin film membrane on a rigid structure. Because every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to accommodate all the interfaces encountered in emerging electronic packaging technologies. For evaluating high-adhesion-strength interfaces in thin multilayer structures, a novel adhesion test configuration called the "single cantilever adhesion test" (SCAT) is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing two potential EMC/PSR material sets and selecting the stronger one. Additionally, a theoretical approach for establishing the applicable testing domain of a four-point bending test method is presented. For evaluating polymeric films on rigid substrates, the major testing challenges are reducing testing scatter and factoring in the potentially degrading effect of environmental conditioning on the material properties of the film. An advanced blister test with a predefined area was developed that employs an elasto-plastic analytical solution, and it was implemented for a conformal coating used to prevent tin whisker growth. This method was then extended with a numerical approach for evaluating the adhesion strength when the polymer film's properties are unknown.

Relevance: 40.00%

Abstract:

This dissertation explores three aspects of the economics and policy issues surrounding retail payments (low-value frequent payments): the microeconomic aspect, by measuring costs associated with retail payment instruments; the macroeconomic aspect, by quantifying the impact of the use of electronic rather than paper-based payment instruments on consumption and GDP; and the policy aspect, by identifying barriers that keep countries stuck with outdated payment systems and recommending policy interventions to move forward with payments modernization. Payment system modernization has become a prominent part of the financial sector reform agenda in many advanced and developing countries. Greater use of electronic payments rather than cash and other paper-based instruments would have important economic and social benefits, including lower costs, and thereby increased economic efficiency and higher incomes, while broadening access to the financial system, notably for people with moderate and low incomes. The dissertation starts with a general introduction on retail payments. Chapter 1 develops a theoretical model for measuring payment costs and applies the model to Guyana, an emerging market in the midst of the transition from paper to electronic payments. Using primary survey data from Guyanese consumers, the analysis indicates that annual costs related to consumers' use of cash reach 2.5 percent of the country's GDP; switching to electronic payment instruments would provide savings amounting to 1 percent of GDP per year. Chapter 2 broadens the analysis to calculate the macroeconomic impact of a move to electronic payments. Using a unique panel dataset of 76 countries spanning the 17 years from 1998 to 2014 and a pooled OLS model with country fixed effects, Chapter 2 finds that, on average, the use of debit and credit cards contributes USD 16.2 billion to annual global consumption and USD 160 billion to overall annual global GDP. Chapter 3 provides an in-depth assessment of the Albanian payment card and remittance market and recommends a set of incentives and regulations (both carrots and sticks) that would allow the country to modernize its payment system. Finally, the conclusion summarizes the lessons of the dissertation's research and brings forward issues to be explored by future research in the retail payments area.
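A pooled OLS specification with country fixed effects, as used in Chapter 2, can be sketched with statsmodels. The file name and column names below are hypothetical stand-ins for the 76-country panel, not the dissertation's dataset:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per country-year, 1998-2014.
df = pd.read_csv("payments_panel.csv")  # columns: country, year, card_use, consumption

# Pooled OLS with country fixed effects (entity dummies); standard errors
# clustered by country to allow within-country serial correlation.
model = smf.ols("consumption ~ card_use + C(country)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["country"]})
print(model.params["card_use"], model.bse["card_use"])
```

The country dummies absorb time-invariant national characteristics, so the card-use coefficient is identified from within-country variation over the 17-year span.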

Relevance: 40.00%

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and solving it with conventional mathematical methods is often challenging due to its ill-posed nature. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), together with numerical analysis techniques to properly simulate the geomechanical system. The widely used layered pavement analysis program ILLI-PAVE was employed to analyze various flexible pavement types, including full-depth asphalt and conventional flexible pavements, built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate, as transportation geomaterials, were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models of the nonlinear finite element program ILLI-PAVE to provide faster solutions. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop the SOFTSYS models. The solution of the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial degree of non-uniqueness in the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performance. Still, the SOFTSYS models were shown to work effectively with synthetic data obtained from ILLI-PAVE finite element solutions, and in general the SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For validation, field-collected FWD data were successfully used to predict the layer thicknesses and layer moduli of in-service flexible pavements. Some of the most promising results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimates of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations also produced meaningful results: thickness data obtained from Ground Penetrating Radar testing matched reasonably well with the SOFTSYS predictions, and the differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to deflection data variability in the FWD tests. The backcalculated asphalt concrete layer thicknesses matched better for full-depth asphalt pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
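The SOFTSYS architecture (an ANN surrogate standing in for the finite element program, searched by an evolutionary loop) can be sketched as follows. The forward model here is a crude synthetic stand-in for ILLI-PAVE, and all parameter ranges are invented for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
LOW, HIGH = np.array([100.0, 150.0, 50.0]), np.array([500.0, 600.0, 200.0])

def forward(params):
    """Synthetic stand-in for an ILLI-PAVE run: layer parameters
    (HMA thickness, base thickness, subgrade modulus) -> 7-sensor basin."""
    stiffness = params @ np.array([1.0, 0.5, 1.0])
    return (1e4 / stiffness)[:, None] / np.arange(1, 8)

# Train the ANN surrogate on sampled forward-model runs.
X = rng.uniform(LOW, HIGH, size=(2000, 3))
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, forward(X))

# GA-style backcalculation: evolve parameters toward a measured basin.
target = forward(np.array([[250.0, 300.0, 120.0]]))[0]
pop = rng.uniform(LOW, HIGH, size=(200, 3))
for _ in range(40):
    err = np.linalg.norm(ann.predict(pop) - target, axis=1)
    parents = pop[np.argsort(err)[:50]]                       # selection
    children = parents[rng.integers(0, 50, size=150)]
    children = np.clip(children + rng.normal(0, 5, children.shape), LOW, HIGH)  # mutation
    pop = np.vstack([parents, children])
best = pop[np.argmin(np.linalg.norm(ann.predict(pop) - target, axis=1))]
print(best)  # note: this toy forward model is non-unique in its parameters,
             # mirroring the backcalculation ill-posedness discussed above
```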

Relevance: 40.00%

Abstract:

Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects, including the Coulomb blockade and correlated electron tunnelling. They allow transistor-like control of the transfer of single carriers and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance. For ultrasmall junctions, room-temperature operation at sub-picosecond timescales seems feasible. However, these devices are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as macroscopic quantum tunnelling of charge may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems), and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from the physical system geometry; the mathematical model which should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems.
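The mathematical model of electron tunnelling used in such simulators is usually the orthodox-theory rate expression. A minimal sketch follows, with hypothetical junction values (the thesis's actual simulation code is not reproduced here):

```python
import math

E = 1.602176634e-19   # elementary charge (C)
KB = 1.380649e-23     # Boltzmann constant (J/K)

def tunnel_rate(dF, r_t, temp):
    """Orthodox-theory tunnelling rate for one junction.
    dF: free-energy gain of the tunnel event (J); r_t: tunnel resistance (ohm).
    Gamma = (dF / e^2 R_T) / (1 - exp(-dF / kT)); for dF < 0 the rate is
    exponentially suppressed -- the Coulomb blockade. (At dF = 0 the limit
    is kT / (e^2 R_T); this sketch does not special-case it.)"""
    return (dF / (E**2 * r_t)) / (1.0 - math.exp(-dF / (KB * temp)))

# Voltage-biased junction of capacitance C: dF = e*V - e^2/(2C), so
# tunnelling is blocked until V exceeds e/(2C) (0.08 V for C = 1 aF).
C, V = 1e-18, 0.05
dF = E * V - E**2 / (2 * C)
print(tunnel_rate(dF, 100e3, 4.2))  # vanishingly small: blockade at 4.2 K
```

A junction-array simulator of the kind described would evaluate this rate for every possible tunnel event, then advance the system state by Monte Carlo selection among the events.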