Abstract:
This study explored the factors associated with state-level allocations to tobacco-control programs. The primary research question was whether public sentiment regarding tobacco control was a significant factor in the states' 2001 budget decisions. In addition to public opinion, several additional political and economic measures were considered. Significant associations were found between our outcome, state-level tobacco-control funding per capita, and key variables of interest including public opinion, amount of tobacco settlement received, the party affiliation of the governor, the state's smoking rate, excise tax revenue received, and whether the state was a major producer of tobacco. The findings from this study supported our hypothesis that states with citizens who favor more restrictive indoor air policies allocate more to tobacco control. Effective public education to change public opinion and the cultural norms surrounding smoking may affect political decisions and, in turn, increase funding for crucial public health programs.
Abstract:
Successfully predicting the frequency dispersion of electronic hyperpolarizabilities is an unresolved challenge in materials science and electronic structure theory. We show that the generalized Thomas-Kuhn sum rules, combined with linear absorption data and measured hyperpolarizability at one or two frequencies, may be used to predict the entire frequency-dependent electronic hyperpolarizability spectrum. This treatment includes two- and three-level contributions that arise from the lowest two or three excited electronic state manifolds, enabling us to describe the unusual observed frequency dispersion of the dynamic hyperpolarizability in high oscillator strength M-PZn chromophores, where (porphinato)zinc(II) (PZn) and metal(II)polypyridyl (M) units are connected via an ethyne unit that aligns the high oscillator strength transition dipoles of these components in a head-to-tail arrangement. We show that some of these structures can possess very similar linear absorption spectra yet manifest dramatically different frequency-dependent hyperpolarizabilities, because of three-level contributions that result from excited state-to-excited state transition dipoles among charge-polarized states. Importantly, this approach provides a quantitative scheme to use linear optical absorption spectra and very limited individual hyperpolarizability measurements to predict the entire frequency-dependent nonlinear optical response. Copyright © 2010 American Chemical Society.
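For illustrative context only (this is the textbook starting point, not the paper's own generalized sum-rule derivation): the standard two-level (Oudar-Chemla) dispersion model, which treatments like this one extend with three-level terms, gives for second-harmonic generation

```latex
\beta_{2L}(-2\omega;\omega,\omega) \;=\; \beta_0\,
\frac{\omega_0^{4}}{\left(\omega_0^{2}-\omega^{2}\right)\left(\omega_0^{2}-4\omega^{2}\right)}
```

where \(\omega_0\) is the transition frequency of the dominant excited state and \(\beta_0\) the static hyperpolarizability. The three-level contributions described in the abstract add terms involving excited state-to-excited state transition dipoles that this simple two-level form omits.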
Abstract:
BACKGROUND: Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. METHODS AND PRINCIPAL FINDINGS: The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. CONCLUSIONS: Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks.
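The reported figure scales raw audit counts to errors per 10,000 fields. A minimal sketch of that calculation follows; the counts used are hypothetical, chosen only to show how a figure like 14.3 arises, and the function name is ours rather than the study's:

```python
def error_rate_per_10k(errors_found: int, fields_audited: int) -> float:
    """Scale a raw audit count to an error rate per 10,000 fields."""
    if fields_audited <= 0:
        raise ValueError("fields_audited must be positive")
    return errors_found / fields_audited * 10_000

# Hypothetical audit: 50 discrepancies across 35,000 inspected fields.
rate = error_rate_per_10k(50, 35_000)  # about 14.3 errors per 10,000 fields
```

Expressing rates per 10,000 fields lets audits of very different sizes be compared directly against published benchmarks.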
Abstract:
The Haemophilus influenzae HMW1 adhesin is a high-molecular weight protein that is secreted by the bacterial two-partner secretion pathway and mediates adherence to respiratory epithelium, an essential early step in the pathogenesis of H. influenzae disease. In recent work, we discovered that HMW1 is a glycoprotein and undergoes N-linked glycosylation at multiple asparagine residues with simple hexose units rather than N-acetylated hexose units, revealing an unusual N-glycosidic linkage and suggesting a new glycosyltransferase activity. Glycosylation protects HMW1 against premature degradation during the process of secretion and facilitates HMW1 tethering to the bacterial surface, a prerequisite for HMW1-mediated adherence. In the current study, we establish that the enzyme responsible for glycosylation of HMW1 is a protein called HMW1C, which is encoded by the hmw1 gene cluster and shares homology with a group of bacterial proteins that are generally associated with two-partner secretion systems. In addition, we demonstrate that HMW1C is capable of transferring glucose and galactose to HMW1 and is also able to generate hexose-hexose bonds. Our results define a new family of bacterial glycosyltransferases.
Abstract:
This volume originated in HASTAC’s first international conference, “Electronic Techtonics: Thinking at the Interface,” held at Duke University during April 19-21, 2007. “Electronic Techtonics” was the site of truly unforgettable conversations and encounters that traversed domains, disciplines, and media – conversations that explored the fluidity of technology both as interface as well as at the interface. This hardcopy version of the conference proceedings is published in conjunction with its electronic counterpart (found at www.hastac.org). Both versions exist as records of the range and depth of conversations that took place at the conference. Some of the papers in this volume are almost exact records of talks given at the conference, while others are versions that were revised and reworked some time after the conference. These papers are drawn from a variety of fields and we have not made an effort to homogenize them in any way, but have instead retained the individual format and style of each author.
Abstract:
BACKGROUND: The Affordable Care Act encourages healthcare systems to integrate behavioral and medical healthcare, as well as to employ electronic health records (EHRs) for health information exchange and quality improvement. Pragmatic research paradigms that employ EHRs in research are needed to produce clinical evidence in real-world medical settings for informing learning healthcare systems. Adults with comorbid diabetes and substance use disorders (SUDs) tend to use costly inpatient treatments; however, there is a lack of empirical data on implementing behavioral healthcare to reduce health risk in adults with high-risk diabetes. Given the complexity of high-risk patients' medical problems and the cost of conducting randomized trials, a feasibility project is warranted to guide practical study designs. METHODS: We describe the study design, which explores the feasibility of implementing substance use Screening, Brief Intervention, and Referral to Treatment (SBIRT) among adults with high-risk type 2 diabetes mellitus (T2DM) within a home-based primary care setting. Our study includes the development of an integrated EHR datamart to identify eligible patients and collect diabetes healthcare data, and the use of a geographic health information system to understand the social context in patients' communities. Analysis will examine recruitment, proportion of patients receiving brief intervention and/or referrals, substance use, SUD treatment use, diabetes outcomes, and retention. DISCUSSION: By capitalizing on an existing T2DM project that uses home-based primary care, our study results will provide timely clinical information to inform the designs and implementation of future SBIRT studies among adults with multiple medical conditions.
Abstract:
College students receive a wealth of information through electronic communications that they are unable to process efficiently. This information overload negatively impacts their affect, defined in psychology as the experience of feeling or emotion. To address this problem, we postulated that we could create an application that organizes and presents incoming content in a manner that optimizes users' ability to process information. First, we conducted surveys that quantitatively measured each participant's psychological affect while handling electronic communications; the results were used to tailor the features of the application to users' desires. After designing and implementing the application, we again measured users' affect while they used the product. Our goal was to show that the program promoted a positive change in affect. Our application, Brevitus, was able to match Gmail on affect reduction profiles while succeeding in implementing certain user interface specifications.
Abstract:
Developed for use with triple GEM detectors, the GEM Electronic Board (GEB) forms a crucial part of the electronics readout system being developed as part of the CMS muon upgrade program. The objective of the GEB is threefold: to provide stable powering and grounding for the VFAT3 front ends, to enable high-speed communication between 24 VFAT3 front ends and an optohybrid, and to shield the GEM detector from electromagnetic interference. The paper describes the concept and design of a large-size GEB in detail, highlighting the challenges in terms of design and feasibility of this deceptively difficult system component.
Abstract:
Software technology that predicts stress in electronic systems and packages, developed as part of a TCS Programme, is described. The software is closely integrated within a thermal design tool, providing the ability to simulate the coupled effects of airflow, temperature, and stress on product performance. This integrated approach to analysis will help decrease the number of design cycles.
Abstract:
This paper presents a proactive approach to load sharing and describes the architecture of a scheme, Concert, based on this approach. A proactive approach is characterized by a shift of emphasis from reacting to load imbalance to avoiding its occurrence. In contrast, in a reactive load sharing scheme, activity is triggered when a processing node is either overloaded or underloaded. The main drawback of this approach is that a load imbalance is allowed to develop before costly corrective action is taken. Concert is a load sharing scheme for loosely-coupled distributed systems. Under this scheme, load and task behaviour information is collected and cached in advance of when it is needed. Concert uses Linux as a platform for development. Implemented partially in kernel space and partially in user space, it achieves transparency to users and applications whilst keeping the extent of kernel modifications to a minimum. Non-preemptive task transfers are used exclusively, motivated by lower complexity, lower overheads and faster transfers. The goal is to minimize the average response-time of tasks. Concert is compared with other schemes by considering the level of transparency it provides with respect to users, tasks and the underlying operating system.
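The proactive idea above, caching load information ahead of need so that placement never waits on a probe, can be sketched as follows. This is an illustrative sketch only; the class and method names are ours and do not reflect Concert's actual kernel- and user-space implementation:

```python
class ProactiveScheduler:
    """Caches node load readings in advance of task arrivals, in the
    spirit of proactive load sharing: avoid imbalance rather than
    react to it after it develops."""

    def __init__(self, nodes):
        # Start with no load information; all nodes assumed idle.
        self.load_cache = {node: 0.0 for node in nodes}

    def refresh(self, readings):
        # Called periodically in the background (push or pull), so the
        # cache is already warm before any placement decision is made.
        self.load_cache.update(readings)

    def place_task(self):
        # Non-preemptive placement: choose the least-loaded node using
        # only cached data -- no synchronous load probe at decision time.
        return min(self.load_cache, key=self.load_cache.get)
```

A reactive scheme, by contrast, would gather load information only after detecting an overloaded or underloaded node, by which point the imbalance has already developed.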
Abstract:
This paper demonstrates a modelling and design approach that couples computational mechanics techniques with numerical optimisation and statistical models for virtual prototyping and testing in different application areas concerning the reliability of electronic packages. The integrated software modules provide a design engineer in the electronic manufacturing sector with fast design and process solutions by optimising key parameters and taking into account the complexity of certain operational conditions. The integrated modelling framework is obtained by coupling the multi-physics finite element framework, PHYSICA, with the numerical optimisation tool, VisualDOC, into a fully automated design tool for solutions of electronic packaging problems. Response Surface Modelling methodology and Design of Experiments statistical tools, plus numerical optimisation techniques, are demonstrated as part of the modelling framework. Two different problems are discussed and solved using the integrated numerical FEM-optimisation tool. First, an example of thermal management of an electronic package on a board is illustrated. The location of the device is optimised to ensure reduced junction temperature and stress in the die, subject to a certain cooling air profile and other heat-dissipating active components. In the second example, thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability and subsequently used to optimise the lifetime of solder interconnects under thermal cycling.
Abstract:
This paper presents simulated computational fluid dynamics (CFD) results for comparison against experimental data. The performance of four turbulence models has been assessed for electronic application areas, considering both fluid flow and heat transfer phenomena. CFD is fast becoming a powerful and almost essential tool for design, development, and optimisation in engineering problems. However, turbulence models remain the key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the performance of the turbulence model employed together with the wall functions implemented. To resolve the abrupt changes in the turbulent energy and other parameters near the wall, a particularly fine mesh is necessary, which unfortunately increases computer storage requirements. The objective of turbulence modelling is to develop computational procedures of sufficient accuracy and generality for engineers to anticipate the Reynolds stresses and the scalar transport terms.
Abstract:
The electronics industry is developing rapidly, together with the increasingly complex problem of microelectronic equipment cooling. It has now become necessary for thermal design engineers to consider the problem of equipment cooling at some level. The use of Computational Fluid Dynamics (CFD) for such investigations is fast becoming a powerful and almost essential tool for the design, development, and optimisation of engineering applications. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt fluctuations experienced by the turbulent energy and other parameters at near-wall regions and shear layers, a particularly fine computational mesh is necessary, which inevitably increases computer storage and run-time requirements. This paper discusses results from an investigation into the accuracy of currently used turbulence models. A newly formulated transitional hybrid turbulence model is also introduced, with comparisons against experimental data.
Abstract:
In a mesh-based computational mechanics simulation process, the stages between model geometry creation and model analysis, such as mesh generation, are the most manpower-intensive, whereas the analysis itself is the most computing-intensive. Advanced computational hardware and software have significantly reduced computing time, and, more importantly, the trend is downward. For the kinds of models envisaged, which are larger, more complex in geometry and modelling, and multiphysics, there is no clear indication that the manpower-intensive phase will shorten significantly; in the present way of operation it is more likely to grow with model complexity. In this paper we address this dilemma through collaborating components for models in electronic packaging applications.