957 results for DATABASES
Abstract:
Background There is a comprehensive literature on the academic outcomes (attrition and success) of students in traditional/baccalaureate nursing programs, but much less is known about the academic outcomes of students in accelerated nursing programs. The aim of this systematic review is to report on the attrition and success rates (either internal examination or NCLEX-RN) of accelerated students, compared to traditional students. Methods For the systematic review, the databases (PubMed, CINAHL and PsycINFO) and Google Scholar were searched using the search terms ‘accelerated’ or ‘accreditation for prior learning’, ‘fast-track’ or ‘top up’ and ‘nursing’ with ‘attrition’ or ‘retention’ or ‘withdrawal’ or ‘success’ from 1994 to January 2016. All relevant articles were included, regardless of quality. Results The findings of 19 studies of attrition rates and/or success rates for accelerated students are reported. For international accelerated students, there were only three studies, which are heterogeneous and have major limitations. One of the three studies showed lower attrition rates, and one showed higher success rates, than traditional students. In contrast, another study showed high attrition and low success for international accelerated students. For graduate accelerated students, most of the studies are of high quality and showed rates similar to or better than those of traditional students: five of six studies showed similar or lower attrition rates, and four of these studies, together with an additional seven studies of success rates only, showed similar or better success rates than traditional students. There are only three studies of non-university graduate accelerated students, and although these had weaknesses, they were consistent in reporting higher attrition rates than traditional students. Conclusions The paucity and weakness of the information available make it unclear how international accelerated students fare in terms of attrition and/or success in nursing programs. The good information available suggests that accelerated programs may be working reasonably well for graduate students. However, the limited information available for non-university graduate students is weak, but consistent, in suggesting they may struggle in accelerated courses. Further studies are needed to determine the attrition and success rates of accelerated students, particularly international and non-university graduate students.
Abstract:
Purpose: To develop a unique skin safety model (SSM) that offers a new and unified perspective on the diverse yet interconnected antecedents that contribute to a spectrum of potential iatrogenic skin injuries in older hospitalized adults. Organizing Construct: Discussion paper. Methods: A literature search of electronic databases was conducted for published articles written in English addressing skin integrity and iatrogenic skin injury in elderly hospital patients between 1960 and 2014. Findings: There is a multiplicity of literature outlining the etiology, prevention, and management of specific iatrogenic skin injuries. Complex and interrelated factors contribute to iatrogenic skin injury in the older adult, including multiple comorbidities, factors influencing healthcare delivery, and acute situational stressors. A range of injuries can result when these factors are complicated by skin irritants, pressure, shear, or friction; however, despite skin injuries sharing multiple antecedents, no unified overarching skin safety conceptual model has been published. Conclusions: The SSM presented in this article offers a new, unified framework that encompasses the spectrum of antecedents to skin vulnerability as well as the spectrum of iatrogenic skin injuries that may be sustained by older acute care patients. Current skin integrity frameworks address prevention and management of specific skin injuries. In contrast, the SSM recognizes the complex interplay of patient and system factors that may result in a range of iatrogenic skin injuries. Skin safety is reconceptualized into a single model that has the potential for application at the individual patient level, as well as health-care systems and governance levels. Clinical Relevance: Skin safety is concerned with keeping skin safe from any iatrogenic skin injury, and remains an ongoing challenge for healthcare providers. A conceptual framework that encompasses all of the factors that may contribute to a range of iatrogenic skin injuries is essential, and guides the clinician in maintaining skin integrity in the vulnerable older patient.
Abstract:
Mobile applications are being increasingly deployed on a massive scale in various mobile sensor grid database systems. With limited resources on mobile devices, how to process the huge number of queries from mobile users against distributed sensor grid databases becomes a critical problem for such mobile systems. While the fundamental semantic cache technique has been investigated for query optimization in sensor grid database systems, the problem remains difficult because more realistic multi-dimensional constraints have not been considered in existing methods. To solve the problem, a new semantic cache scheme is presented in this paper for location-dependent data queries in distributed sensor grid database systems. It considers multi-dimensional constraints or factors in a unified cost model architecture, determines the parameters of the cost model by using the concept of Nash equilibrium from game theory, and makes semantic cache decisions from the established cost model. Scenarios involving three factors, namely semantics, time, and location, are investigated as special cases that improve on existing methods. Experiments are conducted to demonstrate the semantic cache scheme presented in this paper for distributed sensor grid database systems.
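The abstract does not give the cost model's exact form, so the following is only a minimal sketch of the general idea: a weighted combination of three hypothetical factors (semantic similarity, temporal recency, and spatial proximity) used to rank cached fragments for eviction. The weight values stand in for the Nash-equilibrium-derived parameters mentioned above and are chosen arbitrarily here.

```python
import math
from dataclasses import dataclass

@dataclass
class CacheEntry:
    semantic_similarity: float  # overlap between cached predicate and incoming query, in [0, 1]
    age_seconds: float          # time since the fragment was cached
    distance_km: float          # distance between the caching node and the querying client

def benefit_score(entry: CacheEntry,
                  w_semantic: float = 0.5,
                  w_time: float = 0.3,
                  w_location: float = 0.2,
                  time_scale: float = 600.0,
                  distance_scale: float = 5.0) -> float:
    """Combine the three factors into a single retention benefit.

    Higher scores mean the entry is more valuable to keep; the entry with the
    lowest score is the eviction candidate. The weights are placeholders for
    the game-theoretically derived parameters described in the paper.
    """
    recency = math.exp(-entry.age_seconds / time_scale)        # decays with age
    proximity = math.exp(-entry.distance_km / distance_scale)  # decays with distance
    return (w_semantic * entry.semantic_similarity
            + w_time * recency
            + w_location * proximity)

# Example: pick the eviction victim among three cached fragments.
cache = [CacheEntry(0.9, 120, 1.0), CacheEntry(0.4, 30, 0.5), CacheEntry(0.7, 900, 8.0)]
victim = min(range(len(cache)), key=lambda i: benefit_score(cache[i]))
print("evict entry", victim)
```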
Abstract:
- Objectives To explore whether active learning principles can be applied to nursing bioscience assessments and whether this influences students' perceived confidence in applying theory to practice. - Design and Data Sources A review of the literature utilising searches of various databases including CINAHL, PubMed, Google Scholar and Mosby's Journal Index. - Methods The literature search identified research from twenty-six original articles, two electronic books, one published book and one conference proceedings paper. - Results Bioscience has been identified as an area that nurses struggle to learn in tertiary institutions and then apply to clinical practice. A number of problems that may contribute to this poor understanding and retention have been identified and explored. University academics need to be knowledgeable about innovative teaching and assessment modalities that focus on enhancing student learning and address the integration issues associated with the theory-practice gap. Increased bioscience education is associated with improved patient outcomes; therefore, by addressing this “bioscience problem” and improving the integration of bioscience into clinical practice, there will subsequently be an improvement in health care outcomes. - Conclusion From the literature several themes were identified. First, there are many problems with teaching bioscience to nursing students. These include class sizes, motivation, concentration, delivery mode, lecturer perspectives, students' previous knowledge, anxiety, and a lack of confidence. Among these influences, the type of assessment employed by the educator has not been explored or identified as a contributor to student learning specifically in nursing bioscience instruction. Second, education could be delivered more effectively if active learning principles were applied and the needs and expectations of the student were met. Lastly, assessment influences student retention and the student experience; as such, assessment should be congruent with the subject content, align with the learning objectives and be used as a stimulus tool for learning.
Abstract:
The hype cycle model traces the evolution of technological innovations as they pass through successive stages marked by the peak, disappointment, and recovery of expectations. Since its introduction by Gartner nearly two decades ago, the model has received growing interest from practitioners, and more recently from scholars. Given the model's proclaimed capacity to forecast technological development, an important consideration for organizations in formulating marketing strategies, this paper provides a critical review of the hype cycle model by seeking evidence from Gartner's own technology databases for the manifestation of hype cycles. The results of our empirical work show incongruences with Gartner's own reports, which motivates us to consider possible future directions whereby the notion of hype or hyped dynamics (though not necessarily the hype cycle model itself) can be captured in existing life cycle models through the identification of peak, disappointment, and recovery patterns.
Abstract:
We study how probabilistic reasoning and inductive querying can be combined within ProbLog, a recent probabilistic extension of Prolog. ProbLog can be regarded as a database system that supports both probabilistic and inductive reasoning through a variety of querying mechanisms. After a short introduction to ProbLog, we provide a survey of the different types of inductive queries that ProbLog supports, and show how it can be applied to the mining of large biological networks.
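ProbLog itself is Prolog-based, but its distribution semantics can be illustrated in plain Python: the probability of a query is the summed probability of the possible worlds (truth assignments to the probabilistic facts) in which the query holds. The sketch below is not ProbLog code; it enumerates possible worlds for a tiny invented probabilistic graph and computes the probability that a path exists between two nodes, the kind of network query the abstract refers to.

```python
from itertools import product

# Probabilistic edges of a tiny network (ProbLog would write e.g. "0.8::edge(a, b).").
prob_edges = {("a", "b"): 0.8, ("b", "c"): 0.7, ("a", "c"): 0.4}

def has_path(edges, src, dst):
    """Reachability over the directed edges present in one possible world."""
    frontier, seen = [src], {src}
    while frontier:
        node = frontier.pop()
        if node == dst:
            return True
        for (u, v) in edges:
            if u == node and v not in seen:
                seen.add(v)
                frontier.append(v)
    return False

def query_probability(src, dst):
    """Sum the probabilities of all possible worlds in which dst is reachable from src."""
    total = 0.0
    edge_list = list(prob_edges)
    for world in product([True, False], repeat=len(edge_list)):
        weight = 1.0
        present = []
        for edge, included in zip(edge_list, world):
            p = prob_edges[edge]
            weight *= p if included else (1.0 - p)
            if included:
                present.append(edge)
        if has_path(present, src, dst):
            total += weight
    return total

print(f"P(path(a, c)) = {query_probability('a', 'c'):.4f}")
```

Exhaustive enumeration is exponential in the number of probabilistic facts and only serves to make the semantics concrete; the actual system uses far more efficient inference.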
Abstract:
Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide as well as the share of these articles available openly on the Web either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches. Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies which have studied the number of copies in identified repositories, since we start from a random sample of articles and then test if copies can be found by a Web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number 4.6% became immediately openly available and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the on-going debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
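As a quick arithmetic illustration of the shares reported above, the percentages can be applied to the estimated 1,350,000 articles for 2006 (rounded figures; the study's exact counts may differ).

```python
# Approximate absolute numbers implied by the reported 2006 estimates.
total_articles = 1_350_000

gold_immediate = 0.046 * total_articles   # openly available immediately
gold_delayed = 0.035 * total_articles     # open after a (typically one-year) embargo
green_copies = 0.113 * total_articles     # usable copies in repositories or on home pages

print(f"Immediately open:      {gold_immediate:,.0f}")
print(f"Open after embargo:    {gold_delayed:,.0f}")
print(f"Repository/home pages: {green_copies:,.0f}")
print(f"Share openly available in some form: {0.046 + 0.035 + 0.113:.1%}")
```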
Abstract:
Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate detection of remotely related proteins. We have demonstrated recently the employment of Cascade PSI-BLAST where we perform PSI-BLAST for many 'generations', initiating searches from new homologues as well. Such a rigorous propagation through generations of PSI-BLAST employs effectively the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds and its performance in detecting superfamily level relationships is ~35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
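The cascading strategy can be sketched as a breadth-first propagation of searches in which every new homologue found in one generation seeds a fresh PSI-BLAST search in the next. The sketch below only illustrates this generation logic, not the server's actual implementation; `run_psiblast` is a caller-supplied placeholder (for example, a wrapper around a local PSI-BLAST installation), and the toy hit table exists only to exercise the loop.

```python
from typing import Callable, Dict, Set

def cascade_search(seed_id: str,
                   run_psiblast: Callable[[str], Set[str]],
                   max_generations: int = 3) -> Set[str]:
    """Propagate PSI-BLAST searches through generations of newly found homologues.

    `run_psiblast` maps a query identifier to the set of hit identifiers it finds.
    """
    found: Set[str] = {seed_id}
    frontier: Set[str] = {seed_id}
    for _generation in range(max_generations):
        next_frontier: Set[str] = set()
        for query in frontier:
            next_frontier |= run_psiblast(query) - found  # keep only genuinely new homologues
        if not next_frontier:
            break  # converged: no new intermediates left to search from
        found |= next_frontier
        frontier = next_frontier
    return found - {seed_id}

# Toy stand-in for real PSI-BLAST results, purely to exercise the cascade logic.
toy_hits: Dict[str, Set[str]] = {
    "seed": {"homolog1"},
    "homolog1": {"homolog2", "seed"},
    "homolog2": set(),
}
print(cascade_search("seed", lambda q: toy_hits.get(q, set())))
```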
Abstract:
Multimedia mining primarily involves information analysis and retrieval based on implicit knowledge. The ever-increasing number of digital image databases on the Internet has created a need for using multimedia mining on these databases for effective and efficient retrieval of images. The contents of an image can be expressed in terms of different features such as Shape, Texture and Intensity-distribution (STI). Content-Based Image Retrieval (CBIR) is the efficient retrieval of relevant images from large databases based on features extracted from the images. Most existing systems either concentrate on a single representation of all features or on a linear combination of these features. This paper proposes a CBIR system named STIRF (Shape, Texture, Intensity-distribution with Relevance Feedback) that uses a neural network for nonlinear combination of the heterogeneous STI features. Further, the system is self-adaptable to different applications and users based upon relevance feedback. Prior to retrieval of relevant images, each feature is first clustered independently of the others in its own space, which helps in matching similar images. Testing the system on a database of images with varied contents and intensive backgrounds showed good results, with the most relevant images being retrieved for an image query. The system showed better and more robust performance compared to existing CBIR systems.
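The abstract does not specify the network used by STIRF, so the sketch below is only a generic illustration of the idea: one similarity score per feature channel (shape, texture, intensity distribution) is fed into a small nonlinear combiner whose weights are nudged by binary relevance feedback. The layer sizes, learning rate, and update rule are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer combiner: 3 per-feature similarities -> 4 hidden units -> relevance score.
W1 = rng.normal(scale=0.5, size=(3, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

def combine(similarities: np.ndarray) -> float:
    """Nonlinearly combine shape/texture/intensity similarities into one relevance score."""
    hidden = np.tanh(similarities @ W1)                    # shape (4,)
    return float(1.0 / (1.0 + np.exp(-(hidden @ W2)[0])))  # sigmoid output in (0, 1)

def feedback_update(similarities: np.ndarray, relevant: bool, lr: float = 0.1) -> None:
    """One crude relevance-feedback step: nudge the weights toward the user's judgement."""
    global W1, W2
    target = 1.0 if relevant else 0.0
    hidden = np.tanh(similarities @ W1)
    delta = combine(similarities) - target  # gradient of cross-entropy w.r.t. the pre-sigmoid score
    grad_W2 = hidden[:, None] * delta                                            # shape (4, 1)
    grad_W1 = np.outer(similarities, (W2[:, 0] * delta) * (1.0 - hidden ** 2))   # shape (3, 4)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

# One query compared against a candidate image: [shape, texture, intensity] similarities in [0, 1].
sims = np.array([0.8, 0.6, 0.3])
print("score before feedback:", combine(sims))
feedback_update(sims, relevant=True)   # user marks the candidate as relevant
print("score after feedback: ", combine(sims))
```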
Abstract:
The recent spurt of research activity in the Entity-Relationship approach to databases calls for a close scrutiny of the semantics of the underlying Entity-Relationship models, data manipulation languages, data definition languages, etc. For reasons well known, it is very desirable and sometimes imperative to give a formal description of the semantics. In this paper, we consider a specific ER model, the generalized Entity-Relationship model (without attributes on relationships), and give denotational semantics for the model as well as a simple ER algebra based on the model. Our formalism is based on the Vienna Development Method—the meta language (VDM). We also discuss the salient features of the given semantics in detail and suggest directions for further work.
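For readers unfamiliar with ER algebras, the toy sketch below (plain Python rather than VDM, and not the paper's formalism) models entity sets and attribute-free relationship sets and implements two illustrative algebra operations, selection over an entity set and restriction of a relationship set; all names and data are invented.

```python
from typing import Any, Callable, Dict, Set, Tuple

# An entity set is a mapping from surrogate identifiers to attribute records.
EntitySet = Dict[str, Dict[str, Any]]
# A relationship set (no attributes on relationships) is a set of entity-id tuples.
RelationshipSet = Set[Tuple[str, ...]]

employees: EntitySet = {
    "e1": {"name": "Ada", "dept": "R&D"},
    "e2": {"name": "Boris", "dept": "Sales"},
}
departments: EntitySet = {"d1": {"name": "R&D"}, "d2": {"name": "Sales"}}
works_in: RelationshipSet = {("e1", "d1"), ("e2", "d2")}

def select(entities: EntitySet, predicate: Callable[[Dict[str, Any]], bool]) -> EntitySet:
    """Selection: restrict an entity set to the entities satisfying a predicate."""
    return {eid: attrs for eid, attrs in entities.items() if predicate(attrs)}

def restrict(rel: RelationshipSet, entities: EntitySet, position: int = 0) -> RelationshipSet:
    """Restrict a relationship set to tuples whose entity at `position` is in `entities`."""
    return {t for t in rel if t[position] in entities}

rd_staff = select(employees, lambda e: e["dept"] == "R&D")
print(rd_staff)                       # {'e1': {'name': 'Ada', 'dept': 'R&D'}}
print(restrict(works_in, rd_staff))   # {('e1', 'd1')}
```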
Abstract:
Many real-time database applications arise in electronic financial services, safety-critical installations and military systems where enforcing security is crucial to the success of the enterprise. For real-time database systems supporting applications with firm deadlines, we investigate here the performance implications, in terms of killed transactions, of guaranteeing multilevel secrecy. In particular, we focus on the concurrency control (CC) aspects of this issue. Our main contributions are the following: First, we identify which among the previously proposed real-time CC protocols are capable of providing covert-channel-free security. Second, using a detailed simulation model, we profile the real-time performance of a representative set of these secure CC protocols for a variety of security-classified workloads and system configurations. Our experiments show that a prioritized optimistic CC protocol, OPT-WAIT, provides the best overall performance. Third, we propose and evaluate a novel "dual-CC" approach that allows the real-time database system to simultaneously use different CC mechanisms for guaranteeing security and for improving real-time performance. By appropriately choosing these different mechanisms, concurrency control protocols that provide even better performance than OPT-WAIT are designed. Finally, we propose and evaluate GUARD, an adaptive admission-control policy designed to provide fairness with respect to the distribution of killed transactions across security levels. Our experiments show that GUARD efficiently provides close to ideal fairness for real-time applications that can tolerate covert channel bandwidths of up to one bit per second.
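The abstract does not describe GUARD's internals, so the following is only an illustrative sketch of fairness-aware admission control in general, not the paper's GUARD policy: it tracks the kill ratio per security level and probabilistically rejects new work from a level that is faring much better than the system-wide average. The slack and rejection probability are invented here.

```python
import random
from collections import defaultdict

class FairnessAdmissionController:
    """Illustrative fairness-aware admission control (not the paper's GUARD policy).

    Tracks the kill ratio (missed deadlines / completed) per security level and
    probabilistically rejects new work from levels doing much better than the
    system-wide average, freeing capacity for the disadvantaged levels.
    """

    def __init__(self, slack: float = 0.05, reject_prob: float = 0.5):
        self.kills = defaultdict(int)
        self.finished = defaultdict(int)
        self.slack = slack
        self.reject_prob = reject_prob

    def record(self, level: str, killed: bool) -> None:
        self.finished[level] += 1
        if killed:
            self.kills[level] += 1

    def _ratio(self, level: str) -> float:
        done = self.finished[level]
        return self.kills[level] / done if done else 0.0

    def admit(self, level: str) -> bool:
        total_done = sum(self.finished.values())
        if total_done == 0:
            return True  # no history yet: admit everything
        overall = sum(self.kills.values()) / total_done
        if self._ratio(level) + self.slack < overall:
            # This level is being favored; shed some of its load to rebalance.
            return random.random() > self.reject_prob
        return True

# Usage: feed completion outcomes back in and ask before admitting new transactions.
ctrl = FairnessAdmissionController()
for _ in range(80):
    ctrl.record("secret", killed=random.random() < 0.4)  # disadvantaged level
    ctrl.record("public", killed=random.random() < 0.1)  # favored level
print("admit public transaction?", ctrl.admit("public"))
print("admit secret transaction?", ctrl.admit("secret"))
```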
Abstract:
A significant share of originator pharmaceutical companies' research and development expenditure appears to be directed at the further development of existing drugs. This can presumably lead to interesting formulation development strategies. The aim of this study was to determine whether trends in pharmaceutical product development can be detected on the basis of granted marketing authorisations. The study was also interested in the life-cycle management strategies that the largest pharmaceutical companies use to protect their best-selling products from generic competition and to secure market share. The focus of the study was on solid oral dosage forms. A combination of qualitative and quantitative methods was used to obtain a broad perspective on the topic. Interviews with the Finnish marketing authorisation authorities were used to gather background information for the quantitative part of the study. The quantitative part consisted of marketing authorisation databases covering marketing authorisations granted in Finland through all procedures, marketing authorisations granted in the EU through the centralised procedure, and the world's ten largest pharmaceutical companies in the USA. The results show a significant increase in the number of generic drugs among marketing authorisations granted in Finland through all procedures and in the EU through the centralised procedure in 2000-2010. This change may be at least partly due to legislative changes that created incentives for the use and manufacture of generic drugs, such as generic substitution and the reference price system. The US data demonstrated the large manufacturers' interest in life-cycle management: the majority of marketing authorisations granted to the world's ten largest pharmaceutical companies in 2005-2010 served this purpose. The ratio of life-cycle management products to new active substances was nearly 4:1. The solid oral dosage form is indisputably the most popular way of administering a drug, which was confirmed by both the assessor interviews and the marketing authorisation data. The role of solid oral dosage forms was even more pronounced among generic drugs. When innovativeness was measured by the number of atypical dosage forms, the US data on solid oral dosage forms showed strong innovativeness compared with the Finnish and EU data. This may reflect the innovative product portfolios of the large pharmaceutical companies. The share of atypical solid oral dosage forms was considerably smaller among generic drugs than among originator drugs in all regions. The most commonly used life-cycle management strategies were a new formulation, a new strength, and a new combination of an existing product. For solid oral dosage forms, two thirds of the new life-cycle management formulations were modified-release products. Life-cycle management is an essential part of the business strategy of large pharmaceutical companies, and its importance was illustrated with a case example of Coreg tablets.
Abstract:
Use of adverse drug combinations, abuse of medicinal drugs and substance abuse are considerable social problems that are difficult to study. Prescription database studies might fail to incorporate factors like use of over-the-counter drugs and patient compliance, and spontaneous reporting databases suffer from underreporting. Substance abuse and smoking studies might be impeded by poor participation and limited reliability. The Forensic Toxicology Unit at the University of Helsinki is the only laboratory in Finland that performs forensic toxicology related to cause-of-death investigations, comprising the analysis of over 6000 medico-legal cases yearly. The analysis repertoire covers the most commonly used drugs and drugs of abuse, and the ensuing database also contains background information and information extracted from the final death certificate. In this thesis, the data stored in this comprehensive post-mortem toxicology database was combined with additional metabolite and genotype analyses that were performed to complete the profile of selected cases. The incidence of drug combinations possessing serious adverse drug interactions was generally low (0.71%), but it was notable for the two individually studied drugs, the common anticoagulant warfarin (33%) and the new-generation antidepressant venlafaxine (46%). Serotonin toxicity and adverse cardiovascular effects were the most prominent possible adverse outcomes. However, the specific role of the suspected adverse drug combinations was rarely recognized in the death certificates. The frequency of bleeds was observed to be elevated when paracetamol and warfarin were used concomitantly. Pharmacogenetic factors did not play a major role in fatalities related to venlafaxine, but the presence of interacting drugs was more common in cases showing high venlafaxine concentrations. Nicotine findings in deceased young adults were roughly three times more prevalent than smoking frequency estimates for the living population. Contrary to previous studies, no difference in the proportion of suicides was observed between nicotine users and non-users. However, findings of abused substances, including abused prescription drugs, were more common in the nicotine user group than in the non-user group. The results of the thesis are important for forensic and clinical medicine, as well as for public health. The possibility of drug interactions and pharmacogenetic issues should be taken into account in cause-of-death investigations, especially in unclear cases, suspected medical malpractice, and cases where toxicological findings are scarce. Post-mortem toxicological epidemiology is a new field of research that can help to reveal problems in drug use and prescription practices.
Abstract:
This study discusses the scope of historical earthquake analysis in low-seismicity regions. Examples of non-damaging earthquake reports are given from the Eastern Baltic (Fennoscandian) Shield in north-eastern Europe from the 16th to the 19th centuries. The information available for past earthquakes in the region is typically sparse and cannot be increased through a careful search of the archives. This study applies recommended rigorous methodologies of historical seismology developed using ample data to the sparse reports from the Eastern Baltic Shield. Attention is paid to the context of reporting, the identity and role of the authors, the circumstances of the reporting, and the opportunity to verify the available information by collating the sources. We evaluate the reliability of oral earthquake recollections and develop criteria for cases when a historical earthquake is attested to by a single source. We propose parametric earthquake scenarios as a way to deal with sparse macroseismic reports and as an improvement to existing databases.
Abstract:
Gene mapping is a systematic search for genes that affect observable characteristics of an organism. In this thesis we offer computational tools to improve the efficiency of (disease) gene-mapping efforts. In the first part of the thesis we propose an efficient simulation procedure for generating realistic genetical data from isolated populations. Simulated data is useful for evaluating hypothesised gene-mapping study designs and computational analysis tools. As an example of such evaluation, we demonstrate how a population-based study design can be a powerful alternative to traditional family-based designs in association-based gene-mapping projects. In the second part of the thesis we consider a prioritisation of a (typically large) set of putative disease-associated genes acquired from an initial gene-mapping analysis. Prioritisation is necessary to be able to focus on the most promising candidates. We show how to harness the current biomedical knowledge for the prioritisation task by integrating various publicly available biological databases into a weighted biological graph. We then demonstrate how to find and evaluate connections between entities, such as genes and diseases, from this unified schema by graph mining techniques. Finally, in the last part of the thesis, we define the concept of reliable subgraph and the corresponding subgraph extraction problem. Reliable subgraphs concisely describe strong and independent connections between two given vertices in a random graph, and hence they are especially useful for visualising such connections. We propose novel algorithms for extracting reliable subgraphs from large random graphs. The efficiency and scalability of the proposed graph mining methods are backed by extensive experiments on real data. While our application focus is in genetics, the concepts and algorithms can be applied to other domains as well. We demonstrate this generality by considering coauthor graphs in addition to biological graphs in the experiments.
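The abstract does not spell out the reliable-subgraph algorithms, so as a minimal illustration of the underlying notion of connection reliability in a random graph, the sketch below uses networkx and Monte Carlo sampling to estimate the probability that two vertices remain connected when each edge exists independently with its own probability. The graph, edge probabilities, and vertex names are invented.

```python
import random
import networkx as nx

# A toy random graph: each edge exists independently with probability p.
G = nx.Graph()
G.add_edge("gene_A", "protein_X", p=0.9)
G.add_edge("protein_X", "disease_D", p=0.6)
G.add_edge("gene_A", "pathway_P", p=0.7)
G.add_edge("pathway_P", "disease_D", p=0.5)

def connection_reliability(graph: nx.Graph, source: str, target: str,
                           samples: int = 20000, seed: int = 42) -> float:
    """Monte Carlo estimate of P(source and target are connected)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        realized = nx.Graph()
        realized.add_nodes_from(graph.nodes)
        for u, v, data in graph.edges(data=True):
            if rng.random() < data["p"]:
                realized.add_edge(u, v)
        if nx.has_path(realized, source, target):
            hits += 1
    return hits / samples

print(f"P(gene_A connected to disease_D) ≈ {connection_reliability(G, 'gene_A', 'disease_D'):.3f}")
```

A reliable subgraph in this setting would be a small set of edges that preserves most of this estimated connection probability, which is what makes it useful for visualising strong, independent gene-disease connections.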