938 results for Open Data, Bologna
Abstract:
Background
The power of the randomised controlled trial depends upon its capacity to operate in a closed system whereby the intervention is the only causal force acting upon the experimental group and absent in the control group, permitting a valid assessment of intervention efficacy. Conversely, clinical arenas are open systems where factors relating to context, resources, interpretation and actions of individuals will affect implementation and effectiveness of interventions. Consequently, the comparator (usual care) can be difficult to define and variable in multi-centre trials. Hence outcomes cannot be understood without considering usual care and factors that may affect implementation and impact on the intervention.
Methods
Using a fieldwork approach, we describe PICU context, ‘usual’ practice in sedation and weaning from mechanical ventilation, and factors affecting implementation prior to designing a trial involving a sedation and ventilation weaning intervention. We collected data from 23 UK PICUs between June and November 2014 using observation, individual and multi-disciplinary group interviews with staff.
Results
Pain and sedation practices were broadly similar in terms of drug usage and assessment tools. Sedation protocols linking assessment to appropriate titration of sedatives and sedation holds were rarely used (9% and 4% of PICUs respectively). Ventilator weaning was primarily a medical-led process, with 39% of PICUs engaging senior nurses in the process; weaning protocols were rarely used (9% of PICUs). Weaning methods varied according to clinician preference. No formal criteria or spontaneous breathing trials were used to test weaning readiness. Seventeen PICUs (74%) had prior engagement in multi-centre trials, but limited research nurse availability. Barriers to previous trial implementation were intervention complexity, lack of belief in the evidence and inadequate training. Facilitating factors were senior staff buy-in and dedicated research nurse provision.
Conclusions
We examined and identified contextual and organisational factors that may impact on the implementation of our intervention. We found usual practice relating to sedation, analgesia and ventilator weaning to be broadly similar across sites, yet distinctly different from our proposed intervention, providing assurance of our ability to evaluate intervention effects. The data will enable us to develop an implementation plan; by considering these factors we can more fully understand their impact on study outcomes.
Abstract:
OBJECTIVES: Evaluate current data sharing activities of UK publicly funded Clinical Trial Units (CTUs) and identify good practices and barriers.
STUDY DESIGN AND SETTING: Web-based survey of Directors of 45 UK Clinical Research Collaboration (UKCRC)-registered CTUs.
RESULTS: Twenty-three (51%) CTUs responded: five (22%) of these had an established data sharing policy and eight (35%) specifically requested consent to use patient data beyond the scope of the original trial. Fifteen (65%) CTUs had received requests for data, and seven (30%) had made external requests for data in the previous 12 months. CTUs supported the need for increased data sharing activities, although concerns were raised about patient identification, misuse of data, and financial burden. Custodianship of clinical trial data and the requirement for a CTU to align its policy with that of its parent institution were also raised. No CTUs supported the use of an open access model for data sharing.
CONCLUSION: There is support within the publicly funded UKCRC-registered CTUs for data sharing, but many perceived barriers remain. CTUs are currently using a variety of approaches and procedures for sharing data. This survey has informed further work, including development of guidance for publicly funded CTUs, to promote good practice and facilitate data sharing.
Abstract:
Background: Long working hours might increase the risk of cardiovascular disease, but prospective evidence is scarce, imprecise, and mostly limited to coronary heart disease. We aimed to assess long working hours as a risk factor for incident coronary heart disease and stroke.
Methods We identified published studies through a systematic review of PubMed and Embase from inception to Aug 20, 2014. We obtained unpublished data for 20 cohort studies from the Individual-Participant-Data Meta-analysis in Working Populations (IPD-Work) Consortium and open-access data archives. We used cumulative random-effects meta-analysis to combine effect estimates from published and unpublished data.
Findings We included 25 studies from 24 cohorts in Europe, the USA, and Australia. The meta-analysis of coronary heart disease comprised data for 603 838 men and women who were free from coronary heart disease at baseline; the meta-analysis of stroke comprised data for 528 908 men and women who were free from stroke at baseline. Follow-up for coronary heart disease was 5·1 million person-years (mean 8·5 years), in which 4768 events were recorded, and for stroke was 3·8 million person-years (mean 7·2 years), in which 1722 events were recorded. In cumulative meta-analysis adjusted for age, sex, and socioeconomic status, compared with standard hours (35-40 h per week), working long hours (≥55 h per week) was associated with an increase in risk of incident coronary heart disease (relative risk [RR] 1·13, 95% CI 1·02-1·26; p=0·02) and incident stroke (1·33, 1·11-1·61; p=0·002). The excess risk of stroke remained unchanged in analyses that addressed reverse causation, multivariable adjustments for other risk factors, and different methods of stroke ascertainment (range of RR estimates 1·30-1·42). We recorded a dose-response association for stroke, with RR estimates of 1·10 (95% CI 0·94-1·28; p=0·24) for 41-48 working hours, 1·27 (1·03-1·56; p=0·03) for 49-54 working hours, and 1·33 (1·11-1·61; p=0·002) for 55 working hours or more per week compared with standard working hours (p for trend <0·0001).
Interpretation Employees who work long hours have a higher risk of stroke than those working standard hours; the association with coronary heart disease is weaker. These findings suggest that more attention should be paid to the management of vascular risk factors in individuals who work long hours.
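The pooled relative risks above come from a random-effects meta-analysis. As a rough illustration of how such pooling works, here is a minimal DerSimonian-Laird sketch in Python; the three input studies and their confidence intervals are hypothetical, not data from the IPD-Work Consortium.

```python
import math

def dl_pool(studies):
    """DerSimonian-Laird random-effects pooling of relative risks.
    Each study is (rr, ci_low, ci_high); pooling is done on the log scale."""
    y = [math.log(rr) for rr, lo, hi in studies]
    # Standard errors recovered from the 95% confidence intervals
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
    w = [1 / s**2 for s in se]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, y))  # heterogeneity Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
    w_star = [1 / (s**2 + tau2) for s in se]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_pooled = 1 / math.sqrt(sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Three hypothetical cohort studies: (RR, 95% CI lower bound, upper bound)
rr, lo, hi = dl_pool([(1.40, 1.10, 1.78), (1.25, 0.95, 1.64), (1.35, 1.05, 1.74)])
```

The pooled RR always lies between the smallest and largest study estimates; the between-study variance tau² widens the confidence interval when studies disagree.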
Abstract:
OBJECTIVE:
To estimate the prevalence and distribution of open-angle glaucoma (OAG) in the United States by age, race/ethnicity, and gender.
METHODS:
Summary prevalence estimates of OAG were prepared separately for black, Hispanic, and white subjects in 5-year age intervals starting at 40 years. The estimated rates were based on a meta-analysis of recent population-based studies in the United States, Australia, and Europe. These rates were applied to 2000 US census data and to projected US population figures for 2020 to estimate the number of the US population with OAG.
RESULTS:
The overall prevalence of OAG in the US population 40 years and older is estimated to be 1.86% (95% confidence interval, 1.75%-1.96%), with 1.57 million white and 398 000 black persons affected. After applying race-, age-, and gender-specific rates to the US population as determined in the 2000 US census, we estimated that OAG affects 2.22 million US citizens. Owing to the rapidly aging population, the number with OAG will increase by 50% to 3.36 million in 2020. Black subjects had almost 3 times the age-adjusted prevalence of glaucoma of white subjects.
CONCLUSIONS:
Open-angle glaucoma affects more than 2 million individuals in the United States. Owing to the rapid aging of the US population, this number will increase to more than 3 million by 2020.
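The projection method described above (group-specific prevalence rates applied to census counts) can be sketched as follows; the rates and population figures below are hypothetical round numbers chosen for illustration, not the paper's actual inputs.

```python
def projected_cases(prevalence, population):
    """Sum group-specific prevalence rates times population counts."""
    return sum(prevalence[g] * population[g] for g in prevalence)

# Hypothetical prevalence rates and census counts for two groups aged 40+
prev = {"white_40plus": 0.0170, "black_40plus": 0.0560}
pop_2000 = {"white_40plus": 92_000_000, "black_40plus": 7_100_000}

cases = projected_cases(prev, pop_2000)
```

Projections for a later year reuse the same rates against projected population counts, so growth in cases tracks the aging of the population rather than any change in risk.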
Abstract:
A first stage collision database is assembled which contains electron-impact excitation, ionization, and recombination rate coefficients for B, B+, B2+, B3+, and B4+. The first stage database is constructed using the R-matrix with pseudostates, time-dependent close-coupling, and perturbative distorted-wave methods. A second stage collision database is then assembled which contains generalized collisional-radiative ionization, recombination, and power loss rate coefficients as a function of both temperature and density. The second stage database is constructed by solution of the collisional-radiative equations in the quasi-static equilibrium approximation using the first stage database. Both collision database stages reside in electronic form at the IAEA Labeled Atomic Data Interface (ALADDIN) database and the Atomic Data Analysis Structure (ADAS) open database.
Abstract:
Astrophysics is driven by observations, and in the present era there is a wealth of state-of-the-art ground-based and satellite facilities. The astrophysical spectra emerging from these are of exceptional quality and quantity and cover a broad wavelength range. To meaningfully interpret these spectra, astronomers employ highly complex modelling codes to simulate the astrophysical observations. Important inputs to these codes include atomic data such as excitation rates, photoionization cross sections, oscillator strengths, transition probabilities and energy levels/line wavelengths. Due to the relatively low temperatures associated with many astrophysical plasmas, the accurate determination of electron-impact excitation rates in the low energy region is essential in generating a reliable spectral synthesis. Hence it is these atomic data, and the main computational methods used to evaluate them, which we focus on in this publication. We consider in particular the complicated open d-shell structures of the Fe-peak ions in low ionization stages. While some of these data can be obtained experimentally, they are usually of insufficient accuracy or limited to a small number of transitions.
Abstract:
PURPOSE: Glaucoma patients are still at risk of becoming blind. It is of clinical significance to determine the risk of blindness and its causes to prevent its occurrence. This systematic review estimates the number of treated glaucoma patients with end-of-life visual impairment (VI) and blindness and the factors that are associated with this.
METHODS: A systematic literature search in relevant databases was conducted in August 2014 on end-of-life VI. A total of 2574 articles were identified, of which 5 reported on end-of-life VI. Several data items were extracted from the reports and presented in tables.
RESULTS: All studies had a retrospective design. A considerable number of glaucoma patients were found to be blind at the end of their life; with up to 24% unilateral and 10% bilateral blindness. The following factors were associated with blindness: (1) baseline severity of visual field loss: advanced stage of glaucoma or substantial visual field loss at the initial visit; (2) factors influencing progression: fluctuation of intraocular pressure (IOP) during treatment, presence of pseudoexfoliation, poor patient compliance, higher IOP; (3) longer time period: longer duration of disease and older age at death because of a longer life expectancy; and (4) coexistence of other ocular pathology.
CONCLUSIONS: Further prevention of blindness in glaucoma patients is needed. To reach this goal, it is important to address the risk factors for blindness identified in this review, especially those that can be modified, such as advanced disease at diagnosis, high and fluctuating IOP, and poor compliance.
Abstract:
Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response, which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements, since data communication must be conducted within predefined temporal bounds whose violation may compromise the correct behavior of the system and cause economic losses or endanger human lives. The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than those previously addressed by wired technologies. In parallel with this trend, there is an increasing demand for cost-effective distributed systems with improved deployment, maintenance and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time- and event-triggered data transmission services while supporting on-line, on-the-fly parameter modification. Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-power, short-range technologies have been widely adopted.
In fact, although short-range technologies provide smaller data rates, they spend just a fraction of the energy of their higher-power counterparts. The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is strict regulation of which stations are installed in the area and for which purpose. Nevertheless, in open environments this is hard to control, because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task. Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards which are the basis of several real-time protocols. Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations with similar technologies or to higher-power technologies such as Wi-Fi, which hinders the support of wireless dependable real-time communications in open environments.
The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it allows supporting wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute the medium access. It further claims that it is feasible to provide flexible and timely wireless communications simultaneously in open environments. The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services such as admission control have been ported. After bandjacking proved to be an effective technique to ensure medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol that could bring the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation with a description of the implemented devices, the test-bed and a discussion of the obtained results.
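WFTT inherits the elementary-cycle organisation and admission control of the FTT paradigm: the master broadcasts a trigger message at the start of each fixed-length cycle, time-triggered (synchronous) traffic is allotted slots while the cycle budget lasts, and the remainder forms the event-triggered (asynchronous) window. The sketch below illustrates that admission-control idea with entirely hypothetical timing parameters; it is not WFTT's actual scheduler.

```python
def build_cycle(cycle_ms, trigger_ms, sync_requests):
    """Admit synchronous slot requests, in order, until the cycle budget
    is spent; the leftover time becomes the asynchronous window."""
    budget = cycle_ms - trigger_ms   # time left after the trigger message
    admitted, used = [], 0.0
    for name, slot_ms in sync_requests:
        if used + slot_ms <= budget:
            admitted.append(name)
            used += slot_ms
    return admitted, budget - used   # (admitted streams, async window in ms)

# Hypothetical 10 ms cycle, 1 ms trigger, three synchronous stream requests
admitted, async_ms = build_cycle(10.0, 1.0, [("s1", 4.0), ("s2", 4.0), ("s3", 2.0)])
```

With these numbers the third request does not fit the 9 ms synchronous budget, so it is rejected and 1 ms remains for asynchronous traffic; online parameter changes amount to re-running this admission test with a new request set.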
Abstract:
Objective To explore people's experiences of starting antidepressant treatment. Design Qualitative interpretive approach combining thematic analysis with constant comparison. Relevant coding reports from the original studies (generated using NVivo) relating to initial experiences of antidepressants were explored in further detail, focusing on the ways in which participants discussed their experiences of taking or being prescribed an antidepressant for the first time. Participants 108 men and women aged 22–84 who had taken antidepressants for depression. Setting Respondents recruited throughout the UK during 2003–2004, 2008 and 2012–2013, and in Australia during 2010–2011. Results People expressed a wide range of feelings about initiating antidepressant use. Attitudes towards starting antidepressant use were shaped by stereotypes and stigmas related to perceived drug dependency and potentially extreme side effects. Anxieties were expressed about starting use: about when the antidepressant might begin to take effect, how much it might help or hinder them, and what to expect in the initial weeks. People worried about the possibility of experiencing adverse effects and the implications for their sense of self. Where people felt they had not been given sufficient time during their consultation, or sufficient information or support, to take the medicines, the uncertainty could be particularly unsettling and could affect their ongoing views on, and use of, antidepressants as a viable treatment option. Conclusions Our paper is the first to use multicountry data to explore in depth patients' existential concerns about starting antidepressant use. People need additional support when they make decisions about starting antidepressants. Health professionals can use our findings to better understand and explore with patients their concerns before patients start antidepressants.
These insights are key to supporting patients, many of whom feel intimidated by the prospect of taking antidepressants, especially during the uncertain first few weeks of treatment.
Abstract:
The broad capabilities of current mobile devices have paved the way for Mobile Crowd Sensing (MCS) applications. The success of this emerging paradigm strongly depends on the quality of received data which, in turn, is contingent on mass user participation; the broader the participation, the more useful these systems become. Meanwhile, there is an ongoing trend towards integrating MCS applications with emerging computing paradigms such as cloud computing. The intuition is that such a transition can significantly improve overall efficiency while at the same time offering stronger security and privacy-preserving mechanisms for the end-user. In this position paper, we dwell on the underpinnings of incorporating cloud computing techniques to handle the vast amount of data collected in MCS applications. That is, we present a list of core system, security and privacy requirements that must be met if such a transition is to be successful. To this end, we first address several competing challenges not previously considered in the literature, such as the scarce energy resources of battery-powered mobile devices as well as their limited computational resources, which often prevent the use of computationally heavy cryptographic operations and thus limit the security services offered to the end-user. Finally, we present a use case scenario as a comprehensive example. Based on our findings, we posit open issues and challenges, and discuss possible ways to address them, so that security and privacy do not hinder the migration of MCS systems to the cloud.
Abstract:
Final Master's project submitted to obtain the degree of Master in Electronics and Telecommunications Engineering
Abstract:
A marked process of technological evolution can be observed today across the globe. Companies, whether small, medium-sized or large, are increasingly dependent on computerised systems to run their business processes, and consequently on the business information these generate, where the data often bear no relationship to one another. Most conventional computing systems are not designed to manage and store strategic information, which prevents it from serving as a strategic decision-support resource. Decisions are therefore made on the basis of managers' experience, when they could be based on the historical facts stored by the various systems. Generally speaking, organisations hold a great deal of data but in most cases extract little information from it, which is a problem in competitive markets. As organisations seek to evolve and outperform the competition in decision-making, the term Business Intelligence (BI) arises in this context. GisGeo Information Systems is a company that develops GIS-based (geographic information systems) software following an open-source tools philosophy. Its main product is based on the geographic location of various types of vehicles, on data collection, and consequently on its analysis (kilometres travelled, duration of a trip between two defined points, fuel consumption, etc.). This is the setting of this project, whose objective is to give a different perspective on the existing data by crossing BI concepts with the system implemented in the company, in keeping with its philosophy. The project addresses some of the most important concepts underlying BI, such as the dimensional model, the data warehouse, the ETL process and OLAP, following Ralph Kimball's methodology.
Some of the main open-source tools on the market are also studied, along with their advantages and disadvantages relative to one another. In conclusion, the solution developed according to the criteria set out by the company is presented as a proof of concept of the applicability of Business Intelligence to the field of Geographic Information Systems (GIS), using an open-source tool that supports data visualisation through dashboards.
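As a minimal illustration of the Kimball-style dimensional modelling mentioned above (an ETL step loading raw records into a fact table), the sketch below aggregates vehicle-trip records into a fact table grained by (vehicle, date). The field names and figures are invented, not GisGeo's actual schema.

```python
# Hypothetical raw trip records from a vehicle-tracking system
trips = [
    {"vehicle": "V1", "date": "2024-01-05", "km": 120.0, "fuel_l": 9.6},
    {"vehicle": "V1", "date": "2024-01-06", "km": 80.0,  "fuel_l": 6.8},
    {"vehicle": "V2", "date": "2024-01-05", "km": 200.0, "fuel_l": 15.0},
]

def load_fact_table(records):
    """Toy ETL step: aggregate raw trips into a fact table whose grain is
    (vehicle, date), with additive measures km and fuel_l."""
    facts = {}
    for r in records:
        key = (r["vehicle"], r["date"])          # surrogate for dimension keys
        row = facts.setdefault(key, {"km": 0.0, "fuel_l": 0.0})
        row["km"] += r["km"]
        row["fuel_l"] += r["fuel_l"]
    return facts

facts = load_fact_table(trips)
```

OLAP-style questions (kilometres per vehicle, fuel per day) then reduce to summing the additive measures over subsets of the fact-table keys, which is what a dashboard tool would do against the warehouse.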
Abstract:
The importance of data warehousing and business intelligence systems is increasingly pronounced, equipping organisations with the capacity to store, explore and produce value-added information for their decision-making processes. This is clearly applicable to the Portuguese public administration and, in particular, to the bodies with central responsibilities in the Ministry of Health. In the case of the Shared Services of the Ministry of Health (SPMS), whose mission includes providing the SNS (National Health Service) with central business intelligence systems, the demand from its clients for analytical capabilities in their central systems has been felt very strongly. However, it is clear that both the cost and the complexity of a large share of these projects have posed a serious threat to their adoption and success. On the one hand, public administration has been strongly encouraged to integrate and adopt open-source solutions (free licensing model) for its information systems projects. On the other hand, we have been witnessing a wave of widespread acceptance of new methodologies for developing IT projects, notably Agile methodologies, which present themselves as more flexible, less formal and with a higher success rate. To investigate the applicability of open source and Agile methodologies to business intelligence systems, this work documents the implementation of an organisational project for SPMS, using free, open-source tools and an Agile development methodology.
Abstract:
Stratigraphic Columns (SC) are the most useful and common way to represent the field descriptions (e.g., grain size, thickness of rock packages, and fossil and lithological components) of rock sequences and well logs. In these representations the width of the SC varies according to grain size (i.e., the wider the stratum, the coarser the rock (Miall 1990; Tucker 2011)), and the thickness of each layer is represented on the vertical axis of the diagram. Typically these representations are drawn 'manually' using vector graphics editors (e.g., Adobe Illustrator®, CorelDRAW®, Inkscape). Nowadays various software packages plot SCs automatically, but there are no versatile open-source tools, and it is very difficult to both store and analyse stratigraphic information. This document presents Stratigraphic Data Analysis in R (SDAR), an analytical package designed both to plot and to facilitate the analysis of stratigraphic data in R (R Core Team 2014). SDAR uses simple stratigraphic data and takes advantage of the flexible plotting tools available in R to produce detailed SCs. The main benefits of SDAR are that it: (i) generates accurate and complete SC plots including multiple features (e.g., sedimentary structures, samples, fossil content, colour, structural data, contacts between beds); (ii) is developed in a free software environment for statistical computing and graphics; (iii) runs on a wide variety of platforms (i.e., UNIX, Windows, and MacOS); and (iv) allows both plotting and analysis functions to be executed directly from R's command-line interface (CLI), enabling users to integrate SDAR's functions with the many other add-on packages available for R from The Comprehensive R Archive Network (CRAN).
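SDAR itself is an R package, but the core plotting convention it automates (bed thickness on the vertical axis, bar width proportional to grain size) is easy to sketch in any language, here in Python. The grain-size-to-width mapping below is an arbitrary ordinal scale chosen for illustration, not SDAR's actual scheme.

```python
# Hypothetical ordinal widths for common grain-size classes: coarser beds
# are drawn wider, following the stratigraphic-column convention.
GRAIN_WIDTH = {"clay": 1, "silt": 2, "fine sand": 3, "coarse sand": 4, "gravel": 5}

def column_rectangles(beds):
    """Convert (grain_size, thickness_m) beds, listed oldest first, into
    (bottom, top, width) rectangles for a stratigraphic column."""
    rects, base = [], 0.0
    for grain, thickness in beds:
        rects.append((base, base + thickness, GRAIN_WIDTH[grain]))
        base += thickness            # stack the next bed on top
    return rects

# A toy three-bed sequence: 2 m of clay, 1.5 m of fine sand, 0.5 m of gravel
rects = column_rectangles([("clay", 2.0), ("fine sand", 1.5), ("gravel", 0.5)])
```

Each rectangle can then be handed to any plotting backend; storing beds as (grain size, thickness) pairs rather than as drawn shapes is what makes the column analysable as well as plottable.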
Abstract:
This paper studies strategies to attract students from outside Europe to European pre-experience master's programmes. We characterise the value added by such programmes through interviews with key players at universities and multinational recruiting corporations. We considered a strategy for segmenting international students in the US and extended it to the European market. We analysed data from international applications to Nova SBE as a proxy for applications to European institutions. Based on that analysis we conclude with recommendations for attracting suitable candidates from outside Europe. In particular, we provide three solutions for attracting students from the southern hemisphere: European institutions should (a) increase the spring semester intake, (b) provide bridging courses for some students, or (c) place some accepted candidates in internships before classes start.