990 results for publication lag time


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making. However, there is still low use of SDM in clinical practice. High impact factor journals might represent an efficient way for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess how the number of SDM publications evolved over the period according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in the 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over the 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% of all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase in research publications over time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119, respectively). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept to the medical community.
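
The polynomial Poisson regression described above can be sketched in outline. The snippet below is a minimal illustration only: the yearly counts, the total-publication figures used as an exposure offset, and the variable names are hypothetical, not the study's data.

```python
# Minimal sketch of a polynomial Poisson regression with log link for yearly
# publication counts. Data below are hypothetical; log(total) enters as an
# offset so the model describes SDM publications relative to all publications.
import numpy as np
import statsmodels.api as sm

years = np.arange(1996, 2012)
t = years - years.min()                       # time index 0..15
sdm = np.array([46, 50, 55, 58, 64, 70, 75, 82, 90, 97,
                105, 115, 126, 138, 151, 165])          # hypothetical counts
total = np.full(years.size, 14324)            # hypothetical yearly totals

# Design matrix: intercept plus linear and quadratic time terms.
X = np.column_stack([np.ones_like(t), t, t ** 2])

model = sm.GLM(sdm, X, family=sm.families.Poisson(), offset=np.log(total))
fit = model.fit()
print(fit.summary())

# On the log scale, a significant positive linear term corresponds to
# exponential growth of the publication share; the higher-order term tests
# departures from it.
```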

Relevance:

30.00%

Publisher:

Abstract:

How would it feel to watch yourself age fifty years in a minute? With current technology, time-lapse photography is the only way for people to travel in time. This shooting technique makes it possible to slow life down or, correspondingly, speed it up. The purpose of this thesis is to go through the means and techniques by which time-lapse footage can be made. The goal of the work has been to understand time-lapse shooting and to explore the different effects that can be achieved with interval shooting. Visual material has been produced to support the written part, so that the shooting technique is as easy as possible for the reader to understand. The work also includes formulas and illustrations that allow the reader to try taking time-lapse shots themselves. The thesis describes how the shots are taken and the mistakes that are easy to make, so that the reader does not have to fall into the same traps in their own experiments that were encountered while making this thesis. Time-lapse shooting is a fascinating way to bring all the means of still photography into film. This work was made so that the author could deepen his own understanding of the technique and so that the reader could plan their shoots more versatilely. This work only scratches the surface of time-lapse shooting. Deeper within the technique there are millions of methods to discover, try and use. The purpose of this work is to be the start of a journey that carries the reader along to try new methods which, given the possibilities opened by digital imaging, may not even have been discovered yet.
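
The abstract mentions formulas for planning time-lapse shoots; the thesis text is not reproduced here, but the standard arithmetic behind such planning is simple and can be sketched as follows (generic formulas and example numbers, not taken from the thesis).

```python
# Standard time-lapse planning arithmetic (generic, not the thesis's formulas).

def timelapse_plan(event_seconds: float, clip_seconds: float, fps: float = 25.0):
    """Return how many frames are needed and how often to shoot them.

    event_seconds -- real duration of the event being filmed
    clip_seconds  -- desired length of the finished clip
    fps           -- playback frame rate of the finished clip
    """
    frames = clip_seconds * fps             # frames needed in the final clip
    interval = event_seconds / frames       # seconds between exposures
    speedup = event_seconds / clip_seconds  # playback speed relative to reality
    return frames, interval, speedup

# Example: compress a two-hour sunset into a 20-second clip at 25 fps.
frames, interval, speedup = timelapse_plan(2 * 3600, 20)
print(f"{frames:.0f} frames, one every {interval:.1f} s, {speedup:.0f}x speed-up")
# -> 500 frames, one every 14.4 s, 360x speed-up
```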

Relevance:

30.00%

Publisher:

Abstract:

This dissertation describes a networking approach to infinite-dimensional systems theory, where there is a minimal distinction between inputs and outputs. We introduce and study two closely related classes of systems, namely the state/signal systems and the port-Hamiltonian systems, and describe how they relate to each other. Some basic theory for these two classes of systems and the interconnections of such systems is provided. The main emphasis lies on passive and conservative systems, and the theoretical concepts are illustrated using the example of a lossless transfer line. Much remains to be done in this field and we point to some directions for future studies as well.
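
For orientation, a finite-dimensional port-Hamiltonian input/state/output system is commonly written in the following standard form, with the dissipation inequality expressing passivity (and holding with equality in the lossless, conservative case). The notation is standard textbook notation, not necessarily the dissertation's, which treats the infinite-dimensional state/signal analogue.

```latex
% Standard finite-dimensional port-Hamiltonian form (for orientation only).
\begin{aligned}
\dot{x}(t) &= \bigl(J - R\bigr)\,\nabla H\bigl(x(t)\bigr) + B\,u(t),\\
y(t)       &= B^{\top}\,\nabla H\bigl(x(t)\bigr),\\
\frac{\mathrm{d}}{\mathrm{d}t}\, H\bigl(x(t)\bigr) &\le u(t)^{\top} y(t),
\qquad J = -J^{\top},\quad R = R^{\top} \ge 0 .
\end{aligned}
```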

Relevance:

30.00%

Publisher:

Abstract:

Megillat ha-Megalleh by Abraham bar Hijja (Spain, 12th century) is best known as a collection of messianic calculations. But the book as a whole contains, alongside the calculations, varied material such as philosophy, biblical interpretation and astrology. In the face of the growing Christian influence on the Jews, Bar Hijja argues that the Jewish religion, and in particular the Jews' expectation of the messianic age, is still valid. Drawing on the Jewish tradition, on Arabic scientific and other sources, and on redefined Christian ideas, Bar Hijja develops a view of history as a deterministic course of events consisting of good and bad times but eventually culminating in a messianic age for the Jews. The book also contains an extensive astrological commentary on history, describing among other things the rise and history of the Christian and Muslim world powers and their relations with the Jews. The book is clearly intended to persuade the Jews to remain Jews by arguing, with both Jewish and non-Jewish material, that regardless of their situation in exile they have the future they have been waiting for. In the medieval context this appears not only as a religious question but also as a political effort to secure the future of the Jewish community.

Relevance:

30.00%

Publisher:

Abstract:

Financial analysts play an important role in financial markets, especially by conveying information through earnings forecasts. Analysts typically disagree to some extent in their earnings forecasts, and it is precisely this disagreement between analysts that this thesis studies. When a firm reports losses, disagreement about the firm's future tends to increase. Intuitively, it is easy to interpret this as increased uncertainty. This is also what one finds when studying analyst reports: analysts appear to become more uncertain when firms start making losses, and it is precisely then that disagreement between analysts also increases. The mathematical-theoretical models describing analysts' decision processes, however, have the opposite implication: increased disagreement between analysts can only arise if the analysts become more certain at the individual level, with asymmetric information as the driving force. This thesis resolves the contradiction between increased certainty and increased uncertainty as the driver of dispersion in analyst forecasts. Once the amount of public information made available through earnings reports is taken into account, the models of analysts' decision processes cannot generate the levels of forecast dispersion observed in the data. The conclusion is therefore that the underlying theoretical models of forecast dispersion are partly deficient, and that dispersion in forecasts more likely results from increased uncertainty among analysts, in line with what analysts actually mention in their reports. The results are important because an understanding of the uncertainty surrounding, for example, earnings reporting contributes to a general understanding of the financial reporting environment, which in turn is of great importance for price formation in financial markets. Furthermore, increased forecast dispersion is typically used as an indicator of increased information asymmetry in accounting research, a practice that this thesis thereby calls into question.
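
A stylized signal-extraction model of the kind the theoretical literature uses (a textbook-style construction, not necessarily the thesis's own model) makes the tension concrete: each analyst combines a common public signal with a private one, and both forecast dispersion and individual uncertainty then have closed forms.

```python
# Stylized illustration (not the thesis's model): analysts combine a public
# signal of precision h with private signals of precision s. With a diffuse
# prior, each analyst's posterior variance is 1/(h+s), while the variance of
# forecasts across analysts works out to s/(h+s)^2.

def uncertainty(h: float, s: float) -> float:
    """Each analyst's posterior variance about the forecast quantity."""
    return 1.0 / (h + s)

def dispersion(h: float, s: float) -> float:
    """Variance of forecasts across analysts, given the public signal."""
    return s / (h + s) ** 2

h = 1.0  # precision of public information (e.g. the earnings report)
for s in (0.1, 0.3, 0.6, 1.0):
    print(f"s={s:.1f}  uncertainty={uncertainty(h, s):.3f}  "
          f"dispersion={dispersion(h, s):.3f}")

# With public precision held fixed, raising private precision s (up to s = h)
# raises dispersion while always lowering individual uncertainty -- the
# counter-intuitive implication that the thesis confronts with the data.
```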

Relevance:

30.00%

Publisher:

Abstract:

This paper explores transparency in the decision-making of the European Central Bank (ECB). According to the ECB's definition, transparency means that the central bank provides the general public with all relevant information on its strategy, assessments and policy decisions, as well as its procedures, in an open, clear and timely manner. In this paper, however, the interpretation of transparency is somewhat broader: information is freely available and directly accessible to those who will be affected by the decisions, and individuals should be able to master this material. The ECB's negative attitude towards publication of documents has demonstrated the central bank's reluctance to strive towards more extensive transparency. By virtue of the definition adopted by the ECB, the bank itself is responsible for determining what counts as relevant information. On the grounds of the EU treaties, this paper assesses the ECB's accountability, concentrating especially on transparency, by employing principal-agent theory and a constitutional approach. Traditionally, the definite mandate and the tenet of central bank independence have been used to justify the limited accountability. The de facto competence of the ECB has, however, considerably expanded as the central bank has decisively resorted to non-standard measures in order to combat the economic turbulence facing Europe. It is alleged that non-standard monetary policy constitutes a grey zone occasionally resembling economic policy or fiscal policy. Nevertheless, the European Court of Justice has repeatedly approved these measures. This dynamic interpretation of the treaties seems to allow temporary exceptions from the central bank's primary objective during extraordinary times. The paper nonetheless suggests that the accountability nexus defined in the treaties is not sufficient to guarantee the accountability of the ECB after the adoption of this new, more active role. Enhanced transparency would help the ECB to maintain its credibility. Investing in the quality of the monetary dialogue between the Parliament and the ECB appears to be the most adequate and practicable way to accomplish this. As a result of upgraded transparency, the legitimacy of the central bank would not rest solely on its policy outputs.

Relevance:

30.00%

Publisher:

Abstract:

Work carried out under joint supervision (cotutelle) with the Université de Paris IV-La Sorbonne.

Relevance:

30.00%

Publisher:

Abstract:

Real-time studies of the dynamics were performed on the reaction of HgI_2 in a molecular beam. Excitation was by either one or multiple pump photons (311 nm), leading to two separate sets of dynamics, each of which could be investigated by a time-delayed probe laser (622 nm) that ionized the parent molecule and the fragments via REMPI processes. These dynamics were distinguished by combining the information from transients taken at each mass (HgI_2, HgI, I_2, Hg, and I) with the results of pump (and probe) power dependence studies on each mass. A method of plotting the slope of the intensity dependence against the pump-probe time delay proved essential. In the preceding publication, we detailed the dynamics of the reaction initiated by one-photon excitation to the A-continuum. Here, we present studies of higher-energy states. Multiphoton excitation accesses predissociative states of HgI_2, for which there are crossings into the symmetric and asymmetric stretch coordinates. The dynamics of these channels, which lead to atomic (I or Hg) and diatomic (HgI) fragments, are discussed and related to the nature of the intermediates along the reaction pathway.
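
The slope-versus-delay analysis mentioned above amounts to fitting, at each pump-probe delay, the order of the signal's dependence on pump power on a log-log scale. The sketch below illustrates that procedure; the array shapes and numbers are synthetic, not the published measurements.

```python
# Sketch of the slope-of-intensity-dependence analysis: at each pump-probe
# delay, fit log(signal) against log(pump power); the slope approximates the
# number of pump photons involved. All data here are synthetic.
import numpy as np

def photon_order_vs_delay(signal: np.ndarray, powers: np.ndarray) -> np.ndarray:
    """signal: array of shape (n_delays, n_powers) of ion yield at one mass.
    powers: the pump powers used. Returns the log-log slope at each delay."""
    log_p = np.log(powers)
    slopes = np.empty(signal.shape[0])
    for i, row in enumerate(signal):
        slopes[i], _ = np.polyfit(log_p, np.log(row), 1)
    return slopes

# Synthetic example: a channel that switches from a one-photon to an
# effectively two-photon power dependence at later delays.
powers = np.array([0.5, 1.0, 2.0, 4.0])
delays = np.linspace(0.0, 2e-12, 5)                       # seconds
signal = np.array([[p ** (1 if d < 0.9e-12 else 2) for p in powers]
                   for d in delays])
print(photon_order_vs_delay(signal, powers))              # ~[1, 1, 2, 2, 2]
```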

Relevance:

30.00%

Publisher:

Abstract:

Wednesday 23rd April 2014
Speaker(s): Willi Hasselbring
Organiser: Leslie Carr
Time: 23/04/2014 11:00-11:50
Location: B32/3077
File size: 669 Mb

Abstract: For good scientific practice, it is important that research results may be properly checked by reviewers and possibly repeated and extended by other researchers. This is of particular interest for "digital science", i.e. for in-silico experiments. In this talk, I'll discuss some issues of how software systems and services may contribute to good scientific practice. Particularly, I'll present our PubFlow approach to automate publication workflows for scientific data. The PubFlow workflow management system is based on established technology. We integrate institutional repository systems (based on EPrints) and world data centers (in marine science). PubFlow collects provenance data automatically via our monitoring framework Kieker. Provenance information describes the origins and the history of scientific data in its life cycle, and the process by which it arrived. Thus, provenance information is highly relevant to repeatability and trustworthiness of scientific results. In our evaluation in marine science, we collaborate with the GEOMAR Helmholtz Centre for Ocean Research Kiel.
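
The abstract does not give PubFlow's data model, but the role that provenance information plays in it can be illustrated with a hypothetical minimal record; the field names below are invented for illustration and are not PubFlow's or Kieker's actual schema.

```python
# Hypothetical sketch of a minimal provenance record of the kind a publication
# workflow could collect automatically; the fields are invented for
# illustration, not taken from PubFlow or Kieker.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class ProvenanceRecord:
    dataset_id: str           # identifier of the published dataset
    derived_from: List[str]   # identifiers of the input datasets
    tool: str                 # software (and version) that produced it
    parameters: dict          # settings used in the processing step
    executed_at: datetime     # when the step ran
    executed_by: str          # person or service responsible
    repository: str           # target repository or data centre

record = ProvenanceRecord(
    dataset_id="example-dataset-0001",
    derived_from=["raw-ctd-cast-17"],
    tool="salinity-calibration 2.1",
    parameters={"sensor": "CTD-17", "correction": "pre-cruise"},
    executed_at=datetime(2014, 4, 23, 11, 0),
    executed_by="workflow:demo",
    repository="institutional repository (EPrints)",
)
print(record.dataset_id, "<-", record.derived_from)
```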

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time series ecological studies. Because it gives a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, can accommodate aspects such as random effect models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design—The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992–1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and the explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, due to imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded that individual heterogeneity be taken into account, requiring mixed models. Main results—The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from this latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Conclusions—Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent with both the previous individual analyses for each city and multicentric studies covering all three cities.
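
The modelling steps listed above (a log-linear Poisson model, spline smooths for meteorological covariates, and lagged pollutant terms) can be sketched in outline. The snippet below is a simplified illustration with hypothetical column names; it omits the autoregressive residual correction and the mixed-model treatment of heterogeneity described in the paper.

```python
# Simplified sketch of the core model; column names are hypothetical, and the
# autoregressive and mixed-model extensions described in the paper are omitted.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("daily_city_data.csv")   # one row per city and day (hypothetical)

# Pollutants are allowed to act with a lag (here, one day), within each city.
df["black_smoke_lag1"] = df.groupby("city")["black_smoke"].shift(1)
df["so2_lag1"] = df.groupby("city")["so2"].shift(1)

# Poisson GLM with B-spline terms for nonlinear control of temperature,
# humidity and seasonality, i.e. a GAM-style specification.
model = smf.glm(
    "deaths ~ bs(temperature, df=4) + bs(humidity, df=4)"
    " + bs(day_of_year, df=6) + black_smoke_lag1 + so2_lag1 + C(city)",
    data=df.dropna(),
    family=sm.families.Poisson(),
).fit()
print(model.summary())

# The pollutant coefficients are log relative risks per unit increase;
# exp(10 * coefficient) gives the relative risk for a 10 ug/m3 increase.
```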

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: This contribution provides a unifying concept for meta-analysis that integrates the handling of unobserved heterogeneity, study covariates, publication bias and study quality. It is important to consider these issues simultaneously to avoid the occurrence of artifacts, and a method for doing so is suggested here. METHODS: The approach is based upon the meta-likelihood in combination with a general linear nonparametric mixed model, which lays the ground for all inferential conclusions suggested here. RESULTS: The concept is illustrated with a meta-analysis investigating the relationship of hormone replacement therapy (HRT) and breast cancer. The phenomenon of interest has been investigated in many studies over a considerable time, and different results were reported. In 1992 a meta-analysis by Sillero-Arenas et al. concluded a small, but significant overall effect of 1.06 on the relative risk scale. Using the meta-likelihood approach, it is demonstrated here that this meta-analysis is subject to considerable unobserved heterogeneity. Furthermore, it is shown that new methods are available to model this heterogeneity successfully. It is further argued that available study covariates should be included to explain this heterogeneity in the meta-analysis at hand. CONCLUSIONS: The topic of HRT and breast cancer has again very recently become an issue of public debate, when results of a large trial investigating the health effects of hormone replacement therapy were published, indicating an increased risk for breast cancer (risk ratio of 1.26). Using an adequate regression model in the previously published meta-analysis, an adjusted effect estimate of 1.14 can be given, which is considerably higher than the one published in the meta-analysis of Sillero-Arenas et al. In summary, it is hoped that the method suggested here contributes further to good meta-analytic practice in public health and clinical disciplines.
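
The abstract does not spell out the meta-likelihood. One common way of writing a nonparametric (finite-mixture) random-effects meta-likelihood with study-level covariates, on which approaches of this kind build, is given below; the notation is chosen here for illustration and is not necessarily the paper's.

```latex
% A nonparametric mixture meta-likelihood with study-level covariates x_i
% (illustrative notation, not necessarily the paper's).
L(\beta,\theta_1,\dots,\theta_m,p_1,\dots,p_m)
  \;=\; \prod_{i=1}^{k} \sum_{j=1}^{m} p_j\,
        \phi\!\bigl(\hat\theta_i ;\; \theta_j + x_i^{\top}\beta,\; \sigma_i^{2}\bigr),
\qquad \sum_{j=1}^{m} p_j = 1 .
```

Here θ̂_i is the estimated effect (e.g. the log relative risk) in study i with standard error σ_i, φ is the normal density, and the support points θ_j with weights p_j form the discrete mixing distribution that captures the unobserved heterogeneity.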

Relevance:

30.00%

Publisher:

Abstract:

When people monitor a visual stream of rapidly presented stimuli for two targets (T1 and T2), they often miss T2 if it falls into a time window of about half a second after T1 onset, the attentional blink. However, if T2 immediately follows T1, performance is often reported to be as good as that at long lags, the so-called Lag-1 sparing effect. Two experiments investigated the mechanisms underlying this effect. Experiment 1 showed that, at Lag 1, requiring subjects to correctly report both the identity and the temporal order of the targets produces relatively good performance on T2 but relatively bad performance on T1. Experiment 2 confirmed that subjects often confuse target order at short lags, especially if the two targets are equally easy to discriminate. The results suggest that, if two targets appear in close succession, they compete for attentional resources. If the two competitors are of unequal strength, the stronger one is more likely to win and be reported at the expense of the other. If the two are equally strong, however, they will often be integrated into the same attentional episode and thus both gain access to attentional resources. But this comes at a cost, as it eliminates information about the targets' temporal order.

Relevance:

30.00%

Publisher:

Abstract:

Bayesian Model Averaging (BMA) is used to test for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including stationary, stationary-around-trend and unit root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks in 14 annual macroeconomic series and 11 natural resource price series. The results indicate that there are structural breaks in all of the natural resource series and most of the macroeconomic series. Many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
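
The conjugate normal-gamma setup makes the marginal likelihood of each candidate break date available in closed form, which is what allows posterior break probabilities to be computed and averaged over specifications. The sketch below illustrates the idea for the simplest possible case, a single mean shift; it ignores the trend and unit-root specifications, multiple breaks, and lag lengths that the paper averages over, and all priors and data are hypothetical.

```python
# Simplified illustration of posterior break-date probabilities under a
# conjugate normal-gamma prior, for a single mean shift only.
import numpy as np
from scipy.special import gammaln

def log_marginal(y, mu0=0.0, kappa0=0.01, a0=0.5, b0=0.5):
    """Closed-form log marginal likelihood of y ~ N(mu, sigma^2) under a
    normal-gamma prior on (mu, 1/sigma^2) with parameters (mu0, kappa0, a0, b0)."""
    y = np.asarray(y, dtype=float)
    n, ybar = y.size, y.mean()
    kappan = kappa0 + n
    an = a0 + n / 2.0
    bn = (b0 + 0.5 * np.sum((y - ybar) ** 2)
          + kappa0 * n * (ybar - mu0) ** 2 / (2.0 * kappan))
    return (-0.5 * n * np.log(2.0 * np.pi) + 0.5 * np.log(kappa0 / kappan)
            + gammaln(an) - gammaln(a0) + a0 * np.log(b0) - an * np.log(bn))

def break_probabilities(y, min_seg=5):
    """Posterior probability of a mean break at each admissible date, assuming
    a uniform prior over candidate break dates."""
    taus = list(range(min_seg, len(y) - min_seg))
    logml = np.array([log_marginal(y[:t]) + log_marginal(y[t:]) for t in taus])
    probs = np.exp(logml - logml.max())
    return taus, probs / probs.sum()

# Synthetic series with a shift in mean after observation 60.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
taus, probs = break_probabilities(y)
print("most probable break date:", taus[int(np.argmax(probs))])
```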