967 results for Research output


Relevance:

30.00%

Publisher:

Abstract:

While scientific research and its methodologies have undergone substantial technological evolution, the technology involved in publishing the results of these endeavors has remained relatively stagnant. Publication is largely done in the same manner today as it was fifty years ago. Many journals have adopted electronic formats; however, their orientation and style differ little from a printed document. The documents tend to be static and take little advantage of the computational resources that might be available. Recent work, Gentleman and Temple Lang (2004), suggests a methodology and basic infrastructure that can be used to publish documents in a substantially different way. Their approach is suitable for the publication of papers whose message relies on computation. Stated simply, Gentleman and Temple Lang propose a paradigm where documents are mixtures of code and text. Such documents may be self-contained, or they may be a component of a compendium that provides the infrastructure needed to give access to data and supporting software. These documents, or compendiums, can be processed in a number of different ways. One transformation is to replace the code with its output, thereby providing the familiar, but limited, static document. In this paper we apply these concepts to a seminal paper in bioinformatics, namely The Molecular Classification of Cancer, Golub et al. (1999). The authors of that paper have generously provided data and other information that have allowed us to largely reproduce their results. Rather than reproduce the paper exactly, we demonstrate that such a reproduction is possible and instead concentrate on demonstrating the usefulness of the compendium concept itself.
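A minimal sketch of the "replace the code with its output" transformation described above, assuming an invented <<< >>> chunk delimiter rather than Gentleman and Temple Lang's actual Sweave-based infrastructure: each embedded code chunk is executed and spliced back into the text as its printed output, yielding the familiar static document.

```python
# Sketch only: the <<< >>> delimiters are invented for illustration.
import io
import re
from contextlib import redirect_stdout

CHUNK = re.compile(r"<<<(.*?)>>>", re.DOTALL)

def render(document: str) -> str:
    """Execute each embedded code chunk and replace it with its
    printed output, producing the static form of the document."""
    env = {}  # shared namespace, so later chunks see earlier results

    def run(match: re.Match) -> str:
        buf = io.StringIO()
        with redirect_stdout(buf):
            exec(match.group(1), env)
        return buf.getvalue().rstrip()

    return CHUNK.sub(run, document)

source = """The mean expression value is <<<
import statistics
print(round(statistics.mean([4.1, 3.8, 5.2]), 2))
>>> across the three samples."""

print(render(source))
# -> The mean expression value is 4.37 across the three samples.
```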

Relevance:

30.00%

Publisher:

Abstract:

Markov chain Monte Carlo (MCMC) is a method of producing a correlated sample in order to estimate features of a complicated target distribution via simple ergodic averages. A fundamental question in MCMC applications is: when should the sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the MCMC sampling the first time the width of a confidence interval based on the ergodic averages is less than a user-specified value. Hence, calculating Monte Carlo standard errors is a critical step in assessing the output of the simulation. In particular, we consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We describe sufficient conditions for the strong consistency and asymptotic normality of both methods and investigate their finite-sample properties in a variety of examples.
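A minimal sketch of the batch means estimator and the fixed-width stopping rule described above; `draw_next` is a stand-in for an arbitrary MCMC kernel, and the regularity conditions the paper establishes are simply assumed to hold.

```python
import numpy as np

def batch_means_se(chain: np.ndarray) -> float:
    """Monte Carlo standard error of the chain mean from ~sqrt(n) batches."""
    n = len(chain)
    b = int(np.sqrt(n))                       # batch size
    a = n // b                                # number of batches
    means = chain[: a * b].reshape(a, b).mean(axis=1)
    var_hat = b * np.var(means, ddof=1)       # estimate of the asymptotic variance
    return float(np.sqrt(var_hat / n))

def run_until_width(draw_next, eps, z=1.96, n_min=1000, n_max=10**6):
    """Sample until the CI half-width z * se drops below eps (or n_max is hit)."""
    chain = [draw_next() for _ in range(n_min)]
    while len(chain) < n_max:
        if z * batch_means_se(np.asarray(chain)) < eps:
            break
        chain.extend(draw_next() for _ in range(100))  # re-check every 100 draws
    x = np.asarray(chain)
    return x.mean(), z * batch_means_se(x), len(x)

# Example: an AR(1) "sampler" whose stationary distribution is N(0, 1).
rng = np.random.default_rng(1)
state = 0.0
def draw_next():
    global state
    state = 0.5 * state + rng.normal(scale=np.sqrt(0.75))
    return state

print(run_until_width(draw_next, eps=0.05))
```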

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows.
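The abstract does not show MARVIN's API, so the following is only a generic sketch of the plug-together module pattern it describes; all class and method names are hypothetical, and a plain dictionary stands in for the common patient database.

```python
# Hypothetical names throughout; this is not MARVIN's actual API.
from abc import ABC, abstractmethod

class Module(ABC):
    """A unit of functionality that reads from and writes to a shared
    patient database, mirroring how the application modules cooperate."""
    @abstractmethod
    def process(self, db: dict) -> None: ...

class DicomImporter(Module):
    def process(self, db: dict) -> None:
        db.setdefault("images", []).append("ct_scan.dcm")  # stand-in for real DICOM I/O

class MeshGenerator(Module):
    def process(self, db: dict) -> None:
        db["mesh"] = f"surface mesh derived from {len(db.get('images', []))} image(s)"

class Application:
    """An application is an ordered pipeline of plugged-in modules."""
    def __init__(self, *modules: Module):
        self.modules = modules

    def run(self) -> dict:
        db: dict = {}  # the common patient database shared by all modules
        for m in self.modules:
            m.process(db)
        return db

print(Application(DicomImporter(), MeshGenerator()).run())
```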

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to analyze central motor output changes in relation to contraction force during motor fatigue. The triple stimulation technique (TST; Magistris et al., Brain 121(Pt 3):437-450, 1998) was used to quantify a central conduction index (CCI), the amplitude ratio of the central conduction response to the peripheral nerve response, both obtained simultaneously by the TST. The CCI removes effects of peripheral fatigue from the quantification and allows quantifying the percentage of the entire target muscle motor unit pool driven to discharge by a transcranial magnetic stimulus. Subjects (n = 23) performed repetitive maximal voluntary contractions (MVC) of abductor digiti minimi (duration 1 s, frequency 0.5 Hz) for 2 min. TST recordings were obtained every 15 s, using stimulation intensities sufficient to stimulate all cortical motor neurons (MNs) leading to the target muscle, and during voluntary contractions of 20% of the MVC to facilitate the responses. TST was also recorded repetitively during recovery. This basic exercise protocol was modified in a number of experiments to further characterize influences of motor fatigue on the CCI (4 min of exercise at 50% MVC; delayed fatigue recovery during local hemostasis; "stimulated exercise" by 20 Hz trains of 1 s duration at 0.5 Hz for 2 min). In addition, the cortical silent period was measured during the basic exercise protocol. Force declined to approximately 40% of MVC in all experiments and in all subjects. In all subjects, the CCI decreased during exercise, but this decrease varied markedly between subjects. On average, CCI reductions preceded force reductions during exercise, and CCI recovery preceded force recovery. Exercising at 50% MVC for 4 min reduced muscle force more markedly than the CCI. Hemostasis induced by a cuff delayed muscle force recovery, but not CCI recovery. Stimulated exercise reduced force markedly, but the CCI decreased only marginally. In summary, force reduction and CCI reduction were poorly related, both quantitatively and in time, and voluntary drive was particularly critical in reducing the CCI. The fatigue-induced reduction of the CCI may result from a central inhibitory phenomenon. That voluntary muscle activation is critical for the CCI reduction suggests a primarily supraspinal mechanism.
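A toy illustration of the CCI as defined above, the amplitude ratio of the central conduction response to the peripheral nerve response; the amplitude values are invented, not study data.

```python
# Illustration only; the amplitudes below are invented values.
def cci(central_amplitude_mv: float, peripheral_amplitude_mv: float) -> float:
    """Fraction of the target muscle motor unit pool driven to discharge
    by the transcranial stimulus (1.0 = the entire pool)."""
    return central_amplitude_mv / peripheral_amplitude_mv

print(f"CCI = {cci(6.3, 7.0):.2f}")  # -> CCI = 0.90, i.e. 90% of the pool
```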

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Cardiac output (CO) measurement with lithium dilution (COLD) has not been fully validated in sheep against precise ultrasonic flow probe technology (COUFP). Sheep provide important cardiovascular research models, and the use of COLD has become more popular in experimental settings. METHODS: Ultrasonic transit-time perivascular flow probes were surgically implanted on the pulmonary artery of 13 sheep. Paired COLD readings were taken at six time points, before and after implantation of a left ventricular assist device (LVAD), and compared with COUFP recorded just after lithium injection. RESULTS: The mean COLD was 5.7 litre min(-1) (range 3.8-9.6 litre min(-1)) and the mean COUFP 5.9 litre min(-1) (range 4.0-9.2 litre min(-1)). The bias (standard deviation) was 0.3 (1.0) litre min(-1) [5.1 (16.9)%] and the limits of agreement (LOA) were -1.7 to 2.3 litre min(-1) (-28.8 to 39.0%), with a percentage error (PE) of 34.4%. Data to assess trending [rate (95% confidence intervals)] included a 78 (62-93)% concordance rate in the four-quadrant plot (n=27). In the half-moon polar plot (n=19), the mean polar angle was +5°, the radial LOA were -49 to +35°, and 68 (47-89)% of data points fell within 22.5° of the mean polar angle. Both tests indicated moderate to poor trending ability. CONCLUSION: COLD is not precise when evaluated against COUFP in sheep based on the statistical criteria set, but the results are comparable with previously published animal studies.
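A sketch of the agreement statistics reported above (Bland-Altman bias, limits of agreement, and percentage error) for paired CO readings; the sample values are invented for illustration, not the study's measurements.

```python
import numpy as np

def agreement(co_test: np.ndarray, co_ref: np.ndarray):
    """Bland-Altman bias, 95% limits of agreement, and percentage error
    for a method-comparison study."""
    diff = co_test - co_ref
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    pe = 100 * 1.96 * sd / np.concatenate([co_test, co_ref]).mean()
    return bias, loa, pe

cold = np.array([5.1, 6.2, 4.4, 7.8, 5.9])   # hypothetical lithium dilution CO
coufp = np.array([5.4, 5.9, 4.8, 7.5, 6.3])  # hypothetical flow probe CO
bias, loa, pe = agreement(cold, coufp)
print(f"bias {bias:+.2f} L/min, LOA {loa[0]:.2f} to {loa[1]:.2f} L/min, PE {pe:.1f}%")
```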

Relevance:

30.00%

Publisher:

Abstract:

A small but growing body of literature searches for evidence of non-Keynesian effects of fiscal contractions; that is, some evidence exists that large fiscal contractions stimulate short-run economic activity. Our paper continues this research effort by systematically examining the effects, if any, of unusual fiscal events (either non-Keynesian results within a Keynesian model or Keynesian results within a neoclassical model) on short-run economic activity. We examine this issue within three separate models: a St. Louis equation, a Hall-type consumption equation, and a growth accounting equation. Our empirical findings are mixed and do not provide strong systematic support for the view that unusually large fiscal contractions or expansions reverse the effects of normal fiscal events. Moreover, we find only limited evidence that trigger points are empirically important.
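A stylized sketch of one of the three models, the St. Louis equation (output growth regressed on current and lagged money growth and a fiscal impulse), estimated on synthetic data; the paper's actual specification, data, and treatment of unusual fiscal events are not reproduced here.

```python
# Synthetic data and a schematic specification, for illustration only.
import numpy as np

def lagged(x: np.ndarray, lags: int) -> np.ndarray:
    """Columns x_t, x_{t-1}, ..., x_{t-lags}, aligned to t = lags..T-1."""
    return np.column_stack([x[lags - j : len(x) - j] for j in range(lags + 1)])

rng = np.random.default_rng(0)
T, lags = 120, 2
m = rng.normal(0.010, 0.005, T)                       # money growth
f = rng.normal(0.000, 0.010, T)                       # fiscal impulse
y = 0.002 + 0.8 * m + 0.1 * f + rng.normal(0, 0.004, T)  # output growth

X = np.column_stack([np.ones(T - lags), lagged(m, lags), lagged(f, lags)])
beta, *_ = np.linalg.lstsq(X, y[lags:], rcond=None)

# Summed coefficients measure the total short-run effect; a significantly
# positive fiscal sum is the "Keynesian" result the paper tests against.
print("money effect:", beta[1 : lags + 2].sum())
print("fiscal effect:", beta[lags + 2 :].sum())
```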

Relevance:

30.00%

Publisher:

Abstract:

Objective: The present study offers a novel methodological contribution to the study of the configuration and dynamics of research groups, through a comparative perspective of funded projects (input) and publication co-authorships (output). Method: A combination of bibliometric techniques and social network analysis was applied to a case study: the Departamento de Bibliotecología (DHUBI), Universidad Nacional de La Plata, Argentina, for the period 2000-2009. The results were interpreted statistically, and staff members of the department were interviewed. Results: The method makes it possible to distinguish groups, identify their members, and reflect group make-up through an analytical strategy that involves the categorization of actors and the interdisciplinary and national or international projection of the networks that they configure. The integration of these two aspects (input and output) at different points in time over the analyzed period leads to inferences about group profiles and the roles of actors. Conclusions: The methodology presented is conducive to micro-level interpretations in a given area of study, regarding individual researchers or research groups. Because the comparative input-output analysis broadens the base of information and makes it possible to follow individual and group trends over time, it may prove very useful for the management, promotion, and evaluation of science.
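A small sketch of the output (co-authorship) side of the method: building a weighted co-authorship network from publication author lists and reading off simple group indicators. The names and publication lists are invented, not DHUBI data.

```python
import itertools
import networkx as nx

# Hypothetical author lists, one per publication.
publications = [
    ["Perez", "Gomez", "Diaz"],
    ["Perez", "Gomez"],
    ["Diaz", "Lopez"],
]

G = nx.Graph()
for authors in publications:
    for a, b in itertools.combinations(sorted(authors), 2):
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)   # edge weight = co-authored papers

# Connected components approximate research groups; weighted degree gives
# a crude measure of each actor's centrality within the network.
for group in nx.connected_components(G):
    print(sorted(group))
print(dict(G.degree(weight="weight")))
```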

Relevance:

30.00%

Publisher:

Abstract:

A nested ice flow model was developed for eastern Dronning Maud Land to assist with the dating and interpretation of the EDML deep ice core. The model consists of a high-resolution higher-order ice dynamic flow model nested into a comprehensive 3-D thermomechanical model of the whole Antarctic ice sheet. As the drill site is in a flank position, the calculations specifically take into account the effects of horizontal advection, as the deeper ice in the core originated further inland. First, the regional velocity field and ice sheet geometry are obtained from a forward experiment over the last 8 glacial cycles. The result is subsequently employed in a Lagrangian backtracing algorithm that follows particle paths back to their time and place of deposition. The procedure directly yields the depth-age distribution, the surface conditions at the particle origin, and a suite of relevant parameters such as the initial annual layer thickness. This paper discusses the method and the main results of the experiment, including the ice core chronology, the non-climatic corrections needed to extract the climatic part of the signal, and the thinning function. The focus is on the upper 89% of the ice core (approx. 170 kyr), as the dating below that is increasingly less robust owing to the unknown value of the geothermal heat flux. It is found that the temperature biases resulting from variations of surface elevation are up to half the magnitude of the climatic changes themselves.
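A schematic sketch of the Lagrangian backtracing step described above: a particle is integrated backwards in time through a velocity field until it surfaces, recovering its place and time of deposition. The analytic toy velocity field below is an assumption for illustration; the actual model uses the nested higher-order flow solution.

```python
# Toy flank flow field, invented for illustration only.
def velocity(x: float, z: float) -> tuple[float, float]:
    """u in m/yr; w in relative-height units per yr
    (z = 0 at the bed, z = 1 at the surface)."""
    u = 2.0                       # horizontal velocity towards the margin
    w = -1.0e-4 * z               # slow submergence under accumulation
    return u, w

def backtrace(x0: float, z0: float, dt: float = 10.0, t_max: float = 5.0e5):
    """Integrate backwards in time (simple Euler steps) until the particle
    reaches the surface; return its deposition site and age."""
    x, z, t = x0, z0, 0.0
    while z < 1.0 and t < t_max:
        u, w = velocity(x, z)
        x -= u * dt               # step backwards along the flow
        z -= w * dt
        t += dt
    return x, t

x_origin, age = backtrace(x0=0.0, z0=0.3)
print(f"deposited {-x_origin / 1000:.1f} km upstream, {age:,.0f} yr ago")
```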

Relevance:

30.00%

Publisher:

Abstract:

In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's Orbview-2 mission, MODIS on NASA's Aqua mission, and MERIS on ESA's ENVISAT mission.

In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, to act as a channel to the broader end-user community, and to provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon-cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community.

The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006. Error statistics and inter-sensor biases were quantified by comparison with in-situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 Tb of input (level 2) data have been ingested and 14 Tb of intermediate and output products created, with 4 Tb of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in-situ data collection or interesting oceanographic phenomena. The Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available to the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html.

The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction, and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products. These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them in order to assimilate the ocean colour data into ocean simulations.

An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used. Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour WWW site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
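One simple merging scheme consistent with the description above, error-weighted averaging with propagated uncertainty; GlobColour's actual merging methods (e.g. the GSM bio-optical model) are more elaborate, and the retrieval values below are hypothetical.

```python
import numpy as np

def merge(values: np.ndarray, sigmas: np.ndarray):
    """Inverse-variance weighted mean of co-located sensor retrievals,
    skipping NaNs (sensor did not observe the pixel that day)."""
    ok = ~np.isnan(values)
    w = 1.0 / sigmas[ok] ** 2
    merged = np.sum(w * values[ok]) / np.sum(w)
    merged_sigma = np.sqrt(1.0 / np.sum(w))   # propagated error estimate
    return merged, merged_sigma

# Hypothetical chlorophyll-a retrievals (mg/m^3) for one 4.6 km pixel.
chl = np.array([0.31, 0.27, np.nan])          # SeaWiFS, MODIS, MERIS (no view)
sig = np.array([0.05, 0.04, 0.06])            # per-sensor error statistics
print(merge(chl, sig))
```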

Relevance:

30.00%

Publisher:

Abstract:

The BSRN Toolbox is a software package supplied by the WRMC and freely available to all station scientists and data users. The main features of the package include a download manager for Station-to-Archive files, a tool to convert files into human-readable TAB-separated ASCII tables (similar to those output by the PANGAEA database), and a tool to check data sets for violations of the "BSRN Global Network recommended QC tests, V2.0" quality criteria. The latter tool creates quality codes, one per measured value, indicating whether the data are "physically possible" or "extremely rare," or whether "intercomparison limits are exceeded." In addition, auxiliary data such as the solar zenith angle or global radiation calculated from the diffuse and direct components can be output. All output from the QC tool can be visualized using PanPlot (doi:10.1594/PANGAEA.816201).
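A sketch of the per-value quality coding the QC tool performs, based on the "physically possible" and "extremely rare" limit idea; the numeric thresholds below are placeholders (the real BSRN V2.0 limits depend on the solar zenith angle and the irradiance component), so this shows only the flagging logic.

```python
# Placeholder limits; not the official BSRN V2.0 thresholds.
def qc_flag(value: float, rare_lo: float, rare_hi: float,
            possible_lo: float, possible_hi: float) -> str:
    """Return one quality code per measured value, as the QC tool does."""
    if not possible_lo <= value <= possible_hi:
        return "physically impossible"
    if not rare_lo <= value <= rare_hi:
        return "extremely rare"
    return "ok"

# Hypothetical global shortwave irradiance checks (W/m^2).
for sw in (-10.0, 1250.0, 480.0):
    print(sw, "->", qc_flag(sw, rare_lo=-2, rare_hi=1100,
                            possible_lo=-4, possible_hi=1400))
```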