942 results for Scientific workflow
Abstract:
To investigate the effects of the middle atmosphere on climate, the World Climate Research Programme is supporting the project “Stratospheric Processes and their Role in Climate” (SPARC). A central theme of SPARC, the examination of model simulations of the coupled troposphere-middle atmosphere system, is being carried out through the initiative called GRIPS (GCM-Reality Intercomparison Project for SPARC). In this paper, an overview of the objectives of GRIPS is given. Initial activities include an assessment of the performance of middle atmosphere climate models, and preliminary results from this evaluation are presented here. It is shown that although all 13 models evaluated represent most major features of the mean atmospheric state, there are deficiencies in the magnitude and location of the features, which cannot easily be traced to the formulation (resolution or the parameterizations included) of the models. Most models show a cold bias in all locations, apart from the tropical tropopause region where they can be either too warm or too cold. The strengths and locations of the major jets are often misrepresented in the models. Looking at three-dimensional fields reveals, for some models, more severe deficiencies in the magnitude and positioning of the dominant structures (such as the Aleutian high in the stratosphere), although undersampling might explain some of these differences from observations. All the models have shortcomings in their simulations of the present-day climate, which might limit the accuracy of predictions of the climate response to ozone change and other anomalous forcing.
Abstract:
Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻². Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (-0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (-0.06 W m⁻² with 90% uncertainty bounds of -1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently at different stages with regard to the feasibility of near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.
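The headline numbers above are best estimates with asymmetric 90% uncertainty bounds. As a rough illustration of how terms of that form can be combined, here is a small Monte Carlo sketch in Python; the triangular distributions and the second, cloud-plus-cryosphere term are assumptions made for the example, not the assessment's actual methodology or values.

```python
# Illustrative Monte Carlo combination of two forcing terms, each given as a
# best estimate with 90% uncertainty bounds (in W m^-2). The triangular
# distributions and the "other" term are assumptions for this sketch only.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def sample(best, lo, hi):
    # crude stand-in: a triangular distribution spanning the 90% bounds
    return rng.triangular(lo, best, hi, size=N)

direct = sample(0.71, 0.08, 1.27)    # direct BC forcing (values from the abstract)
other  = sample(0.40, -0.60, 1.20)   # hypothetical cloud + cryosphere term

total = direct + other
lo95, hi95 = np.percentile(total, [5, 95])
print(f"median {np.median(total):+.2f} W m^-2, 90% bounds ({lo95:+.2f}, {hi95:+.2f})")
```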
Abstract:
As the Enlightenment drew to a close, translation had gradually acquired an increasingly important role in the international circulation and transmission of scientific knowledge. Yet comparatively little attention has been paid to the translators responsible for making such accounts accessible in other languages, some of whom were women. In this article I explore how European women cast themselves as intellectually enquiring, knowledgeable and authoritative figures in their translations. Focusing specifically on the genre of scientific travel writing, I investigate the narrative strategies deployed by women translators to mark their involvement in the process of scientific knowledge-making. These strategies ranged from rhetorical near-invisibility, driven by women's modest marginalization of their own public engagement in science, to the active advertisement of themselves as intellectually curious consumers of scientific knowledge. A detailed study of Elizabeth Helme's translation of the French ornithologist François Le Vaillant's Voyage dans l'intérieur de l'Afrique [Voyage into the Interior of Africa] (1790) allows me to explore how her reworking of the original text for an Anglophone reading public enabled her to engage cautiously – or sometimes more openly – with questions regarding how scientific knowledge was constructed, for whom and with which aims in mind.
Abstract:
The purpose of this study was to develop an understanding of the current state of scientific data sharing that stakeholders could use to develop and implement effective data sharing strategies and policies. The study developed a conceptual model to describe the process of data sharing, and the drivers, barriers, and enablers that determine stakeholder engagement. The conceptual model was used as a framework to structure discussions and interviews with key members of all stakeholder groups. Analysis of data obtained from interviewees identified a number of themes that highlight key requirements for the development of a mature data sharing culture.
Abstract:
SOA (Service Oriented Architecture), workflow, the Semantic Web, and Grid computing are key enabling information technologies in the development of increasingly sophisticated e-Science infrastructures and application platforms. While the emergence of Cloud computing as a new computing paradigm has provided new directions and opportunities for e-Science infrastructure development, it also presents some challenges. Scientific research increasingly finds it difficult to handle “big data” using traditional data processing techniques. Such challenges demonstrate the need for a comprehensive analysis of how these informatics techniques can be used to develop appropriate e-Science infrastructures and platforms in the context of Cloud computing. This survey paper describes recent research advances in applying informatics techniques to facilitate scientific research, particularly from the Cloud computing perspective. Our particular contributions include identifying associated research challenges and opportunities, presenting lessons learned, and describing our future vision for applying Cloud computing to e-Science. We believe our findings can help indicate future trends in e-Science and can inform funding and research directions on how to employ computing technologies in scientific research more appropriately. We point out open research issues in the hope of sparking new development and innovation in the e-Science field.
Abstract:
Changes in users’ requirements drive the evolution of an information system. Such evolution, in turn, moves the atomic services that provide its functional operations from one state of composition to another. A challenging issue in this evolution is ensuring that the resulting service composition remains rational. This paper presents a method called Service Composition Atomic-Operation Set (SCAOS). SCAOS defines 2 classes of atomic operations and 13 kinds of basic service compositions to support the state-change process using Workflow Nets. The workflow net provides algorithmic capabilities to compose the required services rationally and to keep any change to the services in a different composition rational as well. The method improves the adaptability of information systems to the ever-changing business requirements of dynamic environments.
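The paper's own atomic operations and Workflow-Net formalism are not reproduced in this abstract; the sketch below only illustrates the general idea of evolving a service composition with atomic operations while checking that it stays well-formed. The operation names and the reachability-based "rationality" check are assumptions for illustration, not SCAOS's definitions.

```python
# Minimal sketch: a service composition as a graph of services, one illustrative
# atomic operation (add_service), and a "rationality" check meaning every
# service is reachable from 'start' and can reach 'end'. Names and the check
# itself are assumptions, not SCAOS's actual operation set or semantics.

def reachable(graph, src):
    seen, stack = set(), [src]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, ()))
    return seen

def is_rational(graph):
    forward = reachable(graph, "start")          # everything start can trigger
    reverse = {}
    for n, succs in graph.items():
        reverse.setdefault(n, [])
        for s in succs:
            reverse.setdefault(s, []).append(n)
    backward = reachable(reverse, "end")         # everything that can reach end
    nodes = set(graph) | {s for succs in graph.values() for s in succs}
    return nodes <= forward and nodes <= backward

def add_service(graph, name, after, before):
    # atomic operation: insert a new service on a branch between two services
    graph.setdefault(after, []).append(name)
    graph.setdefault(name, []).append(before)

composition = {"start": ["check_order"], "check_order": ["end"], "end": []}
add_service(composition, "charge_payment", after="check_order", before="end")
print(is_rational(composition))   # True: the evolved composition is still well-formed
```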
Abstract:
The typographical naivety of much scientific legibility research has caused designers to question the value of the research and the results. Examining the reasons underlying this questioning, the paper discusses the importance of designers being more accepting of scientific findings, and why legibility investigations have value. To demonstrate how typographic knowledge can be incorporated into the design of studies to increase their validity, the paper reports on a new investigation into the role of serifs when viewed at a distance. The experiment looks into the identification of the lowercase letters ‘j’, ‘i’, ‘l’, ‘b’, ‘h’, ‘n’, ‘u’, and ‘a’ in isolation. All of the letters originate in the same typeface and are presented in one version with serifs and one version without serifs. Although the experiment found no overall legibility difference between the sans serif and the serif versions, the study showed that letters with serifs placed on the vertical extremes were more legible at a distance than the same letters in a sans serif. These findings can therefore provide specific guidance on the design of individual letters and demonstrate the product of collaboration between designer and scientist on the planning, implementation, and analysis of the study.
Abstract:
The automatic transformation of sequential programs for efficient execution on parallel computers involves a number of analyses and restructurings of the input. Some of these analyses are based on computing array sections, a compact description of a range of array elements. Array sections describe the set of array elements that are either read or written by program statements. These sections can be compactly represented using shape descriptors such as regular sections, simple sections, or generalized convex regions. However, binary operations such as Union performed on these representations do not satisfy a straightforward closure property, e.g., if the operands to Union are convex, the result may be nonconvex. Approximations are resorted to in order to satisfy this closure property. These approximations introduce imprecision in the analyses and, furthermore, the imprecisions resulting from successive operations have a cumulative effect. Delayed merging is a technique suggested and used in some of the existing analyses to minimize the effects of approximation. However, this technique does not guarantee an exact solution in a general setting. This article presents a generalized technique to precisely compute Union which can overcome these imprecisions.
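To make the closure problem concrete, the following sketch uses rectangular ("regular") sections and over-approximates their union with a per-dimension bounding box, the kind of approximation whose cumulative imprecision the article addresses; it is not the article's exact-union technique.

```python
# Minimal sketch of the closure problem for array sections. A "regular section"
# is a per-dimension (low, high) range; the union of two sections is generally
# not itself a regular section, so analyses over-approximate it with the
# bounding box, introducing imprecision.
from typing import List, Tuple

Section = List[Tuple[int, int]]   # one inclusive (low, high) range per dimension

def union_approx(a: Section, b: Section) -> Section:
    # bounding-box over-approximation of the union
    return [(min(al, bl), max(ah, bh)) for (al, ah), (bl, bh) in zip(a, b)]

def size(s: Section) -> int:
    n = 1
    for lo, hi in s:
        n *= hi - lo + 1
    return n

# A[0:9][0] and A[0][0:9]: two thin slices of a 10x10 array
row = [(0, 9), (0, 0)]
col = [(0, 0), (0, 9)]
u = union_approx(row, col)
print(u, size(u))                  # [(0, 9), (0, 9)] 100 elements in the approximation
print(size(row) + size(col) - 1)   # 19 elements actually accessed
```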
Abstract:
Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools and their adoption is growing in popularity. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.
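For context, FreeSurfer's per-subject statistics are plain-text files; the sketch below shows the kind of manual parsing step that a workflow plug-in such as K-Surfer automates. The file path and the assumption that column names follow a "# ColHeaders" comment line are illustrative and do not reflect K-Surfer's implementation.

```python
# Rough illustration of parsing a FreeSurfer per-subject stats file
# (e.g. stats/aseg.stats) into a table by hand -- the repetitive step a
# workflow plug-in automates. Path and column handling are assumptions.
import pandas as pd

def read_freesurfer_stats(path: str) -> pd.DataFrame:
    columns, rows = None, []
    with open(path) as fh:
        for line in fh:
            if line.startswith("# ColHeaders"):
                columns = line.split()[2:]          # names listed after '# ColHeaders'
            elif not line.startswith("#") and line.strip():
                rows.append(line.split())           # data rows are whitespace-delimited
    return pd.DataFrame(rows, columns=columns)

# Hypothetical usage with a subject directory:
# df = read_freesurfer_stats("subjects/subj01/stats/aseg.stats")
# print(df[["StructName", "Volume_mm3"]].head())
```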
Abstract:
An expert panel was convened in October 2013 by the International Scientific Association for Probiotics and Prebiotics (ISAPP) to discuss the field of probiotics. It is now 13 years since the definition of probiotics was formulated and 12 years since guidelines were published for regulators, scientists and industry by the Food and Agriculture Organization of the United Nations and the WHO (FAO/WHO). The FAO/WHO definition of a probiotic—“live microorganisms which when administered in adequate amounts confer a health benefit on the host”—was reinforced as relevant and sufficiently accommodating for current and anticipated applications. However, inconsistencies between the FAO/WHO Expert Consultation Report and the FAO/WHO Guidelines were clarified to take into account advances in science and applications. A more precise use of the term 'probiotic' will be useful to guide clinicians and consumers in differentiating the diverse products on the market. This document represents the conclusions of the ISAPP consensus meeting on the appropriate use and scope of the term probiotic.