64 results for Lubrication and cooling techniques


Relevance: 100.00%

Abstract:

Emissions of exhaust gases and particles from oceangoing ships are a significant and growing contributor to the total emissions from the transportation sector. We present an assessment of the contribution of gaseous and particulate emissions from oceangoing shipping to anthropogenic emissions and air quality. We also assess the degradation in human health and climate change created by these emissions. Regulating ship emissions requires comprehensive knowledge of current fuel consumption and emissions, understanding of their impact on atmospheric composition and climate, and projections of potential future evolutions and mitigation options. Nearly 70% of ship emissions occur within 400 km of coastlines, causing air quality problems through the formation of ground-level ozone, sulphur emissions and particulate matter in coastal areas and harbours with heavy traffic. Furthermore, ozone and aerosol precursor emissions from ships, as well as their derivative species, may be transported in the atmosphere over several hundreds of kilometres, and thus contribute to air quality problems further inland, even though they are emitted at sea. In addition, ship emissions impact climate. Recent studies indicate that the cooling due to altered clouds far outweighs the warming effects from greenhouse gases such as carbon dioxide (CO2) or ozone from shipping, overall causing a negative present-day radiative forcing (RF). Current efforts to reduce sulphur and other pollutants from shipping may modify this. However, given the short residence time of sulphate compared to CO2, the climate response from sulphate is on the order of decades, while that of CO2 is on the order of centuries. The climatic trade-off between positive and negative radiative forcing is still a topic of scientific research, but from what is currently known, a simple cancellation of global mean forcing components is potentially inappropriate and a more comprehensive assessment metric is required. The CO2-equivalent emissions using the global temperature change potential (GTP) metric indicate that after 50 years the net global mean effect of current emissions is close to zero through cancellation of warming by CO2 and cooling by sulphate and nitrogen oxides.
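The near-zero net effect described here can be sketched as simple CO2-equivalent bookkeeping under the GTP metric. All numbers below are placeholders chosen purely to illustrate the cancellation; they are not the study's emission totals or GTP(50) values.

```python
# Illustrative CO2-equivalent bookkeeping with the GTP metric. All numbers
# below are placeholders chosen to illustrate the cancellation; they are
# not the study's emission totals or GTP(50) values.
gtp50 = {"CO2": 1.0, "SO2": -10.0, "NOx": -5.0}          # per-species GTP(50)
emissions = {"CO2": 1000.0, "SO2": 50.0, "NOx": 100.0}   # annual emissions

co2_equivalent = sum(emissions[s] * gtp50[s] for s in emissions)
# Warming by CO2 (+1000) is offset by cooling from sulphate (-500) and
# NOx (-500), giving a net effect close to zero.
```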

Relevance: 100.00%

Abstract:

It is becoming increasingly important to be able to verify the spatial accuracy of precipitation forecasts, especially with the advent of high-resolution numerical weather prediction (NWP) models. In this article, the fractions skill score (FSS) approach has been used to perform a scale-selective evaluation of precipitation forecasts during 2003 from the Met Office mesoscale model (12 km grid length). The investigation shows how skill varies with spatial scale, the scales over which the data assimilation (DA) adds most skill, and how the loss of that skill is dependent on both the spatial scale and the rainfall coverage being examined. Although these results come from a specific model, they demonstrate how this verification approach can provide a quantitative assessment of the spatial behaviour of new finer-resolution models and DA techniques.
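The FSS compares forecast and observed fractional coverage of threshold exceedance in neighbourhoods of increasing size, so skill can be read as a function of spatial scale. A minimal sketch of the standard formulation (FSS = 1 − MSE/MSE_ref), with illustrative function names rather than the Met Office implementation:

```python
import numpy as np

def _fractions(binary, n):
    # Neighbourhood fractions via an integral image (n x n box average)
    pad = n // 2
    b = np.pad(binary, pad)
    c = np.pad(np.cumsum(np.cumsum(b, axis=0), axis=1), ((1, 0), (1, 0)))
    h, w = binary.shape
    win = c[n:n + h, n:n + w] - c[:h, n:n + w] - c[n:n + h, :w] + c[:h, :w]
    return win / (n * n)

def fss(forecast, observed, threshold, n):
    # Binary exceedance grids -> neighbourhood fractions -> skill score
    f = _fractions((forecast >= threshold).astype(float), n)
    o = _fractions((observed >= threshold).astype(float), n)
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)  # no-skill reference
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

field = np.zeros((10, 10))
field[4:6, 4:6] = 2.0                       # a small rain object
perfect_score = fss(field, field, threshold=1.0, n=3)   # perfect match -> 1
```

A spatially displaced forecast scores poorly at grid scale but recovers skill as the neighbourhood grows, which is exactly the scale-selective behaviour the article exploits.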

Relevance: 100.00%

Abstract:

This article examines utopian gestures and inaugural desires in two films which became symbolic of the Brazilian Film Revival in the late 1990s: Central Station (1998) and Midnight (1999). Both revolve around the idea of an overcrowded or empty centre in a country trapped between past and future, in which the motif of the zero stands for both the announcement and the negation of utopia. The analysis draws parallels between them and new wave films which also elaborate on the idea of the zero, with examples picked from Italian neo-realism, the Brazilian Cinema Novo and the New German Cinema. In Central Station, the ‘point zero’, or the core of the homeland, is retrieved in the archaic backlands, where political issues are resolved in the private sphere and the social drama turns into family melodrama. Midnight, in its turn, recycles Glauber Rocha’s utopian prophecies in the new millennium’s hour zero, when the earthly paradise represented by the sea is re-encountered by the middle-class character, but not by the poor migrant. In both cases, public injustice is compensated by the heroes’ personal achievements, but those do not refer to the real nation, its history or society. Their utopian breadth, based on nostalgia, citation and genre techniques, is of a virtual kind, attuned to cinema only.

Relevance: 100.00%

Abstract:

In this paper, various types of fault detection methods for fuel cells are compared, including model-based approaches, data-driven approaches, and combinations of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT) and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is also part of the classification problem on magnetic field measurements. This is independent of the particular working mode of the cell but influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay of the singular values for three examples of fault classes.
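Fisher's linear discriminant, the classifier named above, projects measurement vectors onto the single direction that best separates two classes. A minimal two-class sketch; the synthetic `healthy`/`faulty` vectors are illustrative stand-ins for magnetic field measurements, not data from the paper.

```python
import numpy as np

def fisher_direction(X0, X1):
    # Direction w maximizing between-class over within-class scatter
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (summed class covariances)
    S_w = np.cov(X0.T, bias=True) * len(X0) + np.cov(X1.T, bias=True) * len(X1)
    w = np.linalg.solve(S_w, m1 - m0)
    return w / np.linalg.norm(w)

def classify(x, w, X0, X1):
    # Assign x to the class whose projected mean is nearer
    d0 = abs(w @ x - w @ X0.mean(axis=0))
    d1 = abs(w @ x - w @ X1.mean(axis=0))
    return 0 if d0 < d1 else 1

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 0.1, (50, 3))   # nominal operation (synthetic)
faulty = rng.normal(1.0, 0.1, (50, 3))    # one fault class (synthetic)
w = fisher_direction(healthy, faulty)
```

When the measurement operator is ill-posed, as the abstract notes, `S_w` becomes nearly singular and the solve step requires regularization, which is where the classification and tomography problems meet.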

Relevance: 100.00%

Abstract:

Modern neurostimulation approaches in humans provide controlled inputs into the operations of cortical regions, with highly specific behavioral consequences. This enables causal structure–function inferences, and in combination with neuroimaging, has provided novel insights into the basic mechanisms of action of neurostimulation on distributed networks. For example, more recent work has established the capacity of transcranial magnetic stimulation (TMS) to probe causal interregional influences, and their interaction with cognitive state changes. Combinations of neurostimulation and neuroimaging now face the challenge of integrating the known physiological effects of neurostimulation with theoretical and biological models of cognition, for example, when theoretical stalemates between opposing cognitive theories need to be resolved. This will be driven by novel developments, including biologically informed computational network analyses for predicting the impact of neurostimulation on brain networks, as well as novel neuroimaging and neurostimulation techniques. Such future developments may offer an expanded set of tools with which to investigate structure–function relationships, and to formulate and reconceptualize testable hypotheses about complex neural network interactions and their causal roles in cognition.

Relevance: 100.00%

Abstract:

Interactions between different convection modes can be investigated using an energy-cycle description under a framework of mass-flux parameterization. The present paper systematically investigates this system by taking a limit of two modes: shallow and deep convection. Shallow convection destabilizes itself as well as the other convective modes by moistening and cooling the environment, whereas deep convection stabilizes itself as well as the other modes by drying and warming the environment. As a result, shallow convection leads to a runaway growth process in its stand-alone mode, whereas deep convection simply damps out. Because of these opposing tendencies, the interaction between the two convective modes becomes a rich problem, even when it is limited to the case with no large-scale forcing. Only if the two modes are coupled at a proper level can a self-sustaining system arise, exhibiting a periodic cycle. The present study establishes the conditions for self-sustaining periodic solutions. It carefully documents the behaviour of the two-mode system in order to facilitate the interpretation of global model behaviours when this energy cycle is implemented as a closure in a convection parameterization in the future.
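The qualitative behaviour described, one self-amplifying mode and one self-damping mode that sustain a periodic cycle only when coupled, can be illustrated with a predator-prey-style analogue. This is not the paper's actual energy-cycle closure, and every coefficient below is a placeholder.

```python
# Predator-prey-style analogue (illustrative only): shallow-mode energy ks
# grows on its own and energizes deep convection; deep-mode energy kd damps
# on its own and suppresses shallow convection.
def step(ks, kd, dt=0.01, a=1.0, b=1.0, c=1.0, d=1.0):
    dks = (a - b * kd) * ks        # shallow: runaway alone, suppressed by deep
    dkd = (d * ks - c) * kd        # deep: damps alone, fed by shallow
    return ks + dt * dks, kd + dt * dkd

ks, kd = 1.5, 0.5
trajectory = [(ks, kd)]
for _ in range(5000):              # forward-Euler integration to t = 50
    ks, kd = step(ks, kd)
    trajectory.append((ks, kd))
# Coupled, the two modes neither run away nor damp out: both stay
# positive and cycle around the equilibrium (1, 1).
```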

Relevance: 100.00%

Abstract:

The coupled climate dynamics underlying large, rapid, and potentially irreversible changes in ice cover are studied. A global atmosphere–ocean–sea ice general circulation model with idealized aquaplanet geometry is forced by gradual multi-millennial variations in solar luminosity. The model traverses a hysteresis loop between warm ice-free conditions and cold glacial conditions in response to ±5 W m−2 variations in global, annual-mean insolation. Comparison of several model configurations confirms the importance of polar ocean processes in setting the sensitivity and time scales of the transitions. A “sawtooth” character is found with faster warming and slower cooling, reflecting the opposing effects of surface heating and cooling on upper-ocean buoyancy and, thus, effective heat capacity. The transition from a glacial to warm, equable climate occurs in about 200 years. In contrast to the “freshwater hosing” scenario, transitions are driven by radiative forcing and sea ice feedbacks. The ocean circulation, and notably the meridional overturning circulation (MOC), does not drive the climate change. The MOC (and associated heat transport) collapses poleward of the advancing ice edge, but this is a purely passive response to cooling and ice expansion. The MOC does, however, play a key role in setting the time scales of the transition and contributes to the asymmetry between warming and cooling.

Relevance: 100.00%

Abstract:

Many important drugs in the Chinese materia medica (CMM) are known to be toxic, and it has long been recognized in classical Chinese medical theory that toxicity can arise directly from the components of a single CMM or may be induced by an interaction between combined CMM. Traditional Chinese Medicine presents a unique set of pharmaceutical theories that include particular methods for processing, combining and decocting, and these techniques contribute to reducing toxicity as well as enhancing efficacy. The current classification of toxic CMM drugs, traditional methods for processing toxic CMM and the prohibited use of certain combinations, is based on traditional experience and ancient texts and monographs, but accumulating evidence increasingly supports their use to eliminate or reduce toxicity. Modern methods are now being used to evaluate the safety of CMM; however, a new system for describing the toxicity of Chinese herbal medicines may need to be established to take into account those herbs whose toxicity is delayed or otherwise hidden, and which have not been incorporated into the traditional classification. This review explains the existing classification and justifies it where appropriate, using experimental results often originally published in Chinese and previously not available outside China.

Relevance: 100.00%

Abstract:

An initial validation of the Along Track Scanning Radiometer (ATSR) Reprocessing for Climate (ARC) retrievals of sea surface temperature (SST) is presented. ATSR-2 and Advanced ATSR (AATSR) SST estimates are compared to drifting buoy and moored buoy observations over the period 1995 to 2008. The primary ATSR estimates are of skin SST, whereas buoys measure SST below the surface. Adjustment is therefore made for the skin effect, for diurnal stratification and for differences in buoy–satellite observation time. With such adjustments, satellite-in situ differences are consistent between day and night within ~ 0.01 K. Satellite-in situ differences are correlated with differences in observation time, because of the diurnal warming and cooling of the ocean. The data are used to verify the average behaviour of physical and empirical models of the warming/cooling rates. Systematic differences between adjusted AATSR and in-situ SSTs against latitude, total column water vapour (TCWV), and wind speed are less than 0.1 K, for all except the most extreme cases (TCWV < 5 kg m–2, TCWV > 60 kg m–2). For all types of retrieval except the nadir-only two-channel (N2), regional biases are less than 0.1 K for 80% of the ocean. Global comparison against drifting buoys shows night-time dual-view two-channel (D2) SSTs are warm by 0.06 ± 0.23 K and dual-view three-channel (D3) SSTs are warm by 0.06 ± 0.21 K (day-time D2: 0.07 ± 0.23 K). Nadir-only results are N2: 0.03 ± 0.33 K and N3: 0.03 ± 0.19 K, showing improved inter-algorithm consistency to ~ 0.02 K. This represents a marked improvement over the existing operational retrieval algorithms, for which inter-algorithm inconsistency is > 0.5 K. Comparison against tropical moored buoys, which are more accurate than drifting buoys, gives lower error estimates (N3: 0.02 ± 0.13 K, D2: 0.03 ± 0.18 K). Comparable results are obtained for ATSR-2, except that the ATSR-2 SSTs are around 0.1 K warm compared to AATSR.

Relevance: 100.00%

Abstract:

Future climate change projections are often derived from ensembles of simulations from multiple global circulation models using heuristic weighting schemes. This study provides a more rigorous justification for this by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
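The model-dependence question at the heart of these frameworks can be sketched as a one-way ANOVA across models, each contributing several ensemble realizations of a scalar response. The data below are synthetic placeholders, not CMIP5 output.

```python
import numpy as np

def anova_f(groups):
    # One-way ANOVA F statistic across models ("groups")
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(float(((g - np.mean(g)) ** 2).sum()) for g in groups)
    # F = between-model variance / within-model (internal) variability
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(42)
# Shared true response: low F, so the simpler additive framework applies
# and the multimodel mean estimates the actual climate response.
shared = [rng.normal(1.0, 0.3, 10) for _ in range(3)]
# Model-dependent responses: a large F flags the more general framework.
dependent = [rng.normal(mu, 0.3, 10) for mu in (0.0, 1.0, 2.0)]
```

Comparing the F statistic against its null distribution is the significance test the study uses to choose between the nested frameworks.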

Relevance: 100.00%

Abstract:

Sea ice friction models are necessary to predict the nature of interactions between sea ice floes. These interactions are of interest on a range of scales, for example, to predict loads on engineering structures in icy waters or to understand the basin-scale motion of sea ice. Many models use Amontons' friction law due to its simplicity. More advanced models allow for hydrodynamic lubrication and refreezing of asperities; however, modeling these processes leads to greatly increased complexity. In this paper we propose, by analogy with rock physics, that a rate- and state-dependent friction law allows us to incorporate memory (and thus the effects of lubrication and bonding) into ice friction models without a great increase in complexity. We support this proposal with experimental data on both the laboratory (∼0.1 m) and ice tank (∼1 m) scales. These experiments show that the effects of static contact under normal load can be incorporated into a friction model. We find the parameters for a first-order rate and state model to be A = 0.310, B = 0.382, and μ0 = 0.872. Such a model then allows us to make predictions about the nature of memory effects in moving ice-ice contacts.
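A first-order rate-and-state law of the standard Dieterich form, with the coefficients quoted above, can be sketched as follows. The reference velocity `V0` and critical slip distance `DC` are illustrative placeholders, not values from the paper.

```python
import math

# Sketch of a first-order rate-and-state (Dieterich-type) friction law with
# the quoted coefficients. V0 and DC below are illustrative placeholders.
MU0, A, B = 0.872, 0.310, 0.382
V0, DC = 1e-3, 1e-3          # reference velocity (m/s), slip distance (m)

def mu_transient(v, theta):
    # mu = mu0 + A ln(v/V0) + B ln(V0 * theta / DC); theta is the state
    # variable carrying memory of recent contact history
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def mu_steady(v):
    # At steady sliding, theta -> DC / v, so mu_ss = mu0 + (A - B) ln(v/V0)
    return MU0 + (A - B) * math.log(v / V0)

# Because B > A, steady-state friction falls as sliding speeds up
# (velocity weakening), the regime in which memory effects and
# stick-slip behaviour appear.
```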

Relevance: 100.00%

Abstract:

Market failure can be corrected using different regulatory approaches ranging from high to low intervention. Recently, classic regulations have been criticized as costly and economically irrational, and thus policy makers are giving more consideration to soft regulatory techniques such as information remedies. However, despite the plethora of food information conveyed by different media, there appears to be a lack of studies exploring how consumers evaluate this information and how trust towards publishers influences their choices for food information. In order to fill this gap, this study investigates which topics are most relevant to consumers, who should disseminate trustworthy food information, and how communication should be conveyed and segmented. Primary data were collected through both qualitative (in-depth interviews and focus groups) and quantitative research (web and mail surveys). Attitudes, willingness to pay for food information and trust towards public and private sources conveying information through a new food magazine were assessed using both multivariate statistical methods and econometric analysis. The study shows that consumer attitudes towards food information topics can be summarized along three cognitive-affective dimensions: the agro-food system, enjoyment and wellness. Information related to health risks caused by nutritional disorders and food safety issues caused by bacteria and chemical substances is the most important for about 90% of respondents. Food information related to regulations and traditions is also considered important by more than two thirds of respondents, while information about food production and processing techniques, lifestyle and food fads is considered less important by the majority of respondents. Trust towards food information disseminated by public bodies is higher than that observed for private bodies. This behavior directly affects willingness to pay (WTP) for food information provided by public and private publishers when markets are shocked by a food safety incident. WTP for the consumer association (€1.80) and the European Food Safety Authority (€1.30) is higher than WTP for the independent and food industry publishers, which cluster around zero euros. Furthermore, trust towards the type of publisher also plays a key role in food information market segmentation, together with socio-demographic and economic variables such as gender, age, presence of children and income. These findings invite policy makers to reflect on the possibility of using information remedies, conveyed through trusted sources to specific segments of consumers, as an interesting soft alternative to the classic way of regulating modern food markets.

Relevance: 100.00%

Abstract:

This paper identifies characteristics of knowledge-intensive processes and a method to improve their performance, based on an analysis of investment banking front office processes. The inability to improve these processes using standard process improvement techniques confirmed that much of the process was not codified and depended on tacit knowledge and skills. This led to the use of a semi-structured analysis of the characteristics of the processes, via a questionnaire, to identify characteristics of knowledge-intensive processes, adding to existing theory. Further work identified innovative process analysis and change techniques that could generate improvements, based on an analysis of their properties and the issue drivers. An improvement methodology was developed to harness a number of techniques that were found to be effective in resolving the issue drivers and improving these knowledge-intensive processes.

Relevance: 100.00%

Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools and their adoption is growing in popularity. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires the adoption of pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.

Relevance: 100.00%

Abstract:

A combination of structural, physical and computational techniques including powder X-ray and neutron diffraction, SQUID magnetometry, electrical and thermal transport measurements, DFT calculations and 119Sn Mössbauer and X-ray photoelectron spectroscopies has been applied to Co3Sn2-xInxS2 (0 ≤ x ≤ 2) in an effort to understand the relationship between metal-atom ordering and physical properties as the Fermi level is systematically varied. Whilst solid solution behavior is found throughout the composition region, powder neutron diffraction reveals that indium preferentially occupies an inter-layer site over an alternative kagome-like intra-layer site. DFT calculations indicate that this ordering, which leads to a lowering of energy, is related to the differing bonding properties of tin and indium. Spectroscopic data suggest that throughout the composition range 0 ≤ x ≤ 2, all elements adopt oxidation states that are significantly reduced from expectations based on formal charges. Chemical substitution enables the electrical transport properties to be controlled through tuning of the Fermi level within a region of the density of states which comprises narrow bands of predominantly Co d-character. This leads to a compositionally-induced double metal-to-semiconductor-to-metal transition. The marked increase in the Seebeck coefficient as the semiconducting region is approached leads to a substantial improvement in the thermoelectric figure of merit, ZT, which exhibits a maximum of ZT = 0.32 at 673 K. At 425 K, the figure of merit for phases in the region 0.8 ≤ x ≤ 0.85 is amongst the highest reported for sulphide phases, suggesting these materials may have applications in low-grade waste heat recovery.
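The figure of merit quoted above follows the standard definition ZT = S²σT/κ, combining the Seebeck coefficient, electrical conductivity, thermal conductivity and absolute temperature. A minimal sketch with placeholder property values, not measurements from the paper:

```python
# Standard thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa.
# The example property values are illustrative placeholders only.
def figure_of_merit(seebeck, sigma, kappa, temperature):
    """ZT from Seebeck coefficient (V/K), electrical conductivity (S/m),
    thermal conductivity (W m^-1 K^-1) and absolute temperature (K)."""
    return seebeck ** 2 * sigma * temperature / kappa

# e.g. S = 200 uV/K, sigma = 2e4 S/m, kappa = 1.1 W/(m K) at 673 K
zt = figure_of_merit(200e-6, 2e4, 1.1, 673)
```

The quadratic dependence on the Seebeck coefficient is why the rise in S near the semiconducting compositions translates into the substantial ZT improvement reported.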