66 results for "Lower and upper solutions"
Abstract:
Ethnopharmacological relevance: Studies on traditional Chinese medicine (TCM), like those of other systems of traditional medicine (TM), are very variable in their quality, content and focus, resulting in issues around their acceptability to the global scientific community. In an attempt to address these issues, a European Union-funded FP7 consortium, composed of both Chinese and European scientists and named “Good practice in traditional Chinese medicine” (GP-TCM), has devised a series of guidelines and technical notes to facilitate good practice in collecting, assessing and publishing TCM literature, as well as highlighting the scope of information that should be in future publications on TMs. This paper summarises these guidelines, together with what has been learned through GP-TCM collaborations, focusing on some common problems and proposing solutions. The recommendations also provide a template for the evaluation of other types of traditional medicine such as Ayurveda, Kampo and Unani. Materials and methods: GP-TCM provided a means by which experts in different areas relating to TCM were able to collaborate in forming a literature review good practice panel, which operated through e-mail exchanges, teleconferences and focused discussions at annual meetings. The panel involved coordinators and representatives of each GP-TCM work package (WP), with the latter managing the testing and refining of such guidelines within the context of their respective WPs and providing feedback. Results: A Good Practice Handbook for Scientific Publications on TCM was drafted during the three years of the consortium, showing the value of such networks. A “deliverable – central questions – labour division” model was established to guide the literature evaluation studies of each WP. The model investigated various scoring systems and their ability to provide consistent and reliable semi-quantitative assessments of the literature, notably in respect of the botanical ingredients involved and the scientific quality of the work described. This resulted in the compilation of (i) a robust scoring system and (ii) a set of minimum standards for publishing in the herbal medicines field, based on an analysis of the main problems identified in published TCM literature.
Abstract:
There is general agreement across the world that human-made climate change is a serious global problem, although there are still some sceptics who challenge this view. Research in organization studies on the topic is relatively new. Much of this research, however, is instrumental and managerialist in its focus on ‘win-win’ opportunities for business or its treatment of climate change as just another corporate social responsibility (CSR) exercise. In this paper, we suggest that climate change is not just an environmental problem requiring technical and managerial solutions; it is a political issue where a variety of organizations – state agencies, firms, industry associations, NGOs and multilateral organizations – engage in contestation as well as collaboration over the issue. We discuss the strategic, institutional and political economy dimensions of climate change and develop a socioeconomic regimes approach as a synthesis of these different theoretical perspectives. Given the urgency of the problem and the need for a rapid transition to a low-carbon economy, there is a pressing need for organization scholars to develop a better understanding of apathy and inertia in the face of the current crisis and to identify paths toward transformative change. The seven papers in this special issue address these areas of research and examine strategies, discourses, identities and practices in relation to climate change at multiple levels.
Abstract:
A cross-sectional analysis of ethnic differences in dietary intake, insulin sensitivity and beta-cell function, using the intravenous glucose tolerance test (IVGTT), was conducted on 497 healthy adult participants of the ‘Reading, Imperial, Surrey, Cambridge, and Kings’ (RISCK) study. Insulin sensitivity (Si) was significantly lower in African-Caribbean (AC) and South Asian (SA) participants [IVGTT-Si; AC: 2.13 vs SA: 2.25 vs white-European (WE): 2.84 ×10−4 mL µU−1 min−1, p < 0.001]. AC participants had a higher prevalence of anti-hypertensive therapy (AC: 19.7% vs SA: 7.5%), the most cardioprotective lipid profile [total:high-density lipoprotein (HDL) ratio; AC: 3.52 vs SA: 4.08 vs WE: 3.83, p = 0.03] and more pronounced hyperinsulinaemia [IVGTT acute insulin response (AIR); AC: 575 vs SA: 428 vs WE: 344 mL/µU/min, p = 0.002], specifically in female participants. Intake of saturated fat was lower, and intake of carbohydrate higher, in AC (10.9% and 50.4%) and SA (11.1% and 52.3%) participants than in WE participants (13.6% and 43.8%, p < 0.001). Insulin resistance in ACs is characterised by ‘normal’ lipid profiles but high rates of hypertension and pronounced hyperinsulinaemia.
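The result above is reported as between-group differences with p-values. A minimal, hypothetical sketch of that kind of cross-group comparison is given below; the per-participant Si values are random placeholders, not RISCK data, and the study's own statistical design will differ.

```python
# Hypothetical sketch: comparing insulin sensitivity (Si) across three groups.
# The per-participant values are invented placeholders, not RISCK data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
si_ac = rng.normal(2.13, 0.8, 60)   # African-Caribbean (placeholder values)
si_sa = rng.normal(2.25, 0.8, 70)   # South Asian (placeholder values)
si_we = rng.normal(2.84, 0.8, 300)  # white-European (placeholder values)

f_stat, p_value = stats.f_oneway(si_ac, si_sa, si_we)
print(f"group means: AC={si_ac.mean():.2f}, SA={si_sa.mean():.2f}, WE={si_we.mean():.2f}")
print(f"one-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")
```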
Abstract:
We examine to what degree we can expect to obtain accurate temperature trends for the last two decades near the surface and in the lower troposphere. We compare temperatures obtained from surface observations and radiosondes as well as satellite-based measurements from the Microwave Sounding Units (MSU), which have been adjusted for orbital decay and non-linear instrument-body effects, and reanalyses from the European Centre for Medium-Range Weather Forecasts (ERA) and the National Centers for Environmental Prediction (NCEP). In regions with abundant conventional data coverage, where the MSU has no major influence on the reanalysis, temperature anomalies obtained from microwave sounders, radiosondes and from both reanalyses agree reasonably well. Where coverage is insufficient, in particular over the tropical oceans, large differences are found between the MSU and either reanalysis. These differences apparently relate to changes in the satellite data availability and to differing satellite retrieval methodologies, to which both reanalyses are quite sensitive over the oceans. For NCEP, this results from the use of raw radiances directly incorporated into the analysis, which makes the reanalysis sensitive to changes in the underlying algorithms, e.g. those introduced in August 1992. For ERA, the bias correction of the one-dimensional variational analysis may introduce an error when the satellite relative to which the correction is calculated is itself biased, or when radiances change on a time scale longer than a couple of months, e.g. due to orbit decay. ERA inhomogeneities are apparent in April 1985, October/November 1986 and April 1989. These dates can be identified with the replacements of satellites. It is possible that a negative bias in the sea surface temperatures (SSTs) used in the reanalyses may have been introduced over the period of the satellite record. This could have resulted from a decrease in the number of ship measurements, a concomitant increase in the importance of satellite-derived SSTs, and a likely cold bias in the latter. Alternatively, a warm bias in SSTs could have been caused by an increase in the percentage of buoy measurements (relative to deeper ship intake measurements) in the tropical Pacific. No indications of uncorrected inhomogeneities in land surface temperatures could be found. Near-surface temperatures have biases in the boundary layer in both reanalyses, presumably due to the incorrect treatment of snow cover. The increase of near-surface relative to lower-tropospheric temperatures in the last two decades may be due to a combination of several factors, including high-latitude near-surface winter warming due to an enhanced NAO and upper-tropospheric cooling due to stratospheric ozone decrease.
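Dataset comparisons of this kind are usually made on temperature anomalies relative to a common base-period climatology. A minimal sketch of that standard step is shown below; the input array and base period are illustrative assumptions, not the data used in the study.

```python
# Hedged sketch: monthly anomalies relative to a base-period climatology,
# the standard preprocessing step behind such dataset intercomparisons.
import numpy as np

def monthly_anomalies(temps, years, base=(1979, 1998)):
    """temps: (n_years, 12) array of monthly mean temperatures."""
    mask = (years >= base[0]) & (years <= base[1])
    climatology = temps[mask].mean(axis=0)   # one normal per calendar month
    return temps - climatology               # anomalies, same shape as temps

years = np.arange(1979, 1999)                               # 20 synthetic years
temps = (15.0 + 10.0 * np.sin(np.linspace(0, 2 * np.pi, 12))
         + 0.3 * np.random.randn(20, 12))                   # synthetic record
print(monthly_anomalies(temps, years).shape)                # (20, 12)
```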
Abstract:
The aim of this work was to investigate the aggregation behavior of lipopeptides in single and mixed solutions over a wide range of concentrations, in order to optimize their separation and purification by a two-step ultrafiltration process using large-pore-size membranes (up to MWCO = 300 kDa). Micelle size was determined by dynamic light scattering. In single lipopeptide solutions, both surfactin and mycosubtilin formed micelles whose size depended on concentration, with average diameters of 5–105 nm for surfactin and 8–18 nm for mycosubtilin. However, when the two lipopeptides were present in the same solution, they formed mixed micelles of a different size (d = 8 nm), and probably a different conformation, from those formed by the individual lipopeptides, which prevents their separation according to size. These lipopeptides were purified from fermentation culture by the two-step ultrafiltration process using membranes with MWCOs ranging from 10 to 300 kDa. This led to their effective rejection in the first ultrafiltration step by membranes with MWCO = 10–100 kDa but poor rejection by the 300 kDa membrane. The lipopeptides were recovered at 90% purity (in relation to protein) and with a 2.34-fold enrichment in the permeate of the second ultrafiltration step with the 100 kDa membrane upon addition of 75% ethanol.
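As a worked illustration of the two quantities usually quoted for such ultrafiltration steps (not the authors' code), the observed rejection and the enrichment factor can be computed from feed and permeate data. The concentrations below are placeholders, and the ~38.5% starting purity is only a back-calculation from the quoted 90% purity and 2.34 enrichment, i.e. an assumption.

```python
# Illustrative sketch, not from the study: rejection and enrichment arithmetic.
def observed_rejection(c_feed, c_permeate):
    """R = 1 - Cp/Cf; R close to 1 means the solute is retained by the membrane."""
    return 1.0 - c_permeate / c_feed

def enrichment_factor(purity_after, purity_before):
    """Ratio of product purity after vs before the purification step."""
    return purity_after / purity_before

# placeholder feed/permeate concentrations (g/L) for a tight membrane
print(observed_rejection(c_feed=1.0, c_permeate=0.05))              # 0.95
# 0.385 is a back-calculated assumption, not a value reported in the abstract
print(enrichment_factor(purity_after=0.90, purity_before=0.385))    # ~2.34
```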
Abstract:
Our knowledge of stratospheric O3-N2O correlations is extended, and their potential for model-measurement comparison assessed, using data from the Atmospheric Chemistry Experiment (ACE) satellite and the Canadian Middle Atmosphere Model (CMAM). ACE provides the first comprehensive data set for the investigation of interhemispheric, interseasonal, and height-resolved differences of the O3-N2O correlation structure. By subsampling the CMAM data, the representativeness of the ACE data is evaluated. In the middle stratosphere, where the correlations are not compact and therefore mainly reflect the data sampling, joint probability density functions provide a detailed picture of key aspects of transport and mixing, but also trace polar ozone loss. CMAM captures these important features, but exhibits a displacement of the tropical pipe into the Southern Hemisphere (SH). Below about 21 km, the ACE data generally confirm the compactness of the correlations, although chemical ozone loss tends to destroy the compactness during late winter/spring, especially in the SH. This allows a quantitative comparison of the correlation slopes in the lower and lowermost stratosphere (LMS), which exhibit distinct seasonal cycles that reveal the different balances between diabatic descent and horizontal mixing in these two regions in the Northern Hemisphere (NH), reconciling differences found in aircraft measurements, and the strong role of chemical ozone loss in the SH. The seasonal cycles are qualitatively well reproduced by CMAM, although their amplitude is too weak in the NH LMS. The correlation slopes allow a "chemical" definition of the LMS, which is found to vary substantially in vertical extent with season.
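The correlation slope compared between ACE and CMAM is, in essence, a linear fit in tracer-tracer space. A minimal sketch with synthetic mixing ratios (not ACE or CMAM data) follows.

```python
# Minimal sketch of a tracer-tracer correlation slope; values are synthetic.
import numpy as np

n2o = np.linspace(200.0, 320.0, 100)                        # ppb, synthetic
o3 = 4.0 - 0.012 * n2o + np.random.normal(0.0, 0.05, 100)   # ppm, synthetic compact relation

slope, intercept = np.polyfit(n2o, o3, 1)                   # dO3/dN2O of the correlation
print(f"O3-N2O correlation slope: {slope:.4f} ppm/ppb")
```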
Abstract:
Simulations from eleven coupled chemistry-climate models (CCMs) employing nearly identical forcings have been used to project the evolution of stratospheric ozone throughout the 21st century. The model-to-model agreement in projected temperature trends is good, and all CCMs predict continued, global mean cooling of the stratosphere over the next 5 decades, increasing from around 0.25 K/decade at 50 hPa to around 1 K/decade at 1 hPa under the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A1B scenario. In general, the simulated ozone evolution is mainly determined by decreases in halogen concentrations and continued cooling of the global stratosphere due to increases in greenhouse gases (GHGs). Column ozone is projected to increase as stratospheric halogen concentrations return to 1980s levels. Because of ozone increases in the middle and upper stratosphere due to GHG-induced cooling, total ozone averaged over midlatitudes, outside the polar regions, and globally, is projected to return to 1980 values between 2035 and 2050, before lower stratospheric halogen amounts decrease to 1980 values. In the polar regions the CCMs simulate small temperature trends in the first and second half of the 21st century in midwinter. Differences in stratospheric inorganic chlorine (Cly) among the CCMs are key to diagnosing the intermodel differences in simulated ozone recovery, in particular in the Antarctic. It is found that there are substantial quantitative differences in the simulated Cly, with the October mean Antarctic Cly peak value varying from less than 2 ppb to over 3.5 ppb in the CCMs, and the date at which Cly returns to 1980 values varying from before 2030 to after 2050. There is a similar variation in the timing of recovery of Antarctic springtime column ozone back to 1980 values. As most models underestimate peak Cly near 2000, ozone recovery in the Antarctic could occur even later, between 2060 and 2070. In the Arctic the column ozone increase in spring does not follow halogen decreases as closely as in the Antarctic, reaching 1980 values before Arctic halogen amounts decrease to 1980 values and before the Antarctic. None of the CCMs predict future large decreases in the Arctic column ozone. By 2100, total column ozone is projected to be substantially above 1980 values in all regions except the tropics.
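Two of the diagnostics above, a cooling trend in K/decade and the date at which a quantity returns to its 1980 value, reduce to simple operations on a time series. The sketch below uses synthetic placeholder series, not CCM output.

```python
# Hedged sketch of two diagnostics: a linear trend and a return-to-1980 date.
import numpy as np

years = np.arange(1980, 2101)

# synthetic 50 hPa temperature with a -0.25 K/decade cooling trend
temp_50hpa = 210.0 - 0.025 * (years - 1980) + 0.1 * np.random.randn(years.size)
trend_per_decade = np.polyfit(years, temp_50hpa, 1)[0] * 10.0
print(f"50 hPa trend: {trend_per_decade:.2f} K/decade")

# synthetic Cly (ppb): rises to a peak near 2000, then decays
cly = np.where(years <= 2000,
               1.0 + 0.1 * (years - 1980),
               3.0 * np.exp(-(years - 2000) / 40.0))
after_peak = years > 2000
return_year = years[after_peak][np.argmax(cly[after_peak] <= cly[0])]
print(f"Cly returns to its 1980 value around {return_year}")
```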
Abstract:
During the cold period of the Last Glacial Maximum (LGM, about 21 000 years ago), atmospheric CO2 was around 190 ppm, much lower than the pre-industrial concentration of 280 ppm. The causes of this substantial drop remain partially unresolved, despite intense research. Understanding the origin of reduced atmospheric CO2 during glacial times is crucial to comprehend the evolution of the different carbon reservoirs within the Earth system (atmosphere, terrestrial biosphere and ocean). In this context, the ocean is believed to play a major role as it can store large amounts of carbon, especially in the abyss, a carbon reservoir that is thought to have expanded during glacial times. To create this larger reservoir, one possible mechanism is to produce very dense glacial waters, thereby stratifying the deep ocean and reducing the carbon exchange between the deep and upper ocean. The existence of such very dense waters has been inferred in the LGM deep Atlantic from sediment pore-water salinity and δ18O-inferred temperature. Based on these observations, we study the impact of a brine mechanism on the glacial carbon cycle. This mechanism relies on the formation and rapid sinking of brines, the very salty water released during sea ice formation, which brings salty dense water down to the bottom of the ocean. It provides two major features: a direct link from the surface to the deep ocean, along with an efficient way of setting up a strong stratification. We show with the CLIMBER-2 carbon-climate model that such a brine mechanism can account for a significant decrease in atmospheric CO2 and contribute to the glacial-interglacial change. This mechanism can be amplified by the low vertical diffusion resulting from the brine-induced stratification. The modeled glacial distribution of oceanic δ13C as well as the deep ocean salinity are substantially improved and agree better with reconstructions from sediment cores, suggesting that such a mechanism could have played an important role during glacial times.
Abstract:
This study investigates the differential impact that various dimensions of corporate social performance have on the pricing of corporate debt as well as the assessment of the credit quality of specific bond issues. The empirical analysis, based on an extensive longitudinal data set, suggests that, overall, good performance is rewarded and corporate social transgressions are penalized through lower and higher corporate bond yield spreads, respectively. Similar conclusions can be drawn when focusing on either the bond rating assigned to a specific debt issue or the probability of it being considered an asset of speculative grade.
Abstract:
Recent experimental evidence suggests a finer genetic, structural and functional subdivision of the layers which form a cortical column. The classical layer II/III (LII/III) of rodent neocortex integrates ascending sensory information with contextual cortical information for behavioral read-out. We systematically investigated to what extent regular-spiking supragranular pyramidal neurons, located at different depths within the cortex, show different input-output connectivity patterns. Combining glutamate uncaging with whole-cell recordings and biocytin filling, we revealed a novel cellular organization of LII/III: (i) “Lower LII/III” pyramidal cells receive a very strong excitatory input from lemniscal LIV and far fewer inputs from paralemniscal LVa. They project to all layers of the home column, including a feedback projection to LIV, whereas transcolumnar projections are relatively sparse. (ii) “Upper LII/III” pyramidal cells also receive their strongest input from LIV but, in addition, a very strong and dense excitatory input from LVa. They project extensively to LII/III as well as LVa and LVb of their home and neighboring columns. (iii) “Middle LII/III” pyramidal cells show an intermediate connectivity phenotype that stands in many ways in between the features described for lower versus upper LII/III. “Lower LII/III” intracolumnarly segregates and transcolumnarly integrates lemniscal information, whereas “upper LII/III” seems to integrate lemniscal with paralemniscal information. This suggests a fine-grained functional subdivision of the supragranular compartment, containing multiple circuits, without any obvious cytoarchitectonic or other structural or functional correlate of a laminar border in rodent barrel cortex.
Abstract:
Smart meters are becoming increasingly ubiquitous as governments aim to reduce the risks to the energy supply as the world moves toward a low-carbon economy. The data they provide could create a wealth of information to better understand customer behaviour. However, at the household level, and even at the low voltage (LV) substation level, energy demand is extremely volatile, irregular and noisy compared to the demand at the high voltage (HV) substation level. Novel analytical methods will be required in order to optimise the use of household-level data. In this paper we briefly outline some mathematical techniques which will play a key role in better understanding the customer's behaviour and in creating solutions for supporting the network at the LV substation level.
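One way to see why LV-level analytics differ from HV-level analytics is to compare a single household profile with an aggregate of many. The sketch below uses synthetic half-hourly demand; it is an illustrative assumption, not the paper's data or methods.

```python
# Illustrative sketch: household demand is spiky, the aggregate is much smoother.
import numpy as np

rng = np.random.default_rng(1)
half_hours, n_households = 48, 200
base = 0.5 + 0.3 * np.sin(np.linspace(0, 2 * np.pi, half_hours))   # kW daily shape, > 0

# spiky, gamma-distributed household demand whose mean follows the daily shape
households = rng.gamma(1.5, base / 1.5, size=(n_households, half_hours))

single = households[0]
feeder = households.mean(axis=0)        # pseudo LV-substation (aggregated) profile
print(f"single household coefficient of variation: {single.std() / single.mean():.2f}")
print(f"aggregated profile coefficient of variation: {feeder.std() / feeder.mean():.2f}")
```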
Abstract:
Within the SPARC Data Initiative, the first comprehensive assessment of the quality of 13 water vapor products from 11 limb-viewing satellite instruments (LIMS, SAGE II, UARS-MLS, HALOE, POAM III, SMR, SAGE III, MIPAS, SCIAMACHY, ACE-FTS, and Aura-MLS) obtained within the time period 1978-2010 has been performed. Each instrument's water vapor profile measurements were compiled into monthly zonal mean time series on a common latitude-pressure grid. These time series serve as the basis for the "climatological" validation approach used within the project. The evaluations include comparisons of monthly or annual zonal mean cross sections and seasonal cycles in the tropical and extratropical upper troposphere and lower stratosphere averaged over one or more years, comparisons of interannual variability, and a study of the time evolution of physical features in water vapor such as the tropical tape recorder and polar vortex dehydration. Our knowledge of the atmospheric mean state in water vapor is best in the lower and middle stratosphere of the tropics and midlatitudes, with a relative uncertainty of +/- 2-6% (as quantified by the standard deviation of the instruments' multiannual means). The uncertainty increases toward the polar regions (+/- 10-15%), the mesosphere (+/- 15%), and the upper troposphere/lower stratosphere below 100 hPa (+/- 30-50%), where sampling issues add uncertainty due to large gradients and high natural variability in water vapor. The minimum found in multiannual (1998-2008) mean water vapor in the tropical lower stratosphere is 3.5 ppmv (+/- 14%), with slightly larger uncertainties for monthly mean values. The frequently used HALOE water vapor data set shows consistently lower values than most other data sets throughout the atmosphere, with increasing deviations from the multi-instrument mean below 100 hPa in both the tropics and extratropics. The knowledge gained from these comparisons about the quality of the individual data sets in different regions of the atmosphere will help to improve model-measurement comparisons (e.g., for diagnostics such as the tropical tape recorder or seasonal cycles), data merging activities, and studies of climate variability.
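The uncertainty measure quoted above is the spread of the instruments' multiannual means. A minimal sketch of that calculation, with invented placeholder values for a single latitude-pressure grid point, follows.

```python
# Sketch of the spread metric: standard deviation of the instruments'
# multiannual means relative to the multi-instrument mean. Placeholder values.
import numpy as np

# hypothetical multiannual-mean water vapour (ppmv) from several instruments
instrument_means = np.array([4.1, 4.3, 3.9, 4.2, 4.0, 3.6])

mim = instrument_means.mean()                        # multi-instrument mean
rel_uncertainty = instrument_means.std(ddof=1) / mim
print(f"multi-instrument mean: {mim:.2f} ppmv, spread: {rel_uncertainty:.1%}")
```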
Abstract:
In most in vitro studies of oral drug permeability, little attempt is made to reproduce the gastrointestinal lumenal environment. The aim of this study was to evaluate the compatibility of simulated intestinal fluid (SIF) solutions with Caco-2 cell monolayers and Ussing chamber-mounted rat ileum under standard permeability experiment protocols. In preliminary experiments, fasted-state simulated intestinal fluid (FaSSIF) and fed-state simulated intestinal fluid (FeSSIF) solutions based on the dissolution medium formulae of Dressman and co-workers (1998) were modified for compatibility with Caco-2 cells to produce FaSSIF and FeSSIF "transport" solutions for use with in vitro permeability models. For Caco-2 cells exposed to FaSSIF and FeSSIF transport solutions, the transepithelial electrical resistance was maintained for over 4 h and mannitol permeability was equivalent to that in control [Hank's Balanced Salt Solution (HBSS)-treated] cell layers. Scanning electron microscopy revealed that microvilli generally maintained a normal distribution, although some shortening of microvilli and occasional small areas of denudation were observed. For rat ileum in the Ussing chambers, the potential difference (PD) collapsed to zero over 120 min when exposed to the FaSSIF transport solution, and an even faster collapse of the PD was observed when the FeSSIF transport solution was used. Electron micrographs revealed erosion of the villi tips and substantial denudation of the microvilli after exposure of ileal tissue to FaSSIF and FeSSIF solutions, and permeability to mannitol was increased almost two-fold. This study indicated that FaSSIF and FeSSIF transport solutions can be used with Caco-2 monolayers to evaluate drug permeability, but rat ileum in Ussing chambers is adversely affected by these solutions. Metoprolol permeability in Caco-2 experiments was reduced by 33% using the FaSSIF and by 75% using the FeSSIF transport solution compared to permeability measured using HBSS. This illustrates that using physiological solutions can influence permeability measurements.
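Permeability in such Caco-2 experiments is conventionally expressed as an apparent permeability coefficient, Papp = (dQ/dt)/(A·C0). The sketch below uses this conventional formula with placeholder numbers; it is not taken from the study itself.

```python
# Hedged sketch of the conventional apparent-permeability calculation.
def apparent_permeability(dq_dt, area_cm2, c0):
    """Papp [cm/s] = (dQ/dt) / (A * C0).

    dq_dt:    steady-state appearance rate in the receiver chamber [ug/s]
    area_cm2: monolayer surface area [cm^2]
    c0:       initial donor concentration [ug/mL]  (1 mL = 1 cm^3)
    """
    return dq_dt / (area_cm2 * c0)

# placeholder values; the 33% reduction mirrors the metoprolol FaSSIF result above
papp_hbss = apparent_permeability(dq_dt=2.0e-3, area_cm2=1.12, c0=100.0)
papp_fassif = papp_hbss * (1 - 0.33)
print(f"Papp(HBSS) = {papp_hbss:.2e} cm/s, Papp(FaSSIF) ~ {papp_fassif:.2e} cm/s")
```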
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs as well as operational and, ultimately, reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information to identify the risk and location of pending obsolescence and little money to apply to the solution. This paper presents a low-cost, structured method for identifying obsolete software and the risk of its obsolescence, in which the structure of a business and its supporting IT resources can be captured, modelled and analysed, and the risk that technology obsolescence poses to the business identified, enabling remedial action based on qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. However, the generic technique could be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
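As a purely hypothetical sketch of what a heatmap-style scoring pass over an enterprise architecture model might look like: the application names, fields and thresholds below are illustrative assumptions, not Capgemini's method.

```python
# Hypothetical heatmap scoring: likelihood of obsolescence x business impact.
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    support_ends_in_years: float    # time until vendor support is withdrawn
    business_criticality: int       # 1 (low) .. 5 (high)

def risk_score(app: Application) -> float:
    # sooner end-of-support -> higher obsolescence likelihood, clamped to [0, 1]
    likelihood = max(0.0, min(1.0, 1.0 - app.support_ends_in_years / 5.0))
    impact = app.business_criticality / 5.0
    return likelihood * impact

def heat_band(score: float) -> str:
    return "red" if score >= 0.6 else "amber" if score >= 0.3 else "green"

estate = [                               # hypothetical application portfolio
    Application("crime-recording", 0.5, 5),
    Application("custody-suite", 1.5, 4),
    Application("fleet-booking", 4.0, 2),
]
for app in sorted(estate, key=risk_score, reverse=True):
    s = risk_score(app)
    print(f"{app.name:16s} score={s:.2f} band={heat_band(s)}")
```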
Abstract:
Stratospheric water vapour is a powerful greenhouse gas. The longest available record, from balloon observations over Boulder, Colorado, USA, shows increases in stratospheric water vapour concentrations that cannot be fully explained by observed changes in the main drivers, tropical tropopause temperatures and methane. Satellite observations could help resolve the issue, but constructing a reliable long-term data record from individual short satellite records is challenging. Here we present an approach to merge satellite data sets with the help of a chemistry–climate model nudged to observed meteorology. We use the model's water vapour as a transfer function between data sets, which overcomes issues arising from instrument drift and short overlap periods. In the lower stratosphere, our water vapour record extends back to 1988 and water vapour concentrations largely follow tropical tropopause temperatures. Lower- and mid-stratospheric long-term trends are negative, and the trends from Boulder are shown not to be globally representative. In the upper stratosphere, our record extends back to 1986 and shows positive long-term trends. The altitudinal differences in the trends are explained by methane oxidation together with a strengthened lower-stratospheric and a weakened upper-stratospheric circulation inferred by this analysis. Our results call into question previous estimates of surface radiative forcing based on presumed global long-term increases in water vapour concentrations in the lower stratosphere.
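A minimal sketch of the merging idea: the nudged model serves as a transfer standard, so each satellite record can be debiased over its own period even without a long overlap. All series below are synthetic, and this is an illustration rather than the authors' code.

```python
# Hedged sketch: using a model time series as a transfer function between
# two satellite records with different constant biases. All data are synthetic.
import numpy as np

t = np.arange(240)                                            # months
truth = 4.0 + 0.001 * t + 0.2 * np.sin(2 * np.pi * t / 12)    # ppmv, synthetic
model = truth + 0.05 * np.random.randn(t.size)                # model nudged to meteorology
inst_a = truth[:120] + 0.30                                   # instrument A, biased high
inst_b = truth[120:] - 0.20                                   # instrument B, biased low

offset_a = (inst_a - model[:120]).mean()                      # A's offset from the model
offset_b = (inst_b - model[120:]).mean()                      # B's offset from the model
merged = np.concatenate([inst_a - offset_a, inst_b - offset_b])
print(f"removed offsets: A={offset_a:+.2f} ppmv, B={offset_b:+.2f} ppmv")
```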