944 results for Fish silage. subproducts. growing pig. serum parameters
Abstract:
Changing sodium intake from 70 to 200 mmol/day elevates blood pressure in normotensive volunteers by 6/4 mmHg. Older people, people with reduced renal function on a low sodium diet, and people with a family history of hypertension are more likely to show this effect. The rise in blood pressure was associated with a fall in plasma volume, suggesting that plasma volume changes do not initiate hypertension. In normotensive individuals the most common abnormality in membrane sodium transport induced by an extra sodium load was an increased permeability of the red cell to sodium. Some normotensive individuals also had an increased level of a plasma inhibitor of Na-K ATPase, and these individuals also appeared to have a rise in blood pressure. Sodium intake and blood pressure are related. The relationship differs in different people and is probably controlled by the genetically inherited capacity of systems involved in membrane sodium transport.
Abstract:
The effect of plasma taken from normotensive humans on low and high sodium diets on [Na + K]-ATPase activity and 3H-ouabain binding was measured in tubules from guinea-pig kidneys. Plasma from the high sodium diet period, compared to the low sodium period: (a) inhibited [Na + K]-ATPase activity; (b) decreased 3H-ouabain affinity for binding sites; (c) increased the number of available 3H-ouabain binding sites; (d) decreased [Na + K]-ATPase turnover (activity per 3H-ouabain binding site). The inhibition of [Na + K]-ATPase suggests an increase in a (possible) natriuretic factor. The decreased affinity of 3H-ouabain binding suggests an endogenous ouabainoid, which may be the natriuretic factor.
Abstract:
Glacial cycles during the Pleistocene reduced sea levels and created new land connections in northern Australia, where many currently isolated rivers also became connected via an extensive paleo-lake system, 'Lake Carpentaria'. However, the most recent period during which populations of freshwater species were connected by gene flow across Lake Carpentaria is debated: various 'Lake Carpentaria hypotheses' have been proposed. Here, we used a statistical phylogeographic approach to assess the timing of past population connectivity across the Carpentaria region in the obligate freshwater fish, Glossamia aprion. Results for this species indicate that the most recent period of genetic exchange across the Carpentaria region coincided with the mid- to late Pleistocene, a result shown previously for other freshwater and diadromous species. Based on these findings and published studies for various freshwater, diadromous and marine species, we propose a set of 'Lake Carpentaria' hypotheses to explain past population connectivity in aquatic species: (1) strictly freshwater species had widespread gene flow in the mid- to late Pleistocene before the last glacial maximum; (2) marine species were subdivided into eastern and western populations by land during Pleistocene glacial phases; and (3) past connectivity in diadromous species reflects the relative strength of their marine affinity.
Abstract:
A total histological grade does not necessarily distinguish between different manifestations of cartilage damage or degeneration. An accurate and reliable histological assessment method is required to separate normal and pathological tissue within a joint during treatment of degenerative joint conditions and to sub-classify the latter in meaningful ways. The Modified Mankin method may be adaptable for this purpose. We investigated how much detail may be lost by assigning one composite score/grade to represent different degenerative components of the osteoarthritic condition. We used four ovine injury models (sham surgery, anterior cruciate ligament/medial collateral ligament instability, simulated anatomic anterior cruciate ligament reconstruction and meniscal removal) to induce different degrees and potentially 'types' (mechanisms) of osteoarthritis. Articular cartilage was systematically harvested, prepared for histological examination and graded in a blinded fashion using a Modified Mankin grading method. Results showed that the possible permutations of cartilage damage were substantial and far more varied than current histological grading systems are intended to capture. Of 1352 cartilage specimens graded, 234 different manifestations of potential histological damage were observed across 23 potential individual grades of the Modified Mankin grading method. The results presented here show that current composite histological grading may contain additional information that could potentially discern different stages or mechanisms of cartilage damage and degeneration in a sheep model. This approach may be applicable to other grading systems.
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of a programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
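As a minimal sketch of the kind of commutator expression such a sensitivity analysis yields (the notation here is generic, and the paper's own error interrogation operators may be defined differently): for a single pulse-sequence element generated by an operator F with nominal parameter \theta, the density matrix evolves as \rho(\theta) = e^{-i\theta F}\,\rho_0\,e^{i\theta F}, so that

\[
\frac{\partial \rho}{\partial \theta} = -i\,[F, \rho(\theta)],
\qquad
\frac{\partial s}{\partial \theta} = \operatorname{Tr}\!\left\{ D\,\frac{\partial \rho}{\partial \theta} \right\} = -i\,\operatorname{Tr}\!\left\{ [D, F]\,\rho(\theta) \right\},
\]

where s = Tr{D\rho} is the detected signal and D the detection operator. A small mis-set \delta\theta then produces a first-order signal error \delta s \approx -i\,\delta\theta\,\operatorname{Tr}\{[D,F]\,\rho\}, obtained from the same simulation used for the ideal parameters, without finite differences.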
Abstract:
Vertical vegetation is vegetation growing on, or adjacent to, the unused sunlit exterior surfaces of buildings in cities. Vertical vegetation can improve the energy efficiency of the building on which it is installed mainly by insulating, shading and transpiring moisture from foliage and substrate. Several design parameters may affect the extent of the vertical vegetation's improvement of energy performance. Examples are choice of vegetation, growing medium geometry, north/south aspect and others. The purpose of this study is to quantitatively map out the contribution of several parameters to energy savings in a subtropical setting. The method is thermal simulation based on EnergyPlus, configured to reflect the special characteristics of vertical vegetation. Thermal simulation results show that yearly cooling energy savings can reach 25% with realistic design choices in subtropical environments. Heating energy savings are negligible. The most important parameter is the aspect of the walls covered by vegetation. Vertical vegetation covering walls facing north (south for the northern hemisphere) will result in the highest energy savings. In making plant selections, the most significant parameter is Leaf Area Index (LAI). Plants with larger LAI, preferably LAI > 4, contribute to greater savings, whereas vertical vegetation with LAI < 2 can actually consume energy. The choice of growing medium and its thickness influences both heating and cooling energy consumption. A change of growing medium thickness from 6 cm to 8 cm causes a dramatic increase in energy savings, from 2% to 18%. For cooling, it is best to use a growing material with high water retention, due to the importance of evapotranspiration for cooling. Similarly, for increased savings in cooling energy, sufficient irrigation is required. Insufficient irrigation results in the vertical vegetation requiring more energy to cool the building. To conclude, the choice of design parameters for vertical vegetation is crucial in making sure that it contributes to energy savings rather than energy consumption. Optimal design decisions can create a dramatic sustainability enhancement for the built environment in subtropical climates.
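A minimal sketch of how such a design-parameter sweep could be organised is shown below; the parameter names and values are illustrative only (loosely based on the factors discussed in the abstract), and the interface to EnergyPlus itself is not shown.

    # Illustrative sketch: enumerate design-parameter combinations for a
    # vertical-vegetation study; each case would be configured as a separate
    # EnergyPlus run (values below are assumptions, not the study's settings).
    from itertools import product

    aspects = ["north", "south", "east", "west"]   # wall orientation covered
    lai_values = [1, 2, 4, 6]                      # Leaf Area Index
    medium_thickness_cm = [6, 8]                   # growing-medium thickness
    irrigation = ["sufficient", "insufficient"]    # irrigation regime

    cases = [
        {"aspect": a, "LAI": lai, "thickness_cm": t, "irrigation": irr}
        for a, lai, t, irr in product(aspects, lai_values, medium_thickness_cm, irrigation)
    ]
    print(f"{len(cases)} simulation cases to configure in EnergyPlus")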
Abstract:
1. The phylogeography of freshwater taxa is often integrally linked with landscape changes such as drainage re-alignments that may present the only avenue for historical dispersal for these taxa. Classical models of gene flow do not account for landscape changes and so are of little use in predicting phylogeography in geologically young freshwater landscapes. When the history of drainage formation is unknown, phylogeographical predictions can be based on current freshwater landscape structure, on proposed historical drainage geomorphology, or on phylogeographical patterns of co-distributed taxa. 2. This study describes the population structure of a sedentary freshwater fish, the chevron snakehead (Channa striata), across two river drainages on the Indochinese Peninsula. The phylogeographical pattern recovered for C. striata was tested against seven hypotheses based on contemporary landscape structure, proposed history and phylogeographical patterns of co-distributed taxa. 3. Consistent with the species' ecology, analysis of mitochondrial and microsatellite loci revealed very high differentiation among all sampled sites. A strong signature of historical population subdivision was also revealed within the contemporary Mekong River Basin (MRB). Of the seven phylogeographical hypotheses tested, patterns of co-distributed taxa proved to be the most adequate for describing the phylogeography of C. striata. 4. Results shed new light on SE Asian drainage evolution, indicating that the Middle MRB probably evolved via amalgamation of at least three historically independent drainage sections, and in particular that the Mekong River section centred around the northern Khorat Plateau in NE Thailand was probably isolated from the greater Mekong for an extensive period of evolutionary time. In contrast, C. striata populations in the Lower MRB do not show a phylogeographical signature of evolution in historically isolated drainage lines, suggesting drainage amalgamation has been less important for river landscape formation in this region.
Abstract:
Problems involving the solution of advection-diffusion-reaction equations on domains and subdomains whose growth affects and is affected by these equations, commonly arise in developmental biology. Here, a mathematical framework for these situations, together with methods for obtaining spatio-temporal solutions and steady states of models built from this framework, is presented. The framework and methods are applied to a recently published model of epidermal skin substitutes. Despite the use of Eulerian schemes, excellent agreement is obtained between the numerical spatio-temporal, numerical steady state, and analytical solutions of the model.
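As a worked illustration of the class of problem described (a sketch in generic notation, not the paper's specific model): for a species concentration c(x,t) on a one-dimensional domain 0 < x < L(t) that grows with local velocity a(x,t), the advection-diffusion-reaction equation takes the form

\[
\frac{\partial c}{\partial t} + \frac{\partial}{\partial x}\bigl(a(x,t)\,c\bigr)
= D\,\frac{\partial^{2} c}{\partial x^{2}} + R(c),
\qquad
\frac{\mathrm{d}L}{\mathrm{d}t} = a\bigl(L(t),t\bigr),
\]

where the advection term accounts both for transport by the growing tissue and for dilution due to local volume expansion, R(c) is the reaction term, and boundary conditions are imposed on the moving boundary x = L(t).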
Abstract:
This article presents a case study of corporate dialogue with vulnerable others. Dialogue with marginalized external groups is increasingly presented in the business literature as the key to making corporate social responsibility possible in particular through corporate learning. Corporate public communications at the same time promote community engagement as a core aspect of corporate social responsibility. This article examines the possibilities for and conditions underpinning corporate dialogue with marginalized stakeholders as occurred around the unexpected and sudden closure in January 2009 of the AU$2.2 billion BHP Billiton Ravensthorpe Nickel mine in rural Western Australia. In doing so we draw on John Roberts’ notion of dialogue with vulnerable others, and apply a discourse analysis approach to data spanning corporate public communications and interviews with residents affected by the decision to close the mine. In presenting this case study we contribute to the as yet limited organizational research concerned directly with marginalized stakeholders and argue that corporate social responsibility discourse and vulnerable other dialogue not only affirms the primacy of business interests but also co-opts vulnerable others in the pursuit of these interests. In conclusion we consider case study implications for critical understandings of corporate dialogue with vulnerable others.
Abstract:
Background: Tenofovir has been associated with renal phosphate wasting, reduced bone mineral density, and higher parathyroid hormone levels. The aim of this study was to carry out a detailed comparison of the effects of tenofovir versus non-tenofovir use on calcium, phosphate, vitamin D, parathyroid hormone (PTH), and bone mineral density. Methods: A cohort study of 56 HIV-1-infected adults at a single centre in the UK on stable antiretroviral regimens, comparing biochemical and bone mineral density parameters between patients receiving either tenofovir or another nucleoside reverse transcriptase inhibitor. Principal Findings: In the unadjusted analysis, there was no significant difference between the two groups in PTH levels (tenofovir mean 5.9 pmol/L, 95% confidence interval 5.0 to 6.8, versus non-tenofovir 5.9, 4.9 to 6.9; p = 0.98). Patients on tenofovir had significantly reduced urinary calcium excretion (median 3.01 mmol/24 hours) compared to non-tenofovir users (4.56; p < 0.0001). Stratification of the analysis by age and ethnicity revealed that non-white men, but not women, on tenofovir had higher PTH levels than non-white men not on tenofovir (mean difference 3.1 pmol/L, 95% CI 0.9 to 5.3; p = 0.007). Those patients with optimal 25-hydroxyvitamin D (>75 nmol/L) on tenofovir had higher 1,25-dihydroxyvitamin D [1,25(OH)2D] (median 48 pg/mL versus 31; p = 0.012) and fractional excretion of phosphate (median 26.1% versus 14.6%; p = 0.025) and lower serum phosphate (median 0.79 mmol/L versus 1.02; p = 0.040) than those not taking tenofovir. Conclusions: The effects of tenofovir on PTH levels were modified by sex and ethnicity in this cohort. Vitamin D status also modified the effects of tenofovir on serum concentrations of 1,25(OH)2D and phosphate.
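For reference, the fractional excretion of phosphate reported above is conventionally calculated from paired urine and serum measurements (generic symbols, not the study's notation):

\[
\mathrm{FE}_{\mathrm{PO_4}} = \frac{U_{\mathrm{PO_4}} \times S_{\mathrm{Cr}}}{S_{\mathrm{PO_4}} \times U_{\mathrm{Cr}}} \times 100\%,
\]

where U and S denote urinary and serum concentrations of phosphate (PO4) and creatinine (Cr).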
Abstract:
Efficient management of domestic wastewater is a primary requirement for human well being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only as a temporary measure in the past, these systems are now considered the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. Generally the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi-chamber over single-chamber septic tanks are an issue that needs to be resolved in view of the conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for treatment of wastewater and the disinfection of effluent prior to disposal is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards and quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. The other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems. These include intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters, and particularly the use of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies. This is an issue that needs further investigation, and as such biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that the soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferable soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent. It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. However, despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern lies. Greywater, too, requires similar consideration. Surface irrigation of greywater is currently being permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is an issue of concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subjected to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration. The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions. As such their applicability is location specific. Also, the design of systems based solely on evapotranspiration is questionable. In order to ensure more reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process taking place and occurs due to bacterial growth or its by-products reducing the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process. This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short-duration rest periods are contradictory.
It has been claimed that the intermittent rest periods would result in the aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and the intermediate by-products that form as a result of aerobic processes would in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer, in the range of about six months. This entails the provision of a second and alternating seepage bed. The other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that would eventuate after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be only of short-term benefit. Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. It has also been found that the nature of the suspended solids is an important factor. The finer particles from extended aeration systems, when compared to those from septic tanks, will penetrate deeper into the soil and hence will ultimately form a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies, which have shown that effluent quality may be a factor in the case of highly permeable soils but may not be in the case of fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by the surface ponding of effluent or the seepage of contaminants into the groundwater, can be very serious, as they can lead to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic tank related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A, as no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore, based on subsurface conditions, this essentially entails a maximum allowable concentration of septic tanks in a given area.
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed or due to surface runoff of effluent as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important not only to ensure that the system design is based on subsurface conditions, but also to recognise that the density of these systems in a given area is a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site, which would also dictate the type of effluent disposal method to be adopted.
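A minimal sketch of how the infiltration limit imposed by a clogging mat, discussed above, is commonly expressed, assuming one-dimensional Darcy flow through the mat (the symbols are illustrative and not taken from the report): for a clogged zone of thickness Z_m and hydraulic conductivity K_m, with ponded effluent of depth H_p above the mat and soil-water pressure head h_s (usually negative) immediately beneath it,

\[
q = K_m\,\frac{H_p + Z_m - h_s}{Z_m}.
\]

Because K_m is far smaller than the conductivity of the surrounding soil, this flux, rather than the percolation-test conductivity, governs the long-term acceptance rate, and the soil beneath the mat remains unsaturated.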
Abstract:
Scope: We examined whether dietary supplementation with fish oil modulates inflammation, fibrosis and oxidative stress following obstructive renal injury. Methods and results: Three groups of Sprague-Dawley rats (n = 16 per group) were fed for 4 wk on normal rat chow (oleic acid), chow containing fish oil (33 g eicosapentaenoic acid and 26 g docosahexaenoic acid per kg diet), or chow containing safflower oil (60 g linoleic acid per kg diet). All diets contained 7% fat. After 4 wk, the rats were further subdivided into four smaller groups (n = 4 per group). Unilateral ureteral obstruction was induced in three groups (for 4, 7 and 14 days). The fourth group for each diet did not undergo surgery, and was sacrificed as controls at 14 days. When rats were sacrificed, plasma and portions of the kidneys were removed and frozen; other portions of kidney tissue were fixed and prepared for histology. Compared with normal chow and safflower oil, fish oil attenuated collagen deposition, macrophage infiltration, TGF-beta expression, apoptosis, and tissue levels of arachidonic acid, MIP-1 alpha, IL-1 beta, MCP-1 and leukotriene B(4). Compared with normal chow, fish oil increased the expression of HO-1 protein in kidney tissue. Conclusions: Fish oil intake reduced inflammation, fibrosis and oxidative stress following obstructive renal injury.
Abstract:
Objective: The expedited 10 g protein counter (EP-10) is a quick and valid clinical tool for dietary protein quantification. This study aims to assess the clinical effectiveness of the EP-10 in improving serum albumin and transferrin in chronic hemodialysis patients. Methods: Forty-five patients with low serum albumin (<38 g/L) were enrolled in this study. Parameters measured included dry weight, height, dietary intake, and levels of serum albumin, transferrin, potassium and phosphate, as well as kinetic modeling (Kt/V). The nutritional intervention incorporated the EP-10 in two ways: (1) to quantify the protein intake of patients and (2) to educate patients to meet their protein requirements. Mean values of the nutritional parameters before and after intervention were compared using a paired t-test. Results: Three months after nutritional intervention, mean albumin levels increased significantly from 32.2 ± 4.8 g/L to 37.0 ± 3.2 g/L (p < 0.001). Thirty-eight (84%) patients showed an increase in albumin levels while two (4%) maintained their levels. Of the thirty-six (80%) patients with low transferrin levels (<200 mg/dL), 28 (78%) had an increase and two maintained their levels post-intervention. Mean transferrin levels increased significantly from 169.4 ± 39.9 mg/dL to 180.9 ± 38.1 mg/dL (p < 0.05). Conclusion: Nutritional intervention incorporating the EP-10 method is able to make significant improvements to the albumin and transferrin levels of chronic hemodialysis patients.
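A minimal sketch of the pre/post comparison described above (the albumin values are hypothetical placeholders, not study data; only the use of a paired t-test is taken from the abstract):

    # Illustrative paired t-test on pre- and post-intervention serum albumin.
    # Values are hypothetical; the study compared means with a paired t-test.
    import numpy as np
    from scipy import stats

    pre_albumin = np.array([30.1, 33.5, 31.8, 34.0, 29.7])   # g/L, hypothetical
    post_albumin = np.array([36.2, 37.8, 35.9, 38.4, 34.5])  # g/L, hypothetical

    t_stat, p_value = stats.ttest_rel(post_albumin, pre_albumin)
    print(f"mean change = {(post_albumin - pre_albumin).mean():.1f} g/L, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")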