872 results for Intensive aquaculture
Abstract:
We evaluate the profitability and technical efficiency of aquaculture in the Philippines. Farm-level data are used to compare two production systems corresponding to the intensive monoculture of tilapia in freshwater ponds and the extensive polyculture of shrimps and fish in brackish water ponds. Both activities are very lucrative, with brackish water aquaculture achieving the higher level of profit per farm. Stochastic frontier production functions reveal that technical efficiency is low in brackish water aquaculture, with a mean of 53%, explained primarily by the operator's experience and by the frequency of his visits to the farm. In freshwater aquaculture, the farms achieve a mean efficiency level of 83%. The results suggest that the provision of extension services to brackish water fish farms might be a cost-effective way of increasing production and productivity in that sector. By contrast, technological change will have to be the driving force of future productivity growth in freshwater aquaculture.
Abstract:
The hydrodynamic characterization and performance evaluation of an aerobic three-phase fluidized bed reactor for treating fish-culture wastewater are presented in this report. The objective of this study was to evaluate the organic matter, nitrogen, and phosphorus removal efficiency of a physical and biological wastewater treatment system serving an intensive Nile tilapia laboratory production with recirculation. The treatment system comprised a conventional sedimentation basin operated at a hydraulic detention time (HDT) of 2.94 h and an aerobic three-phase airlift fluidized bed reactor (AAFBR) operated at an HDT of 11.9 min. Granular activated carbon was used as support media, with a density of 1.64 g/cm3, an effective size of 0.34 mm, and a constant concentration of 80 g/L. Mean removal efficiencies of BOD, COD, phosphorus, total ammonia nitrogen, and total nitrogen were 47%, 77%, 38%, 27%, and 24%, respectively. The evaluated system proved to be an effective alternative for water reuse in the recirculation system, capable of maintaining water quality within the values recommended for fish farming, and met the Brazilian standards for final effluent discharge with the exception of phosphorus.
Abstract:
Four 0.02-ha earthen ponds at the UNESP Aquaculture Center, Jaboticabal, São Paulo, Brazil, were stocked with newly metamorphosed Macrobrachium rosenbergii post-larvae at 1.5 animals/m2. After 8 mo, prawn density at harvest ranged from 0.3/m2 to 0.8/m2. Growth curves were determined for each population using von Bertalanffy growth functions. Asymptotic maximum length and asymptotic maximum weight increased as final population size decreased, indicating that a strong density effect on prawn growth occurs in semi-intensive culture, even when population density varies within a small range of less than 1 animal/m2.
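The growth model fitted above is the standard von Bertalanffy growth function, L(t) = L∞(1 − e^(−K(t − t0))). A minimal sketch follows; the parameter values are hypothetical illustrations, not the study's fitted estimates.

```python
import math

def von_bertalanffy(t, L_inf, K, t0=0.0):
    """Von Bertalanffy growth function: length at age t."""
    return L_inf * (1.0 - math.exp(-K * (t - t0)))

# Hypothetical parameters for illustration (not the study's fitted values):
# asymptotic length L_inf = 180 mm, growth coefficient K = 0.25 per month.
for t in (2, 4, 8):
    print(round(von_bertalanffy(t, L_inf=180.0, K=0.25), 1))
```

As final population density falls, fitted L∞ rises, which is the density effect the abstract describes.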
Abstract:
One of the key environmental concerns about shrimp farming is the discharge of waters with high levels of nutrients and suspended solids into adjacent waterways. In this paper we synthesize the results of our multidisciplinary research linking ecological processes in intensive shrimp ponds with their downstream impacts in tidal, mangrove-lined creeks. The incorporation of process measurements and bioindicators, in addition to water quality measurements, improved our understanding of the effect of shrimp farm discharges on the ecological health of the receiving water bodies. Changes in water quality parameters were an oversimplification of the ecological effects of water discharges, and the use of key measures, including primary production rates, phytoplankton responses to nutrients, community shifts in zooplankton, and δ15N ratios in marine plants, has the potential to provide more integrated and robust measures. Ultimately, reduction in nutrient discharges is most likely to ensure the future sustainability of the industry.
Abstract:
Carbon and nitrogen stable isotope values were determined in Pacific white shrimp (Litopenaeus vannamei) with the objective of discriminating animals produced through aquaculture practices from those extracted from the wild. Farmed animals were collected at semi-intensive shrimp farms in Mexico and Ecuador. Fisheries-derived shrimps were caught in different fishing areas representing two estuarine systems and four open-sea locations in Mexico and Ecuador. Carbon and nitrogen stable isotope values (δ13C relative to VPDB and δ15N relative to AIR) allowed clear differentiation of wild from farmed animals. δ13C and δ15N values in shrimps collected in the open sea were isotopically enriched (−16.99‰ and 11.57‰, respectively), indicating that these organisms belong to higher trophic levels than farmed animals. δ13C and δ15N values of farmed animals (−19.72‰ and 7.85‰, respectively) partially overlapped with values measured in animals collected in estuaries (−18.46‰ and 5.38‰, respectively). Canonical discriminant analysis showed that, when used separately and in conjunction, δ13C and δ15N values were powerful discriminatory variables and demonstrated the viability of isotopic evaluations to distinguish wild-caught shrimps from aquaculture shrimps. Methodological improvements will define a verification tool to support shrimp traceability protocols.
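The study used canonical discriminant analysis; as a much simpler illustration of the underlying idea, a one-dimensional nearest-mean classifier on δ15N can be sketched using the group means quoted in the abstract. This is a toy sketch, not the study's method.

```python
def nearest_mean_classifier(mean_farmed, mean_wild):
    """Return a 1-D classifier assigning a sample to the nearer group mean."""
    def classify(value):
        return "wild" if abs(value - mean_wild) < abs(value - mean_farmed) else "farmed"
    return classify

# Mean δ15N values quoted in the abstract: farmed 7.85‰, open-sea wild 11.57‰.
classify = nearest_mean_classifier(7.85, 11.57)
print(classify(11.0), classify(8.2))
```

A real canonical discriminant analysis would combine δ13C and δ15N into a weighted linear score, but the decision logic is the same in spirit.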
Abstract:
We report the case of a 67-year-old male patient admitted to the intensive care unit after coronary bypass surgery who presented with cardiogenic shock, acute renal failure, and three episodes of sepsis, the latter with pulmonary distress on the 30th post-operative day. The patient died within five days despite treatment with vancomycin, imipenem, colistimethate, and amphotericin B. At autopsy, severe adenovirus pneumonia was found. Viral pulmonary infections following cardiovascular surgery are uncommon. We highlight the importance of etiological diagnosis to a correct treatment approach.
Abstract:
In order to assess the prevalence of and risk factors for aminoglycoside-associated nephrotoxicity in intensive care units (ICUs), we evaluated 360 consecutive patients starting aminoglycoside therapy in an ICU. The patients had a baseline calculated glomerular filtration rate (cGFR) of ≥30 ml/min/1.73 m2. Among these patients, 209 (58 per cent) developed aminoglycoside-associated nephrotoxicity (the acute kidney injury [AKI] group, which consisted of individuals with a decrease in cGFR of >20 per cent from the baseline cGFR), while 151 did not (non-AKI group). Both groups had similar baseline cGFRs. The AKI group developed a lower cGFR nadir (45 ± 27 versus 79 ± 39 ml/min/1.73 m2 for the non-AKI group; P < 0.001); was older (56 ± 18 years versus 52 ± 19 years for the non-AKI group; P = 0.033); had a higher prevalence of diabetes (19.6 per cent versus 9.3 per cent for the non-AKI group; P = 0.007); was more frequently treated with other nephrotoxic drugs (51 per cent versus 38 per cent for the non-AKI group; P = 0.024); used iodinated contrast more frequently (18 per cent versus 8 per cent for the non-AKI group; P = 0.0054); and showed a higher prevalence of hypotension (63 per cent versus 44 per cent for the non-AKI group; P = 0.0003), shock (56 per cent versus 31 per cent for the non-AKI group; P < 0.0001), and jaundice (19 per cent versus 8 per cent for the non-AKI group; P = 0.0036). The mortality rate was 44.5 per cent for the AKI group and 29.1 per cent for the non-AKI group (P = 0.0031). A logistic regression model identified as significant (P < 0.05) the following independent factors that affected aminoglycoside-associated nephrotoxicity: a baseline cGFR of <60 ml/min/1.73 m2 (odds ratio [OR], 0.42), diabetes (OR, 2.13), treatment with other nephrotoxins (OR, 1.61) or iodinated contrast (OR, 2.13), and hypotension (OR, 1.83).
In conclusion, AKI was frequent among ICU patients receiving an aminoglycoside, and it was associated with a high rate of mortality. The presence of diabetes or hypotension and the use of other nephrotoxic drugs and iodinated contrast were independent risk factors for the development of aminoglycoside-associated nephrotoxicity.
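The odds ratios reported above come from a multivariable logistic regression. As a minimal sketch of the quantity being estimated, a crude, unadjusted odds ratio can be computed from a 2x2 table; the counts below are hypothetical illustrations, not the study's data.

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

# Hypothetical counts for illustration only (not the study's data):
# 30 of 50 exposed patients developed AKI vs 100 of 250 unexposed.
print(round(odds_ratio(30, 20, 100, 150), 2))
```

Unlike this crude ratio, the study's ORs are adjusted, i.e. each one is estimated while holding the other covariates in the model constant.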
Abstract:
The aim of this study was to develop the concept of the dignified death of children in Brazilian pediatric intensive care units (PICUs). The Hybrid Model for Concept Development was used to develop a conceptual structure of dignified death in PICUs in an attempt to define the concept. The fieldwork study was carried out by means of in-depth interviews with nine nurses and seven physicians working in PICUs. Not unexpectedly, the concept of dignified death was found to be a complex phenomenon involving aspects related to decisions made by the multidisciplinary team as well as those related to care of the child and the family. Knowledge of the concept's dimensions can promote reflection on the part of healthcare professionals regarding the values and beliefs underlying their conduct in end-of-life situations. Our hope is that this study may contribute to theoretical and methodological development in the area of end-of-life care.
Abstract:
Purpose Adverse drug events (ADEs) are harmful and occur with alarming frequency in critically ill patients. Complex pharmacotherapy with multiple medications increases the probability of a drug interaction (DI) and ADEs in patients in intensive care units (ICUs). The objective of the study was to determine the frequency of ADEs among patients in the ICU of a university hospital and the drugs implicated; factors associated with ADEs were also investigated. Methods This cross-sectional study investigated 299 medical records of patients hospitalized for 5 or more days in an ICU. ADEs were identified through the intensive monitoring adopted in hospital pharmacovigilance and also through ADE triggers. The causality of adverse drug reactions (ADRs) was classified using the Naranjo algorithm. Data were analyzed through descriptive analysis and through univariate and multiple logistic regression. Results The most frequent ADEs were type A ADRs, of possible causality and moderate severity. The most frequent ADR was drug-induced acute kidney injury. Patients with ADEs related to DIs corresponded to 7% of the sample. The multiple logistic regression showed that length of hospitalization (OR = 1.06) and administration of cardiovascular drugs (OR = 2.2) were associated with the occurrence of ADEs. Conclusion Adverse drug reactions of clinical significance were the most frequent ADEs in the ICU studied, which reduces patient safety. The number of ADEs related to drug interactions was small, suggesting that clinical manifestations of drug interactions that harm patients are not frequent in ICUs.
Abstract:
Objective To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting The study was developed in Brazil. Method Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard in the identification of drug-drug interactions. Main outcome Sensitivity, specificity, positive and negative predictive values. Results The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC showed the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams in the prevention and recognition of drug-drug interactions and optimize the safety and quality of care delivered in intensive care units.
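The four accuracy measures used in this evaluation follow directly from a 2x2 confusion table against the gold standard. A minimal sketch, with illustrative counts chosen to resemble (but not reproduce) the reported figures:

```python
def accuracy_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-table counts."""
    return {
        "sensitivity": tp / (tp + fn),   # flagged fraction of true interactions
        "specificity": tn / (tn + fp),   # ignored fraction of non-interactions
        "ppv": tp / (tp + fp),           # flagged pairs that truly interact
        "npv": tn / (tn + fn),           # ignored pairs that truly don't
    }

# Illustrative counts mirroring the study design: 100 clinically important
# interactions (positives) and 100 unimportant ones (negatives), for a
# hypothetical program that flags 88 true positives and 10 false positives.
m = accuracy_metrics(tp=88, fp=10, fn=12, tn=90)
print({k: round(v, 2) for k, v in m.items()})
```

Note that PPV and NPV depend on the 50/50 prevalence built into the test set; in clinical use, where true interactions are rarer, the predictive values would differ.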
Abstract:
The standards in this chapter focus on maximising the patient's ability to adhere to the treatment prescribed. Many people are extremely shocked when they are told they have TB; some refuse to accept it and others are relieved to find out what is wrong and that treatment is available. The reaction depends on many factors, including cultural beliefs and values, previous experience and knowledge of the disease. Even though TB is more common among vulnerable groups, it can affect anyone, and it is important for patients to be able to discuss their concerns in relation to their own individual context. The cure for TB relies on the patient receiving a full, uninterrupted course of treatment, which can only be achieved if the patient and the health service work together. A system needs to be in place to trace patients who miss their appointments for treatment (late patients). The best success will be achieved through the use of flexible, innovative and individualised approaches. The treatment and care the patient has received will inevitably have an impact on his or her willingness to attend in the future. A well-defined system of late patient tracing is mandatory in all situations. However, when the rates are high (above 10%), any tracing system will be useless without also examining the service as a whole.
Abstract:
BACKGROUND: Guidelines for red blood cell (RBC) transfusions exist; however, transfusion practices vary among centers. This study aimed to analyze transfusion practices and the impact of patient and institutional characteristics on the indications of RBC transfusions in preterm infants. STUDY DESIGN AND METHODS: RBC transfusion practices were investigated in a multicenter prospective cohort of preterm infants with a birth weight of less than 1500 g born at eight public university neonatal intensive care units of the Brazilian Network on Neonatal Research. Variables associated with any RBC transfusion were analyzed by logistic regression analysis. RESULTS: Of 952 very-low-birth-weight infants, 532 (55.9%) received at least one RBC transfusion. The percentages of transfused neonates were 48.9, 54.5, 56.0, 61.2, 56.3, 47.8, 75.4, and 44.7%, respectively, for Centers 1 through 8. The number of transfusions during the first 28 days of life was higher in Centers 4 and 7 than in other centers. After 28 days, the number of transfusions decreased, except for Center 7. Multivariate logistic regression analysis showed a higher likelihood of transfusion in infants with late-onset sepsis (odds ratio [OR], 2.8; 95% confidence interval [CI], 1.8-4.4), intraventricular hemorrhage (OR, 9.4; 95% CI, 3.3-26.8), intubation at birth (OR, 1.7; 95% CI, 1.0-2.8), need for umbilical catheter (OR, 2.4; 95% CI, 1.3-4.4), days on mechanical ventilation (OR, 1.1; 95% CI, 1.0-1.2), oxygen therapy (OR, 1.1; 95% CI, 1.0-1.1), parenteral nutrition (OR, 1.1; 95% CI, 1.0-1.1), and birth center (p < 0.001). CONCLUSIONS: The need for RBC transfusions in very-low-birth-weight preterm infants was associated with clinical conditions and birth center. The distribution of the number of transfusions during the hospital stay may be used as a measure of neonatal care quality.
Abstract:
The sustainability of fast-growing tropical Eucalyptus plantations is of concern in a context of rising fertilizer costs, since large amounts of nutrients are removed with biomass every 6-7 years from highly weathered soils. A better understanding of the dynamics of tree requirements is needed to match fertilization regimes to the availability of each nutrient in the soil. The nutrition of Eucalyptus plantations has been intensively investigated, and many studies have focused on specific fluxes in the biogeochemical cycles of nutrients. However, studies dealing with complete cycles are scarce for the Tropics. The objective of this paper was to compare these cycles for Eucalyptus plantations in Congo and Brazil, with contrasting climates, soil properties, and management practices. The main features were similar in the two situations. Most nutrient fluxes were driven by crown establishment in the first two years after planting and by total biomass production thereafter. These forests were characterized by huge nutrient requirements: 155, 10, 52, 55, and 23 kg/ha of N, P, K, Ca, and Mg, respectively, in the first year after planting at the Brazilian study site. High growth rates in the first months after planting were essential to take advantage of the large amounts of nutrients released into the soil solution by organic matter mineralization after harvesting. This study highlighted the predominant role of biological and biochemical cycles over the geochemical cycle of nutrients in tropical Eucalyptus plantations and indicated the prime importance of carefully managing organic matter in these soils. Limited nutrient losses through deep drainage after clear-cutting in the sandy soils of the two study sites showed the remarkable efficiency of Eucalyptus trees in keeping limited nutrient pools within the ecosystem, even after major disturbances.
Nutrient input-output budgets suggested that Eucalyptus plantations take advantage of soil fertility inherited from previous land uses and that long-term sustainability will require an increase in the inputs of certain nutrients.
Abstract:
Only 7% of the once extensive forest along the eastern coast of Brazil remains, and much of that is degraded and threatened by agricultural expansion and urbanization. We wondered if methods similar to those developed to establish fast-growing Eucalyptus plantations might also work to enhance survival and growth of rainforest species on degraded pastures composed of highly competitive C4 grasses. An 8-factor experiment was laid out to contrast the value of different intensities of cultivation, application of fertilizer, and weed control on the growth and survival of a mixture of 20 rainforest species planted at two densities: 3 m x 1 m and 3 m x 2 m. Intensive management increased seedling survival from 90% to 98%, increased stemwood production and leaf area index (LAI) ~4-fold, and increased stemwood production per unit of light absorbed by 30%. Annual growth in stem biomass was closely related to LAI alone (r² = 0.93, p < 0.0001), and the regression improved further in combination with canopy nitrogen content (r² = 0.99, p < 0.0001). Intensive management resulted in a nearly closed forest canopy in less than 4 years and offers a practical means to establish functional forests on abandoned agricultural land.
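The r² values above measure how much of the variation in stem growth is explained by LAI under simple linear regression. A minimal sketch of the computation follows; the (LAI, growth) pairs are hypothetical, for illustration only, not the study's measurements.

```python
def r_squared(xs, ys):
    """Coefficient of determination for simple linear regression of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)           # variance terms in x
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance terms
    syy = sum((y - my) ** 2 for y in ys)           # variance terms in y
    return sxy ** 2 / (sxx * syy)                  # squared Pearson correlation

# Hypothetical (LAI, annual stem biomass growth) pairs for illustration only:
data = [(0.5, 1.2), (1.0, 2.1), (2.0, 4.3), (3.0, 6.0), (4.0, 8.4)]
print(round(r_squared([x for x, _ in data], [y for _, y in data]), 3))
```

Extending this to the two-predictor case (LAI plus canopy nitrogen) requires multiple regression, which is how the reported improvement to r² = 0.99 would be obtained.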