898 results for Infant Mortality Rate


Relevance: 80.00%

Abstract:

OBJECTIVE Anorexia nervosa is associated with several serious medical complications related to malnutrition, severe weight loss, and low levels of micronutrients. The refeeding phase in these high-risk patients poses a further threat to health, with potentially fatal complications. The objective of this study was to examine complications due to refeeding of patients with anorexia nervosa, as well as their mortality rate, after the implementation of guidelines from the European Society of Clinical Nutrition and Metabolism. METHODS We analyzed retrospective, observational data of a consecutive, unselected anorexia nervosa cohort over a 5-y period. The sample consisted of 65 inpatients; 14 were admitted more than once within the study period, resulting in 86 analyzed cases. RESULTS Minor complications associated with refeeding during the first 10 d (replenishing phase) were recorded in nine cases (10.5%): four with transient pretibial edemas, three with organ dysfunction, and two with severe hypokalemia. During the observational phase of 30 d, 16 minor complications occurred in 14 cases (16.3%): six infectious and 10 non-infectious. None of the patients with anorexia nervosa died within a follow-up period of 3 mo. CONCLUSIONS Our data demonstrate that the seriousness and rate of complications during the replenishment phase in this high-risk population can be kept to a minimum. The findings indicate that evidence-based refeeding regimens, such as our guidelines, can reduce complications and prevent mortality. Although our sample was affected by serious comorbidities in addition to anorexia nervosa, no case met the full diagnostic criteria for refeeding syndrome.

Relevance: 80.00%

Abstract:

BACKGROUND Acute mesenteric ischemia (AMI) is an emergency with a mortality rate of up to 50 %. Detecting AMI continues to be a major challenge. This study assessed the correlation of repeated preoperative serum lactate measurements with bowel necrosis and sought to identify risk factors for a lethal outcome in patients with AMI. METHODS A retrospective study of 91 patients with clinically and pathologically confirmed AMI from January 2006 to December 2012 was performed. RESULTS The in-hospital mortality rate was 42.9 %. Two hundred nine preoperative lactate measurements were analyzed (2.3 ± 1.1 values per patient). Within six hours prior to surgery, the mean serum lactate level was significantly higher (4.97 ± 4.21 vs. 3.24 ± 3.05 mmol/L, p = 0.006) and the mean pH significantly lower (7.28 ± 0.12 vs. 7.37 ± 0.08, p = 0.001) compared to >6 h before surgery. Thirty-four patients had at least two lactate measurements within 24 h prior to surgery. In this subgroup, 17 patients (50 %) exhibited an increase and 17 patients (50 %) a decrease in lactate levels. Forward logistic regression analysis showed that length of necrotic bowel and the highest lactate value within 24 h prior to surgery were independent risk factors for mortality (r² = 0.329). CONCLUSION The value of serial lactate and pH measurements to predict the length of necrotic bowel is very limited. Length of necrotic bowel and lactate values are independent risk factors for mortality.

Relevance: 80.00%

Abstract:

BACKGROUND Intrauterine growth restriction (IUGR) occurs in up to 10% of pregnancies and is considered a major risk factor for various diseases in adulthood, such as cardiovascular disease, insulin resistance, hypertension or end-stage kidney disease. Several IUGR models, most of them rat or mouse models and nutritional models, have been developed in order to understand the biological processes linked to fetal growth retardation. In order to reproduce altered placental flow, surgical models have also been developed, among which bilateral uterine ligation has been frequently used. Nevertheless, this model has never been developed in the mouse, although murine tools offer multiple advantages for biological research. The aim of this work was therefore to develop a mouse model of bilateral uterine ligation as a surgical model of IUGR. RESULTS In this report, we describe the setup and experimental data obtained from three different protocols (P1, P2, P3) of bilateral uterine vessel ligation in the mouse. Ligation was performed either at the cervical end of each uterine horn (P1) or at the central part of each uterine horn (P2 and P3). Time of surgery was E16 (P1), E17 (P2) or E16.5 (P3). Mortality, maternal weight and abortion parameters were recorded, as well as placental weights, fetal resorption, viability, fetal weight and size. Results showed that P1 led to IUGR in test animals but was also accompanied by a high mortality rate of mothers (50%), low viability of fetuses (8%) and a high resorption rate (25%). P2 and P3 improved most of these parameters (decreased mortality and improved pregnancy outcomes; fetal viability improved to 90% and 27%, respectively); nevertheless, P2, in contrast to P3, was not associated with IUGR. Thus, the P3 protocol induces IUGR with pregnancy and fetal outcomes good enough to allow its use in experimental studies.
CONCLUSIONS Our results show that bilateral uterine artery ligation according to the protocol we have developed and validated can be used as a surgical mouse model of IUGR.

Relevance: 80.00%

Abstract:

BACKGROUND Pneumocystis jiroveci pneumonia (PCP) remains the most common opportunistic infection in patients infected with the human immunodeficiency virus (HIV). Among patients with HIV infection and PCP the mortality rate is 10% to 20% during the initial infection and this increases substantially with the need for mechanical ventilation. It has been suggested that corticosteroids adjunctive to standard treatment for PCP could prevent the need for mechanical ventilation and decrease mortality in these patients. OBJECTIVES To assess the effects of adjunctive corticosteroids on overall mortality and the need for mechanical ventilation in HIV-infected patients with PCP and substantial hypoxaemia (arterial oxygen partial pressure < 70 mmHg or alveolar-arterial gradient > 35 mmHg on room air). SEARCH METHODS For the original review we searched The Cochrane Library (2004, Issue 4), MEDLINE (January 1980 to December 2004) and EMBASE (January 1985 to December 2004) without language restrictions. We further reviewed the reference lists from previously published overviews, searched UpToDate version 2005 and Clinical Evidence Concise (Issue 12, 2004), contacted experts in the field and searched the reference lists of identified publications for citations of additional relevant articles. In this update of our review, we searched the above-mentioned databases in September 2010 and April 2014 for trials published since our original review. We also searched for ongoing trials in ClinicalTrials.gov and the World Health Organization International Clinical Trial Registry Platform (ICTRP). We searched for conference abstracts via AEGIS. SELECTION CRITERIA Randomised controlled trials that compared corticosteroids to placebo or usual care in HIV-infected patients with PCP in addition to baseline treatment with trimethoprim-sulfamethoxazole, pentamidine or dapsone-trimethoprim, and reported mortality data.
We excluded trials in patients with no or mild hypoxaemia (arterial oxygen partial pressure > 70 mmHg or an alveolar-arterial gradient < 35 mmHg on room air) and trials with a follow-up of less than 30 days. DATA COLLECTION AND ANALYSIS Two teams of review authors independently evaluated the methodology and extracted data from each primary study. We pooled treatment effects across studies and calculated a weighted average risk ratio of overall mortality in the treatment and control groups using a random-effects model. In this update of our review, we used the GRADE methodology to assess evidence quality. MAIN RESULTS Of 2029 screened records, we included seven studies in the review and six in the meta-analysis. Risk of bias varied: the randomisation and allocation process was often not clearly described, five of seven studies were double-blind and there was almost no missing data. The quality of the evidence for mortality was high. Risk ratios for overall mortality for adjunctive corticosteroids were 0.56 (95% confidence interval (CI) 0.32 to 0.98) at one month and 0.59 (95% CI 0.41 to 0.85) at three to four months of follow-up. In adults, to prevent one death, numbers needed to treat are nine patients in a setting without highly active antiretroviral therapy (HAART) available, and 23 patients with HAART available. The three largest trials provided moderate quality data on the need for mechanical ventilation, with a risk ratio of 0.38 (95% CI 0.20 to 0.73) in favour of adjunctive corticosteroids. One study was conducted in infants, suggesting a risk ratio for death in hospital of 0.81 (95% CI 0.51 to 1.29; moderate quality evidence). AUTHORS' CONCLUSIONS The number and size of trials investigating adjunctive corticosteroids for HIV-infected patients with PCP is small, but the evidence from this review suggests a beneficial effect for adult patients with substantial hypoxaemia.
There is insufficient evidence on the effect of adjunctive corticosteroids on survival in infants.
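The numbers needed to treat quoted above follow from the standard relationship NNT = 1 / ARR = 1 / (CER × (1 − RR)), where CER is the control-group event rate. A minimal sketch, using a hypothetical control mortality rate rather than the review's actual trial data:

```python
def nnt(control_event_rate, risk_ratio):
    """Number needed to treat: the reciprocal of the absolute risk
    reduction, derived here from a control event rate and a risk ratio."""
    absolute_risk_reduction = control_event_rate * (1 - risk_ratio)
    return 1 / absolute_risk_reduction

# Hypothetical control mortality of 25% combined with the one-month RR of 0.56:
print(round(nnt(0.25, 0.56)))  # -> 9
```

Note that the NNT depends on the baseline risk as well as the risk ratio, which is why the review reports different values for settings with and without HAART.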

Relevance: 80.00%

Abstract:

BACKGROUND Many orthopaedic surgical procedures can be performed with either regional or general anesthesia. We hypothesized that total hip arthroplasty with regional anesthesia is associated with less postoperative morbidity and mortality than total hip arthroplasty with general anesthesia. METHODS This retrospective propensity-matched cohort study utilizing the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database included patients who had undergone total hip arthroplasty from 2007 through 2011. After matching, logistic regression was used to determine the association between the type of anesthesia and deep surgical site infections, hospital length of stay, thirty-day mortality, and cardiovascular and pulmonary complications. RESULTS Of 12,929 surgical procedures, 5103 (39.5%) were performed with regional anesthesia. The adjusted odds for deep surgical site infections were significantly lower in the regional anesthesia group than in the general anesthesia group (odds ratio [OR] = 0.38; 95% confidence interval [CI] = 0.20 to 0.72; p < 0.01). The hospital length of stay (geometric mean) was decreased by 5% (95% CI = 3% to 7%; p < 0.001) with regional anesthesia, which translates to 0.17 days for each total hip arthroplasty. Regional anesthesia was also associated with a 27% decrease in the odds of prolonged hospitalization (OR = 0.73; 95% CI = 0.68 to 0.89; p < 0.001). The mortality rate was not significantly lower with regional anesthesia (OR = 0.78; 95% CI = 0.43 to 1.42; p > 0.05). The adjusted odds for cardiovascular complications (OR = 0.61; 95% CI = 0.44 to 0.85) and respiratory complications (OR = 0.51; 95% CI = 0.33 to 0.81) were both lower in the regional anesthesia group.
CONCLUSIONS Compared with general anesthesia, regional anesthesia for total hip arthroplasty was associated with a reduction in deep surgical site infection rates, hospital length of stay, and rates of postoperative cardiovascular and pulmonary complications. These findings could have an important medical and economic impact on health-care practice.
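The odds ratios above are adjusted estimates from logistic regression on the propensity-matched cohort. For illustration only, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table; the event counts below are hypothetical, not the NSQIP data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Wald confidence interval.
    a, b: events and non-events in the exposed group;
    c, d: events and non-events in the comparison group."""
    odds_ratio = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lower = math.exp(math.log(odds_ratio) - z * se)
    upper = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lower, upper

# Hypothetical deep-infection counts: 10 events among 5103 regional cases
# vs. 26 events among 7826 general cases
or_, lo, hi = odds_ratio_ci(10, 5093, 26, 7800)
```

An adjusted analysis, as in the study, would instead fit a logistic model with covariates (or match on them), so the unadjusted OR here should not be expected to match the published values.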

Relevance: 80.00%

Abstract:

INTRODUCTION Community-acquired pneumonia (CAP) is the most common infectious reason for admission to the Intensive Care Unit (ICU). The GenOSept study was designed to determine genetic influences on sepsis outcome. Phenotypic data were recorded using a robust clinical database, allowing a contemporary analysis of the clinical characteristics, microbiology, outcomes and independent risk factors in patients with severe CAP admitted to ICUs across Europe. METHODS Kaplan-Meier analysis was used to determine mortality rates. A Cox proportional hazards (PH) model was used to identify variables independently associated with 28-day and six-month mortality. RESULTS Data from 1166 patients admitted to 102 centres across 17 countries were extracted. Median age was 64 years; 62% were male. The mortality rate at 28 days was 17%, rising to 27% at six months. Streptococcus pneumoniae was the commonest organism isolated (28% of cases), with no organism identified in 36%. Independent risk factors associated with an increased risk of death at six months included APACHE II score (hazard ratio, HR, 1.03; confidence interval, CI, 1.01-1.05), bilateral pulmonary infiltrates (HR 1.44; CI 1.11-1.87) and ventilator support (HR 3.04; CI 1.64-5.62). Haematocrit, pH and urine volume on day one were all associated with a worse outcome. CONCLUSIONS The mortality rate in patients with severe CAP admitted to European ICUs was 27% at six months. Streptococcus pneumoniae was the commonest organism isolated. In many cases the infecting organism was not identified. Ventilator support, the presence of diffuse pulmonary infiltrates, and lower haematocrit, urine volume and pH on admission were independent predictors of a worse outcome.
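Kaplan-Meier mortality estimates of the kind reported here come from the product-limit method; a minimal pure-Python sketch of that estimator (illustrative, not the study's analysis code):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up time for each patient; events: 1 = death, 0 = censored.
    Returns (time, survival probability) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Group all observations sharing this time point
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= removed
    return curve

# Toy data: deaths at days 1, 2 and 3; one patient censored at day 2
print(kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1]))
```

At each event time the running survival estimate is multiplied by (1 − deaths/at-risk); censored patients leave the risk set without contributing an event, which is what distinguishes this from a naive cumulative mortality fraction.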

Relevance: 80.00%

Abstract:

INTRODUCTION Faecal peritonitis (FP) is a common cause of sepsis and admission to the intensive care unit (ICU). The Genetics of Sepsis and Septic Shock in Europe (GenOSept) project is investigating the influence of genetic variation on the host response and outcomes in a large cohort of patients with sepsis admitted to ICUs across Europe. Here we report an epidemiological survey of the subset of patients with FP. OBJECTIVES To define the clinical characteristics, outcomes and risk factors for mortality in patients with FP admitted to ICUs across Europe. METHODS Data was extracted from electronic case report forms. Phenotypic data was recorded using a detailed, quality-assured clinical database. The primary outcome measure was 6-month mortality. Patients were followed for 6 months. Kaplan-Meier analysis was used to determine mortality rates. Cox proportional hazards regression analysis was employed to identify independent risk factors for mortality. RESULTS Data for 977 FP patients admitted to 102 centres across 16 countries between 29 September 2005 and 5 January 2011 was extracted. The median age was 69.2 years (IQR 58.3-77.1), with a male preponderance (54.3%). The most common causes of FP were perforated diverticular disease (32.1%) and surgical anastomotic breakdown (31.1%). The ICU mortality rate at 28 days was 19.1%, increasing to 31.6% at 6 months. The cause of FP, pre-existing co-morbidities and time from estimated onset of symptoms to surgery did not impact on survival. The strongest independent risk factors associated with an increased rate of death at 6 months included age, higher APACHE II score, acute renal and cardiovascular dysfunction within 1 week of admission to ICU, hypothermia, lower haematocrit and bradycardia on day 1 of ICU stay. CONCLUSIONS In this large cohort of patients admitted to European ICUs with FP the 6 month mortality was 31.6%. 
The most consistent predictors of mortality across all time points were increased age, development of acute renal dysfunction during the first week of admission, lower haematocrit and hypothermia on day 1 of ICU admission.

Relevance: 80.00%

Abstract:

BACKGROUND Pulmonary hypertension (PH) frequently coexists with severe aortic stenosis, and PH severity has been shown to predict outcomes after transcatheter aortic valve implantation (TAVI). The effect of PH hemodynamic presentation on clinical outcomes after TAVI is unknown. METHODS AND RESULTS Of 606 consecutive patients undergoing TAVI, 433 (71.4%) patients with severe aortic stenosis and a preprocedural right heart catheterization were assessed. Patients were dichotomized according to whether PH was present (mean pulmonary artery pressure, ≥25 mm Hg; n=325) or not (n=108). Patients with PH were further dichotomized by left ventricular end-diastolic pressure into postcapillary (left ventricular end-diastolic pressure, >15 mm Hg; n=269) and precapillary groups (left ventricular end-diastolic pressure, ≤15 mm Hg; n=56). Finally, patients with postcapillary PH were divided into isolated (n=220) and combined (n=49) subgroups according to whether the diastolic pressure difference (diastolic pulmonary artery pressure-left ventricular end-diastolic pressure) was normal (<7 mm Hg) or elevated (≥7 mm Hg). Primary end point was mortality at 1 year. PH was present in 325 of 433 (75%) patients and was predominantly postcapillary (n=269/325; 82%). Compared with baseline, systolic pulmonary artery pressure immediately improved after TAVI in patients with postcapillary combined (57.8±14.1 versus 50.4±17.3 mm Hg; P=0.015) but not in those with precapillary (49.0±12.6 versus 51.6±14.3; P=0.36). When compared with no PH, a higher 1-year mortality rate was observed in both precapillary (hazard ratio, 2.30; 95% confidence interval, 1.02-5.22; P=0.046) and combined (hazard ratio, 3.15; 95% confidence interval, 1.43-6.93; P=0.004) but not isolated PH patients (P=0.11). After adjustment, combined PH remained a strong predictor of 1-year mortality after TAVI (hazard ratio, 3.28; P=0.005). 
CONCLUSIONS Invasive stratification of PH according to hemodynamic presentation predicts acute response to treatment and 1-year mortality after TAVI.
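The hemodynamic stratification described above reduces to a small decision rule on three invasive pressures; a sketch using the thresholds stated in the abstract (function and variable names are mine, not from the study):

```python
def classify_ph(mpap, lvedp, dpap):
    """Classify pulmonary hypertension by the abstract's thresholds.
    mpap:  mean pulmonary artery pressure (mm Hg)
    lvedp: left ventricular end-diastolic pressure (mm Hg)
    dpap:  diastolic pulmonary artery pressure (mm Hg)"""
    if mpap < 25:
        return "no PH"
    if lvedp <= 15:
        return "precapillary PH"
    # Postcapillary PH: split by the diastolic pressure difference (dPAP - LVEDP)
    if dpap - lvedp >= 7:
        return "combined postcapillary PH"
    return "isolated postcapillary PH"

print(classify_ph(30, 20, 28))  # -> combined postcapillary PH
```

The clinically relevant point of the study is that this stratification separates the groups with excess 1-year mortality (precapillary and combined) from the one without (isolated postcapillary).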

Relevance: 80.00%

Abstract:

INTRODUCTION The pentasaccharide fondaparinux is widely approved for prophylaxis and treatment of thromboembolic diseases and therapy of acute coronary syndrome. It is also used off-label in patients with acute, suspected or antecedent heparin-induced thrombocytopenia (HIT). The aim of this prospective observational cohort study was to document the prescription practice, tolerability and safety of fondaparinux therapy in a representative mixed German single-centre patient cohort. PATIENTS AND METHODS Between 09/2008 and 04/2009, 231 consecutive patients treated with fondaparinux were enrolled. Medical data were obtained from patients' records. The patients were clinically screened for thrombosis (Wells score), sequelae of HIT (4T's score), and bleeding complications (ISTH criteria) and subjected to further assessment (i.e. sonography, HIT diagnostics), if necessary. The mortality rate was assessed 30 days after the start of therapy. RESULTS Overall, 153/231 patients had a prophylactic, 74/231 a therapeutic, and 4/231 a successive prophylactic/therapeutic indication. In 11/231 patients fondaparinux was used due to suspected/antecedent HIT, in 5/231 due to a previous cutaneous delayed-type hypersensitivity to heparins. Other indications were rare. Three new/progressive thromboses were detected. No cases of HIT, major bleeding, or fatalities occurred. CONCLUSIONS Fondaparinux was well tolerated and safe in prophylaxis and therapy; prescriptions mostly followed the current approval guidelines and were rarely related to HIT-associated indications (<5% of prescriptions), in contrast to previous study results from the U.S. (>94% of prescriptions HIT-associated). A trend towards individualised fondaparinux use based on the compound's inherent properties and the patients' risk profiles (i.e. antecedent HIT, bone fractures, heparin allergy) was observed.

Relevance: 80.00%

Abstract:

BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. As no systematic review had compared the two regimens, this systematic review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF compared to antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimens and other supportive care. We included full-text, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions and outcomes was available. We excluded cross-over trials, quasi-randomised trials and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods.
We did the final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review, one with 40 breast cancer patients receiving high-dose chemotherapy and G-CSF compared to antibiotics, a second one evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias as high in the G-CSF trial, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate, because of the risk of performance bias (neither patients nor personnel were blinded), but low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference for infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, due to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in terms of median survival time in the trial comparing GM-CSF and antibiotics. Two-year survival times were 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence).
Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without any evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15; low quality of evidence). There were 5 episodes out of 360 cycles of grade IV infections in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities such as diarrhoea, stomatitis, infections, or neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombocytopenia occurred significantly more frequently in the GM-CSF arm (60.8%) compared to the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Neither infection-related mortality, incidence of febrile neutropenia, nor quality of life were reported in this trial. AUTHORS' CONCLUSIONS As we only found two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
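Single-study risk ratios like those quoted above can be reproduced from 2×2 counts with the log method (the review's meta-analysis additionally pools studies with a random-effects model, which this sketch does not attempt); the counts below are hypothetical:

```python
import math

def risk_ratio_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Risk ratio for a single study with a Wald 95% CI on the log scale.
    events_t/n_t: events and group size in the treatment arm;
    events_c/n_c: events and group size in the control arm."""
    rr = (events_t / n_t) / (events_c / n_c)
    # Standard error of log(RR) for a single 2x2 table
    se = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical: 5/100 events with prophylaxis vs 10/100 with control
rr, lo, hi = risk_ratio_ci(5, 100, 10, 100)
```

A confidence interval that spans 1 (as most of those reported above do) corresponds to "no evidence of a difference" in the review's wording.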

Relevance: 80.00%

Abstract:

Pleural infection is a frequent clinical condition. Prompt treatment has been shown to reduce hospital costs, morbidity and mortality. Recent advances in treatment have been variably implemented in clinical practice. This statement reviews the latest developments and concepts to improve clinical management and stimulate further research. The European Association for Cardio-Thoracic Surgery (EACTS) Thoracic Domain and the EACTS Pleural Diseases Working Group established a team of thoracic surgeons to produce a comprehensive review of available scientific evidence, with the aim of covering all aspects of surgical practice related to the treatment of pleural infection, in particular focusing on: surgical treatment of empyema in adults; surgical treatment of empyema in children; and surgical treatment of post-pneumonectomy empyema (PPE). In the management of Stage 1 empyema, prompt pleural space chest tube drainage is required. In patients with Stage 2 or 3 empyema who are fit enough to undergo an operative procedure, there is a demonstrated benefit of surgical debridement or decortication [possibly by video-assisted thoracoscopic surgery (VATS)] over tube thoracostomy alone in terms of treatment success and reduction in hospital stay. In children, a primary operative approach is an effective management strategy, associated with a lower mortality rate and a reduction of tube thoracostomy duration, length of antibiotic therapy, reintervention rate and hospital stay. Intrapleural fibrinolytic therapy is a reasonable alternative to primary operative management. Uncomplicated PPE [without bronchopleural fistula (BPF)] can be effectively managed with minimally invasive techniques, including fenestration, pleural space irrigation and VATS debridement. PPE associated with BPF can be effectively managed with individualized open surgical techniques, including direct repair, myoplastic and thoracoplastic techniques.
Intrathoracic vacuum-assisted closure may be considered as an adjunct to the standard treatment. The current literature cements the role of VATS in the management of pleural empyema, even if the choice of surgical approach relies on the individual surgeon's preference.

Relevance: 80.00%

Abstract:

We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so they can be communicated to, and managed by, the slaughter industry and veterinary services. During meat inspection, there were three important predictors of the risk of WCC: the slaughtered animal's sex, its age, and the size of the slaughterhouse it was processed in. WCC for injuries and significant weight loss (visible welfare indicators) were almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with the fattening of cattle end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouses. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights the fact that the risk factors for WCC are as complex as the production system itself, interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing feedback to farmers on condemnation reasons and on their performance compared to the national/regional average could be a first step towards improving herd management and financial returns for producers.

Relevance: 80.00%

Abstract:

Intraspecific and interspecific architectural patterns were studied for eight tree species of a Bornean rain forest. Trees 5-19 m tall in two 4-ha permanent sample plots in primary forest were selected, and three light descriptors and seven architectural traits for each tree were measured. Two general predictions were made: (1) Slow growing individuals (or short ones) encounter lower light, and have flatter crowns, fewer leaf layers, and thinner stems, than do fast growing individuals (or tall ones). (2) Species with higher shade-tolerance receive less light and have flatter crowns, fewer leaf layers, and thinner stems, than do species with lower shade-tolerance. Shade-tolerance is assumed to decrease with maximum growth rate, mortality rate, and adult stature of a species.

Relevance: 80.00%

Abstract:

Pregnant BALB/c mice have been widely used as an in vivo model to study Neospora caninum infection biology and to provide proof-of-concept for assessments of drugs and vaccines against neosporosis. The fact that this model has been used with different isolates of variable virulence, varying infection routes and differing methods to prepare the parasites for infection has rendered the comparison of results from different laboratories impossible. In most studies, mice were infected with a number of parasites (2 × 10⁶) similar to that employed in ruminant models (10⁷ for cows and 10⁶ for sheep), which seems inappropriate considering the enormous differences in the weight of these species. Thus, for achieving meaningful results in vaccination and drug efficacy experiments, refinement and standardization of this experimental model are necessary. Accordingly, 2 × 10⁶, 10⁵, 10⁴, 10³ and 10² tachyzoites of the highly virulent and well-characterised Nc-Spain7 isolate were subcutaneously inoculated into mice at day 7 of pregnancy, and clinical outcome, vertical transmission, parasite burden and antibody responses were compared. Dams from all infected groups presented nervous signs, and the percentage of surviving pups at day 30 postpartum was surprisingly low (24%) even in mice infected with only 10² tachyzoites. Importantly, infection with 10⁵ tachyzoites resulted in antibody levels, cerebral parasite burden in dams and a 100% mortality rate in pups identical to those observed after infection with 2 × 10⁶ tachyzoites. Considering these results, it is reasonable to lower the challenge dose to 10⁵ tachyzoites in further experiments when assessing drugs or vaccine candidates.

Relevance: 80.00%

Abstract:

Larval development time is a critical factor in assessing the potential for larval transport, mortality and, subsequently, the connectivity of marine populations through larval exchange. Most estimates of larval duration are based on laboratory studies and may not reflect development times in nature. For larvae of the American lobster (Homarus americanus), temperature-dependent development times have been established in previous laboratory studies. Here, we used the timing of seasonal abundance curves for newly hatched larvae (stage 1) and the final planktonic instar (postlarva), coupled with a model of temperature-dependent development, to assess development time in the field. We were unable to reproduce the timing of the seasonal abundance curves using laboratory development rates in our model. Our results suggest that larval development in situ may be twice as fast as reported laboratory rates. This implies reduced estimates of larval transport potential, and increased estimates of instantaneous mortality rate and production.