24 results for Argentina's farm row
Abstract:
RATIONALE AND OBJECTIVES: To evaluate the effect of automatic tube current modulation on radiation dose and image quality for low tube voltage computed tomography (CT) angiography. MATERIALS AND METHODS: An anthropomorphic phantom was scanned with a 64-section CT scanner using the following tube voltages: 140 kVp (Protocol A), 120 kVp (Protocol B), 100 kVp (Protocol C), and 80 kVp (Protocol D). To achieve similar noise, combined z-axis and xy-axes automatic tube current modulation was applied. Effective dose (ED) for the four tube voltages was assessed. Three plastic vials filled with different concentrations of iodinated solution were placed on the phantom's abdomen to obtain attenuation measurements. The signal-to-noise ratio (SNR) was calculated, and a figure of merit (FOM) for each iodinated solution was computed as SNR²/ED. RESULTS: The ED was kept similar for the four tube voltages: (A) 5.4 mSv ± 0.3, (B) 4.1 mSv ± 0.6, (C) 3.9 mSv ± 0.5, and (D) 4.2 mSv ± 0.3 (P > .05). As the tube voltage decreased from 140 to 80 kVp, image noise was maintained (range, 13.8-14.9 HU) (P > .05). SNR increased as the tube voltage decreased, with an overall gain of 119% for the 80-kVp protocol compared to the 140-kVp protocol (P < .05). The FOM results indicated that with a reduction of the tube voltage from 140 to 120, 100, and 80 kVp, at constant SNR, ED was reduced by a factor of 2.1, 3.3, and 5.1, respectively (P < .001). CONCLUSIONS: As tube voltage decreases, automatic tube current modulation for CT angiography yields either a significant increase in image quality at constant radiation dose or a significant decrease in radiation dose at constant image quality.
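To make the dose-efficiency metric concrete, here is a minimal sketch (ours, not part of the original abstract) of the figure-of-merit calculation FOM = SNR²/ED. The effective dose values are the protocol means reported above; the SNR values are hypothetical placeholders, since per-protocol SNR is not given numerically in the abstract.

```python
# Figure of merit as defined in the abstract: FOM = SNR^2 / ED.
# ED values (mSv) are the reported protocol means; SNR values are hypothetical.

def figure_of_merit(snr, effective_dose_msv):
    """Return SNR^2 / ED, i.e. image quality obtained per unit of radiation dose."""
    return snr ** 2 / effective_dose_msv

effective_dose = {"A_140kVp": 5.4, "B_120kVp": 4.1, "C_100kVp": 3.9, "D_80kVp": 4.2}  # mSv (reported)
snr = {"A_140kVp": 10.0, "B_120kVp": 13.0, "C_100kVp": 17.0, "D_80kVp": 21.9}         # hypothetical; 80 kVp ~119% above 140 kVp

for protocol, ed in effective_dose.items():
    print(f"{protocol}: FOM = {figure_of_merit(snr[protocol], ed):.1f}")
```

At constant SNR, the ratio between two protocols' FOM values corresponds to the dose-reduction factor reported in the abstract.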
Abstract:
PURPOSE: To determine if multi–detector row computed tomography (CT) can replace conventional radiography and be performed alone in severe trauma patients for the depiction of thoracolumbar spine fractures. MATERIALS AND METHODS: One hundred consecutive severe trauma patients who underwent conventional radiography of the thoracolumbar spine as well as thoracoabdominal multi–detector row CT were prospectively identified. Conventional radiographs were reviewed independently by three radiologists and two orthopedic surgeons; CT images were reviewed by three radiologists. Reviewers were blinded both to one another's reviews and to the results of the initial evaluation. Presence, location, and stability of fractures, as well as the quality of the reviewed images, were assessed. Statistical analysis was performed to determine sensitivity and interobserver agreement for each procedure, with the results of clinical and radiologic follow-up as the standard of reference. The time needed to perform each examination and the radiation dose involved were evaluated. A resource cost analysis was performed. RESULTS: Sixty-seven fractured vertebrae were diagnosed in 26 patients. Twelve patients had unstable spine fractures. Mean sensitivity and interobserver agreement, respectively, for the detection of unstable fractures were 97.2% and 0.951 for multi–detector row CT and 33.3% and 0.368 for conventional radiography. The median times to perform a conventional radiographic and a multi–detector row CT examination, respectively, were 33 and 40 minutes. Effective radiation doses at conventional radiography of the spine and thoracoabdominal multi–detector row CT, respectively, were 6.36 mSv and 19.42 mSv. Multi–detector row CT enabled identification of 146 associated traumatic lesions. The costs of conventional radiography and multi–detector row CT, respectively, were $145 and $880 per patient. CONCLUSION: Multi–detector row CT depicts spine fractures better than conventional radiography does. It can replace conventional radiography and be performed alone in patients who have sustained severe trauma.
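As an illustration of the two summary metrics reported above, the sketch below computes sensitivity and a pairwise interobserver agreement (Cohen's kappa) from a 2x2 table of reader calls. The counts are hypothetical, and the study itself aggregates over several readers, so this is only a stand-in for whatever multireader agreement statistic was actually used.

```python
# Sensitivity and a pairwise Cohen's kappa from hypothetical reader results.
# The study reports mean sensitivity and agreement across several readers;
# this is a minimal two-reader stand-in for those calculations.

def sensitivity(true_positives, false_negatives):
    """Proportion of truly fractured (or unstable) cases that a reader detected."""
    return true_positives / (true_positives + false_negatives)

def cohens_kappa(both_pos, a_pos_b_neg, a_neg_b_pos, both_neg):
    """Chance-corrected agreement between two readers from a 2x2 table of their calls."""
    n = both_pos + a_pos_b_neg + a_neg_b_pos + both_neg
    observed = (both_pos + both_neg) / n
    p_a = (both_pos + a_pos_b_neg) / n
    p_b = (both_pos + a_neg_b_pos) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

print(f"sensitivity = {sensitivity(true_positives=35, false_negatives=1):.3f}")  # hypothetical counts
print(f"kappa       = {cohens_kappa(35, 1, 2, 62):.3f}")                         # hypothetical 2x2 table
```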
Abstract:
Recently, the French National Institute for Agricultural Research appointed an expert committee to review the issue of pain in food-producing farm animals. To minimise pain, the authors developed a '3S' approach standing for 'Suppress, Substitute and Soothe', by analogy with the '3Rs' approach of 'Reduction, Refinement and Replacement' applied in the context of animal experimentation. Thus, when addressing the matter of pain, the following steps and solutions could be assessed in the light of their feasibility (technical constraints, logistics and regulations), acceptability (societal and financial aspects) and availability. The first solution is to suppress any source of pain that brings no obvious advantage to the animals or the producers, as well as sources of pain whose potential benefits are largely exceeded by their negative effects. For instance, tail docking of cattle has recently been eliminated. Genetic selection on the basis of resistance criteria (e.g. for lameness in cattle and poultry) or reduction of undesirable traits (e.g. boar taint in pigs) may also reduce painful conditions or procedures. The second solution is to substitute a less painful method for a technique that causes pain. For example, if dehorning cattle is unavoidable, it is preferable to perform it at a very young age by cauterising the horn bud. Animal management and restraint systems should be designed to reduce the risk of injury and bruising. Lastly, in situations where pain is known to be present, because of animal management procedures such as dehorning or castration, or because of pathology, for example lameness, systemic or local pharmacological treatments should be used to soothe the pain. These treatments should take into account the duration of pain, which, in the case of some management procedures or diseases, may persist for long periods. The administration of pain medication may require the intervention of veterinarians, but exemptions exist where breeders are allowed to use local anaesthesia (e.g. for castration and dehorning in Switzerland). Extension of such exemptions, national or European legislation on pain management, or the introduction of animal welfare codes by retailers for their meat products may help further developments. In addition, veterinarians and farmers should be given the necessary tools and information to take animal pain into account in their management decisions.
Abstract:
The aim was to study the variation in metabolic responses in early-lactating dairy cows (n = 232) on-farm that were pre-selected for a high milk fat content (>45 g/l) and a high fat/protein ratio in milk (>1.5) in their previous lactation. Blood was assayed for concentrations of metabolites and hormones. Liver tissue was assayed for the mRNA abundance of 25 candidate genes encoding enzymes and receptors involved in gluconeogenesis (6), fatty acid β-oxidation (6), fatty acid and triglyceride synthesis (5), cholesterol synthesis (4), ketogenesis (2) and the urea cycle (2). Two groups of cows were formed based on the plasma concentrations of glucose, non-esterified fatty acids (NEFA) and β-hydroxybutyric acid (BHBA) (GRP+, high metabolic load: glucose < 3.0 mmol/l, NEFA > 300 μmol/l and BHBA > 1.0 mmol/l, n = 30; GRP-, low metabolic load: glucose > 3.0 mmol/l, NEFA < 300 μmol/l and BHBA < 1.0 mmol/l, n = 30). No differences were found between GRP+ and GRP- in milk yield at 3 weeks post-partum, but milk fat content was higher (p < 0.01) in GRP+ than in GRP-. In week 8 post-partum, milk yield was higher in GRP+ than in GRP- (37.5 vs. 32.5 kg/d; p < 0.01). GRP+ had higher (p < 0.001) NEFA and BHBA concentrations and lower glucose, insulin, IGF-I, T3 and T4 concentrations (p < 0.01) than GRP-. The mRNA abundance of genes related to gluconeogenesis, fatty acid β-oxidation, fatty acid and triglyceride synthesis, cholesterol synthesis and the urea cycle differed between GRP+ and GRP- (p < 0.05), whereas gene transcripts related to ketogenesis were similar between the groups. In conclusion, a high metabolic load post-partum in dairy cows on-farm corresponds to differences in hepatic gene expression relative to cows with a low metabolic load, even though all cows were pre-selected for a high milk fat content and fat/protein ratio in milk in their previous lactation.
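A minimal sketch (ours, not the authors') of the grouping rule described above, assigning cows to the high (GRP+) or low (GRP-) metabolic load group from plasma glucose, NEFA and BHBA. The thresholds follow the abstract; the function name and the handling of intermediate profiles are our assumptions.

```python
from typing import Optional

def metabolic_load_group(glucose_mmol_l: float, nefa_umol_l: float, bhba_mmol_l: float) -> Optional[str]:
    """Assign a cow to GRP+ or GRP- using the plasma thresholds given in the abstract."""
    if glucose_mmol_l < 3.0 and nefa_umol_l > 300 and bhba_mmol_l > 1.0:
        return "GRP+"  # high metabolic load
    if glucose_mmol_l > 3.0 and nefa_umol_l < 300 and bhba_mmol_l < 1.0:
        return "GRP-"  # low metabolic load
    return None        # intermediate profile: meets neither full set of criteria (assumption)

print(metabolic_load_group(2.7, 420, 1.4))  # -> GRP+
print(metabolic_load_group(3.4, 180, 0.6))  # -> GRP-
```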
Abstract:
Cocoa production in Alto Beni, Bolivia, is a major source of income and is severely affected by climate change impacts and other stress factors. Resilient farming systems are, thus, important for local families. This study compares indicators of social–ecological resilience in 30 organic and 22 nonorganic cocoa farms of Alto Beni. Organic farms had higher tree and crop diversity, higher yields and incomes, and more social connectedness, and their farmers participated in more courses on cocoa cultivation. Resilience was enhanced by local farmers' organizations, which provided organic certification and supported diversified agroforestry with seedlings and extension, going beyond basic organic certification requirements.
Abstract:
We present the first study comparing epitheliocystis in a wild and farmed salmonid in Europe. Sampling of three tributaries of Lake Geneva, including one from its headwaters to the river mouth, revealed an unequal distribution of epitheliocystis in brown trout (Salmo trutta). When samples were evaluated histologically and sites were grouped as wild versus farm, the probability of finding infected trout was higher on farms. In contrast, infection intensities, estimated as the number of cysts per gill arch, were higher on average and reached maximum values in the wild trout. Sequence analysis showed that the most common epitheliocystis agent was Candidatus Piscichlamydia salmonis, with all sequences clustering into a single clade, whereas Candidatus Clavichlamydia salmonicola sequences clustered in two closely related sub-species, of which one was found mostly in farmed fish and the other exclusively in wild brown trout, indicating that farms are unlikely to be the source of infections in wild trout. A detailed morphological analysis of cysts using transmission electron microscopy revealed unique features, illustrating the wide divergence between Ca. P. salmonis and Ca. C. salmonicola within the phylum Chlamydiae.
Abstract:
Biosecurity is crucial for safeguarding livestock from infectious diseases. Despite the plethora of biosecurity recommendations, published scientific evidence on the effectiveness of individual biosecurity measures is limited. The objective of this study was to assess the perception of Swiss experts of the effectiveness and importance of individual on-farm biosecurity measures for cattle and swine farms (31 and 30 measures, respectively). Using a modified Delphi method, 16 Swiss livestock disease specialists (8 for each species) were interviewed. The experts were asked to rank biosecurity measures written on cards by allocating a score from 0 (lowest) to 5 (highest). Experts ranked the measures on their importance with respect to Swiss legislation and feasibility, as well as on the effort required for implementation and the benefit of each measure. The experts also ranked the measures on their effectiveness in preventing an infectious agent from entering and spreading on a farm, judged solely on the transmission characteristics of specific pathogens. The pathogens considered by cattle experts were those causing Bluetongue (BT), Bovine Viral Diarrhea (BVD), Foot and Mouth Disease (FMD) and Infectious Bovine Rhinotracheitis (IBR). Swine experts expressed their opinion on the pathogens causing African Swine Fever (ASF), Enzootic Pneumonia (EP), Porcine Reproductive and Respiratory Syndrome (PRRS), as well as FMD. For cattle farms, biosecurity measures that improve farmers' disease awareness were ranked as both most important and most effective. For swine farms, the most important and effective measures were those related to animal movements. Among all single measures evaluated, education of farmers was perceived by the experts to be the most important and effective measure for protecting both Swiss cattle and swine farms from disease. The findings of this study provide an important basis for recommendations to farmers and policy makers.
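For illustration only, the sketch below shows one way the card scores from such a modified Delphi exercise could be aggregated into a ranking: each expert allocates 0 (lowest) to 5 (highest) per measure, and measures are ordered by their mean score. The measure names and scores are hypothetical examples, not the study's data, and the study's actual aggregation procedure may differ.

```python
from statistics import mean

# Hypothetical scores from 8 experts (0 = lowest, 5 = highest) for three example measures.
expert_scores = {
    "farmer education / disease awareness": [5, 5, 4, 5, 4, 5, 5, 4],
    "quarantine of purchased animals":      [4, 5, 4, 4, 3, 5, 4, 4],
    "visitor boot disinfection":            [3, 2, 3, 4, 2, 3, 3, 2],
}

# Rank measures by mean expert score, highest first (one possible aggregation rule).
for measure, scores in sorted(expert_scores.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{measure}: mean score {mean(scores):.2f}")
```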
Abstract:
BACKGROUND: This study focused on the descriptive analysis of cattle movements and of farm-level parameters derived from these movements, which are considered generically suitable for risk-based surveillance systems in Switzerland for diseases in which animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing 5 parameters. The proposed framework was validated using data from the bovine viral diarrhoea (BVD) surveillance programme in 2013. RESULTS: A cumulative score was calculated per farm, including the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures, and the number of weeks in 2012 in which a farm had registered movements. The final score for a farm depended on the distribution of the parameters. Different cut-offs (the 50th, 90th, 95th and 99th percentiles) were explored. The final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥ 2 would have resulted in the same number of detected BVD-positive farms as testing all farms, i.e., the outcome of the 2013 surveillance programme could have been reached with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of single farms in the networks require a careful assessment of the actual time period included to determine the farm-level criteria. However, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system. The system was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to the risk of infection based on animal movements.
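A minimal sketch of the scoring logic described above, assuming one point per parameter whose value lies at or above a chosen percentile cut-off (here the 50th; the study also explored the 90th, 95th and 99th), summed into a cumulative farm score, with farms scoring ≥ 2 selected for testing. The abstract names only four of the five parameters, so this sketch uses those four; the farm names, values and the "at or above" comparison are our assumptions.

```python
from statistics import median

# Hypothetical movement-derived parameter values for three farms (four of the study's
# five parameters are named in the abstract; the fifth is omitted here).
farms = {
    "farm_A": {"max_ingoing_contact_chain": 12, "animals_per_incoming_move": 3.5,
               "mixed_alpine_pasture": 1, "weeks_with_movements": 20},
    "farm_B": {"max_ingoing_contact_chain": 2,  "animals_per_incoming_move": 1.0,
               "mixed_alpine_pasture": 0, "weeks_with_movements": 3},
    "farm_C": {"max_ingoing_contact_chain": 7,  "animals_per_incoming_move": 2.0,
               "mixed_alpine_pasture": 1, "weeks_with_movements": 10},
}

parameters = list(next(iter(farms.values())))

# 50th-percentile cut-off per parameter across all farms.
cutoffs = {p: median(farm[p] for farm in farms.values()) for p in parameters}

# Cumulative risk score: one point per parameter at or above its cut-off (assumption).
risk_scores = {name: sum(1 for p in parameters if farm[p] >= cutoffs[p])
               for name, farm in farms.items()}

selected = [name for name, score in risk_scores.items() if score >= 2]
print(risk_scores)
print("selected for testing:", selected)
```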