922 results for "Type of study"


Relevance: 100.00%

Abstract:

The objectives of this study were to evaluate the effect of cutting type (apical, intermediate, and basal) and of different concentrations of IBA (indole-3-butyric acid) on the rooting of 'Red Success' rose (Rosa sp.) leafy cuttings during two cutting-collection seasons (summer and winter). The investigation was carried out on the Irmãos Van Schaik farm in Holambra-SP, Brazil, from February to April and from August to October 1993. The experimental design was a randomized block in a factorial arrangement, consisting of 12 treatments (3 types of cuttings combined with IBA powder at 4 concentrations: 0, 1000, 2000, and 4000 ppm) with 3 replicates in each of the 2 seasons. The investigation supported the following conclusions: the apical and intermediate cuttings generally performed better than the basal ones in both seasons; average rooting at transplanting time was 76%, 70%, and 47% (summer) and 80%, 69%, and 33% (winter) for apical, intermediate, and basal cuttings, respectively; and the use of IBA did not stimulate rooting. Average rooting for the control cuttings reached 85% (summer) and 78% (winter), regardless of the type of cutting.

Relevance: 100.00%

Abstract:

This paper by R. E. Catai, E. C. Bianchi, P. R. de Águia and M. C. Alves reports the results of an analysis of roundness errors, residual stresses, and SEM micrographs of VC131 steel. The analysis involved workpieces ground with two types of cutting fluid: a synthetic cutting fluid and an emulsive oil. The cutting parameters were kept constant while the type of cutting fluid was varied. The amount of cutting fluid injected into the process was also varied, with the aim of identifying the ideal amount required to obtain good results without causing structural damage to the workpiece. The analyses of roundness errors, residual stresses, and SEM micrographs revealed that, of the two cutting fluids, the emulsive oil produced more favorable residual stresses owing to its greater lubricating power.

Relevance: 100.00%

Abstract:

The advancement of knowledge in neurophysiology has demonstrated that acupuncture is a method of peripheral neural stimulation that promotes local and systemic reflexive responses. The purpose of this study was to determine whether surface electromyography can be used as a tool to study the action of auricular acupuncture on striated skeletal muscle. The electromyographic amplitudes of the anterior, middle, and posterior deltoid and upper trapezius muscles at 20%, 40%, and 60% of maximal voluntary contraction were analyzed in 15 healthy volunteers after they underwent auricular acupuncture treatment. The non-parametric Friedman test was used to compare root mean square (RMS) values estimated with a 200 ms moving window; significant results were further analyzed with the Wilcoxon signed-rank test. In this exploratory study, the significance level of each comparison was set at p < 0.05. It was concluded that surface electromyography can be used as a tool to investigate possible alterations in the electrical activity of muscles after auricular acupuncture. However, an adequate methodology for its use in this type of study is still lacking, since the method used to record the electromyographic signal can also influence the results. © 2008 Elsevier Ltd. All rights reserved.
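The amplitude estimate described above (RMS over a 200 ms moving window) can be sketched as follows. The sampling rate, the non-overlapping windowing, and the test signal are illustrative assumptions; the abstract specifies only the window length.

```python
import math

def moving_rms(signal, fs_hz=1000, window_ms=200):
    """RMS amplitude of an EMG signal, one value per 200 ms window.

    fs_hz (assumed 1000 Hz) and non-overlapping windows are
    illustrative choices; the study states only the window length.
    """
    n = int(fs_hz * window_ms / 1000)  # samples per window
    return [
        math.sqrt(sum(x * x for x in signal[i:i + n]) / n)
        for i in range(0, len(signal) - n + 1, n)
    ]

# Sanity check: a +/-1 square wave has an RMS of exactly 1.0.
sig = [1.0 if i % 2 == 0 else -1.0 for i in range(1000)]
rms = moving_rms(sig)
```

The per-window RMS values would then be compared across contraction levels with the Friedman test and, where significant, post hoc Wilcoxon signed-rank tests (e.g. `scipy.stats.friedmanchisquare` and `scipy.stats.wilcoxon`).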

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

[EN] The presence of a mosaic of habitats across shallow rocky reefs, largely determined by sea urchin grazing, may influence the distribution patterns of invertebrates. The aim of this paper was to assess, using a correlative approach, whether the type of habitat influences the abundance patterns of holothurians in the eastern Atlantic. We hypothesized that abundances of large (> 10 cm) holothurians varied among four types of habitat (3 vegetated habitats with low abundances of the sea urchin D. antillarum versus 'barrens' with hyperabundances of sea urchins), and that these differences were consistent across a hierarchy of spatial scales, including two islands and several replicated sites within each type of habitat and island. Three species of large holothurians were found, accounting for a total of 300 specimens. We found remarkable differences in holothurian abundances between the 'barrens' and the three vegetated habitats. This pattern was strongest for the numerically dominant species, Holothuria sanctorii. Total abundances of holothurians were 5 to 46 times higher in 'barrens' than in the vegetated habitats. Inter-habitat differences were species-specific, with some patterns inconsistent from one island to the other. Total abundances of holothurians tended to increase with the abundance of sea urchins within 'barrens'. Our study suggests that there may be a link, at least for the dominant species Holothuria sanctorii, between the distribution and abundance of large holothurians and habitat type across the shallow waters of the eastern Atlantic.

Relevance: 100.00%

Abstract:

Purpose: To compare changes in the largest cross-sectional area (CSA) of the median nerve in wrists undergoing surgical decompression with changes in wrists undergoing non-surgical treatment of carpal tunnel syndrome (CTS).

Methods: This was a prospective cohort study of 55 consecutive patients with 78 wrists with established CTS, including 60 wrists treated with surgical decompression and 18 wrists managed non-surgically. A sonographic examination was scheduled before and 4 months after initiation of treatment. We compared changes in the CSA of the median nerve between surgically and non-surgically treated wrists using linear regression models.

Results: Decreases in the CSA of the median nerve were more pronounced in wrists with CTS release than in wrists undergoing non-surgical treatment (difference in means, 1.0 mm2; 95% confidence interval, 0.3–1.8 mm2). Results were robust to adjustment for age, gender, and neurological severity at baseline. Among wrists with CTS release, those with a postoperative CSA of 10 mm2 or less tended to have better clinical outcomes than those with a postoperative CSA greater than 10 mm2 (p = .055). Postoperative sonographic workup in the 3 patients with an unfavorable outcome or recurrence identified likely causes of treatment failure in 2 patients.

Conclusions: In this observational study, surgical decompression was associated with a greater decrease in median nerve CSA than non-surgical treatment. Smaller postoperative CSAs may be associated with better clinical outcomes. Additional randomized trials are necessary to determine the optimal treatment strategy in different subgroups of patients with CTS.

Type of study/level of evidence: Therapeutic III.
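The unadjusted core of the comparison reported above (difference in mean CSA decrease between the two treatment groups, with a 95% confidence interval) can be sketched as below. The data are hypothetical, and the study itself used covariate-adjusted linear regression rather than this two-sample summary.

```python
import math
import statistics as st

def diff_in_means_ci(a, b, z=1.96):
    """Difference in means (a - b) with a normal-approximation 95% CI."""
    d = st.mean(a) - st.mean(b)
    se = math.sqrt(st.variance(a) / len(a) + st.variance(b) / len(b))
    return d, (d - z * se, d + z * se)

# Hypothetical 4-month decreases in median nerve CSA (mm^2),
# not the published data.
surgical = [2.1, 1.8, 2.5, 1.6, 2.0, 2.3]
nonsurgical = [1.0, 0.8, 1.3, 0.9, 1.1, 1.2]
d, (lo, hi) = diff_in_means_ci(surgical, nonsurgical)
```

A confidence interval excluding zero, as in the published 0.3–1.8 mm2 interval, indicates a greater decrease in the surgical group.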

Relevance: 100.00%

Abstract:

Unconscious perception is commonly described as a phenomenon that is not under intentional control and relies on automatic processes. We challenge this view by arguing that some automatic processes may indeed be under intentional control, implemented in task-sets that define how the task is to be performed. In consequence, the prime attributes that are relevant to the task will be the most effective. To investigate this hypothesis, we used a paradigm that has been shown to yield reliable short-lived priming in tasks based on the semantic classification of words. This type of study uses fast, well-practised classification responses, whereby responses to targets are much less accurate if the prime and target belong to different categories than if they belong to the same category. In three experiments, we investigated whether the intention to classify the same words with respect to different semantic categories had a differential effect on priming. The results suggest that this was indeed the case: priming varied with the task in all experiments. However, although participants reported not seeing the primes, they were able to classify the primes better than chance using the classification task they had previously applied to the targets. When a lexical task was used for discrimination in Experiment 4, however, masked primes could not be discriminated. Moreover, priming was as pronounced when the primes were visible as when they were invisible. This pattern of results suggests that participants had intentional control over prime processing, even when they reported not seeing the primes.

Relevance: 100.00%

Abstract:

The present study was designed to investigate the influences of type of psychophysical task (two-alternative forced-choice [2AFC] and reminder tasks), type of interval (filled vs. empty), sensory modality (auditory vs. visual), and base duration (ranging from 100 through 1,000 ms) on performance on duration discrimination. All of these factors were systematically varied in an experiment comprising 192 participants. This approach allowed for obtaining information not only on the general (main) effect of each factor alone, but also on the functional interplay and mutual interactions of some or all of these factors combined. Temporal sensitivity was markedly higher for auditory than for visual intervals, as well as for the reminder relative to the 2AFC task. With regard to base duration, discrimination performance deteriorated with decreasing base durations for intervals below 400 ms, whereas longer intervals were not affected. No indication emerged that overall performance on duration discrimination was influenced by the type of interval, and only two significant interactions were apparent: Base Duration × Type of Interval and Base Duration × Sensory Modality. With filled intervals, the deteriorating effect of base duration was limited to very brief base durations, not exceeding 100 ms, whereas with empty intervals, temporal discriminability was also affected for the 200-ms base duration. Similarly, the performance decrement observed with visual relative to auditory intervals increased with decreasing base durations. These findings suggest that type of task, sensory modality, and base duration represent largely independent sources of variance for performance on duration discrimination that can be accounted for by distinct nontemporal mechanisms.

Relevance: 100.00%

Abstract:

Background: This study evaluated whether risk factors for sternal wound infections vary with the type of surgical procedure in cardiac operations.

Methods: This was a university hospital surveillance study of 3,249 consecutive patients (28% women) from 2006 to 2010 (median age, 69 years [interquartile range, 60 to 76]; median additive European System for Cardiac Operative Risk Evaluation [EuroSCORE], 5 [interquartile range, 3 to 8]) after (1) isolated coronary artery bypass grafting (CABG), (2) isolated valve repair or replacement, or (3) combined valve procedures and CABG. All other operations were excluded. Univariate and multivariate binary logistic regression analyses were conducted to identify independent predictors of sternal wound infection.

Results: We detected 122 sternal wound infections (3.8%) in the 3,249 patients: 74 of 1,857 patients (4.0%) after CABG, 19 of 799 (2.4%) after valve operations, and 29 of 593 (4.9%) after combined procedures. In CABG patients, bilateral internal thoracic artery harvest, procedural duration exceeding 300 minutes, diabetes, obesity, chronic obstructive pulmonary disease, and female sex (model 1) were independent predictors of sternal wound infection. A second model (model 2), using the EuroSCORE, revealed that bilateral internal thoracic artery harvest, diabetes, obesity, and the second and third EuroSCORE quartiles were independent predictors. In valve patients, model 1 showed only revision for bleeding as an independent predictor of sternal infection, and model 2 yielded both revision for bleeding and diabetes. For combined valve and CABG operations, both regression models showed that revision for bleeding and duration of operation exceeding 300 minutes were independent predictors of sternal infection.

Conclusions: Risk factors for sternal wound infections after cardiac operations vary with the type of surgical procedure. In patients undergoing valve operations or combined operations, procedure-related risk factors (revision for bleeding, duration of operation) independently predict infection. In patients undergoing CABG, not only procedure-related risk factors but also bilateral internal thoracic artery harvest and patient characteristics (diabetes, chronic obstructive pulmonary disease, obesity, female sex) are predictive of sternal wound infection. Preventive interventions may be justified according to the type of operation.
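A univariate screen of one candidate risk factor, the first step in an analysis like the one described, can be sketched from a 2x2 table. The counts below are hypothetical, and the published models are multivariate logistic regressions rather than this single-factor odds ratio.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table:
    a/b = infected/uninfected among exposed, c/d = among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(or_)
    return or_, (math.exp(log_or - z * se), math.exp(log_or + z * se))

# Hypothetical counts: sternal infection by bilateral internal
# thoracic artery harvest among CABG patients.
or_, (lo, hi) = odds_ratio_ci(20, 180, 54, 1603)
```

A confidence interval excluding 1 flags the factor for inclusion in the multivariate model, where it must survive adjustment for the other predictors to count as independent.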

Relevance: 100.00%

Abstract:

The purpose of this study was to determine, for penetrating injuries (gunshot, stab) of the chest/abdomen, the impact on fatality of treatment in trauma centers and shock trauma units compared with general hospitals. Medical records of all cases of penetrating injury limited to the chest/abdomen that were admitted to and discharged from 7 study facilities in Baltimore city in 1979-1980 (n = 581) were studied: 4 general hospitals (n = 241), 2 area-wide trauma centers (n = 298), and a shock trauma unit (n = 42). Emergency center and transferred cases were not studied. Anatomical injury severity, measured by the modified Injury Severity Score (mISS), was a significant prognostic factor for death, as were cardiovascular shock (SBP ≤ 70), injury type (gunshot vs stab), and ambulance/helicopter (vs other) transport. All deaths occurred in cases with two or more prognostic factors. Unadjusted relative risks of death compared with general hospitals were 4.3 (95% confidence interval = 2.2, 8.4) for shock trauma and 0.8 (0.4, 1.7) for trauma centers. Controlling for prognostic factors by logistic regression resulted in these relative risks: shock trauma 4.0 (0.7, 22.2), and trauma centers 0.8 (0.2, 3.2). Factors significantly associated with increased risk had the following relative risks by multiple logistic regression: SBP ≤ 70, RR = 40.7 (11.0, 148.7); highest mISS, 42 (7.7, 227); gunshot, 8.4 (2.1, 32.6); and ambulance/helicopter transport, 17.2 (1.3, 228.1). Controlling for age, race, and gender did not alter the results significantly. Actual deaths compared with deaths predicted from a multivariable model of general-hospital cases showed 3.7 more deaths than predicted in shock trauma (SMR = 1.6 (0.8, 2.9)) and 0.7 more deaths than predicted in area-wide trauma centers (SMR = 1.05 (0.6, 1.7)). Selection bias due to the exclusion of transfers and emergency center cases, and residual confounding due to insufficient injury information, may account for the persistence of adjusted high case fatality in shock trauma. A prospective study of all cases, including emergency center and transferred cases, is needed.
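The standardized mortality ratios quoted above are observed deaths divided by the deaths predicted from the general-hospital model. A minimal sketch, with hypothetical counts and a rough log-normal confidence interval:

```python
import math

def smr_ci(observed, expected, z=1.96):
    """SMR = observed/expected deaths, with an approximate 95% CI
    using the log-normal error factor exp(z / sqrt(observed))."""
    smr = observed / expected
    ef = math.exp(z / math.sqrt(observed))
    return smr, (smr / ef, smr * ef)

# Hypothetical: 10 observed deaths vs 6.3 predicted by the model.
smr, (lo, hi) = smr_ci(10, 6.3)
```

As in the study's shock trauma estimate of 1.6 (0.8, 2.9), an interval that spans 1 leaves the excess mortality statistically inconclusive.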

Relevance: 100.00%

Abstract:

The state of knowledge on the relation between stress factors, health problems, and health service utilization among university students is limited. International students face special stress problems because they must adjust to a new environment, and it is this latter problem area that provides the focus for this study. Recognizing that special stress factors affect international students, it is first necessary to see whether the problems of cultural adaptation affect them to any greater degree than American students attending the same university.

To make this comparison, the study identified a number of health problems of both American and international students and related their frequency to use of the Student Health Center. The expectation was that there would be an association between the number of health problems and the number of life change events experienced by these students, and between the number of health problems and stresses from social factors. It was also expected that the number of health problems would decline with the amount of social support.

The population chosen comprised students newly enrolled at Texas Southern University, Houston, Texas, in the Fall Semester of 1979. Two groups were selected at random: 126 international and 126 American students. The survey instrument was a self-administered questionnaire. The response rate was 90% (114) for the international and 94% (118) for the American students. Data analyses consisted of both descriptive and inferential statistics; chi-squares and correlation coefficients were the statistics used in comparing the international and the American students.

There was a weak association between the number of health problems and the number of life change events, as reported by both the international and the American students. The study failed to show any statistically significant association between the amount of stress from social factors and the number of health problems. It also failed to show an association between the number of health problems and the amount of social support. These findings applied to both the international and the American students. One unexpected finding was that certain health problems were reported by more American than international students: cough, diarrhea, and trouble sleeping. Another finding was that students with health insurance used the Health Center more than those without health insurance, and more international than American students used the Student Health Center. In comparing the women students, there was no statistically significant difference in their reported fertility-related health problems.

The investigator recommends that in follow-up studies, instead of grouping all international students together, they be divided by the major nationalities represented in the student body, that is, Iranians, Nigerians, and others.
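The chi-square comparisons used in the survey can be sketched for a 2x2 table. The counts below (health insurance status versus Health Center use) are hypothetical, not survey data.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (1 degree of freedom; the 5% critical value is 3.84)."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = rows[i] * cols[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# rows: insured / uninsured; columns: used center / did not
stat = chi_square_2x2([[40, 20], [25, 47]])
```

A statistic above 3.84 would, as in the insurance finding above, indicate an association at p < 0.05.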

Relevance: 100.00%

Abstract:

Los objetivos globales de esta tesis han sido estudiar el efecto que los carbohidratos de la dieta ejercen sobre los rendimientos productivos, la barrera intestinal, y la digestión de animales destetados a 25 días de edad. Además se ha estudiado cuál es el mejor periodo para determinar la digestibilidad fecal tras el destete a esta edad. En el primer experimento se estudió el efecto de la fibra neutro detergente soluble (FNDS) sobre la barrera intestinal, digestión, microbiota intestinal y rendimientos productivos de gazapos en gazapos en la fase post-destete. Se diseñaron tres piensos isonutritivos en los que la única fuente de variación fueron los niveles de fibra soluble. A partir de una dieta control (AH) con 103 g/kg de materia seca de FNDS y alfalfa como fuente principal de fibra, se sustituyó la mitad de esta alfalfa por una mezcla de pulpa de remolacha y pulpa de manzana (75:25) en el pienso B-AP y por una mezcla de cascarilla y concentrado de proteína de soja (88:12) en el pienso OH, obteniéndose 131 y 79 g/kg de FNDS sobre materia seca, respectivamente. Los conejos se destetaron a 25 días y fueron alimentados con los piensos experimentales hasta los 35 días de edad, momento en el que se sacrificaron para la determinación de la digestibilidad ileal aparente (DIA) de la materia seca (MS), proteína bruta (PB) y almidón, la morfología de la mucosa, y actividad enzimática en el yeyuno, el tejido linfoide asociado a la mucosa, así como la microbiota intestinal. Para la determinación de la morfología de la mucosa se utilizaron adicionalmente 19 animales lactantes de 35 días de edad. Para el estudio de la tasa de mortalidad, se utilizaron 118 animales más por tratamiento que recibieron los piensos experimentales durante las dos semanas post-destete y posteriormente un pienso comercial hasta los 60 días de edad. Los animales recibieron durante todo el experimento medicación en el agua de bebida (100 ppm de apramicina sulfato y 120 ppm de tilosina tartrato). 
El nivel de fibra soluble mejoró los parámetros que se utilizaron para la caracterización del estado de la barrera intestinal. Los conejos alimentados con el mayor nivel de FNDS en el pienso presentaron una mayor longitud de los villi (P=0.001), un mayor ratio longitud villi/profundidad de las criptas (8.14; P=0.001), una mayor actividad disacaridásica (8671 μmol de glucosa/g de proteína; P=0.019), así como una mayor digestibilidad ileal (96.8%; P=0.002), observándose una reducción en el flujo ileal de almidón a medida que se incrementó el nivel de fibra soluble en el pienso (1,2 vs 0,5 g/d; P=0.001). Los animales lactantes a 35 días de edad presentaron un ratio longitud de villi/profundidad de las criptas menor que el observado en aquéllos alimentados con el pienso B-AP (6.70), pero superior al de los piensos AH y OH. Niveles inferiores de NDFS tendieron (P=0.074) a incrementar la respuesta inmune de tipo celular (linfocitos CD8+). El pienso también afectó a la producción de IL2 (CD25+; P=0.029; CD5+CD25+; P=0.057), pero sin llegar a establecerse una clara relación con el nivel de fibra soluble. La diversidad de la microbiota intestinal no se vio afectada por el pienso (P ≥ 0.38). Los animales alimentados con las piensos B-AP y AH presentaron una reducción en la frecuencia de detección de Clostridium perfringens tanto en íleon (P=0.062) como en ciego (4.3 vs. 17.6%, P =0.047), comparado con el pienso OH. Además la tasa de mortalidad (118 gazapos/pienso) disminuyó de 14.4% en el pienso OH a 5.1% en el pienso B-AP. Entre los 32 y los 35 días de edad se determinó la digestibilidad fecal aparente (14/pienso) de la materia seca (MS), energía bruta (EB), proteína bruta (PB), fibra neutro detergente (FND), fibra ácido detergente (FAD) y almidón. 
Este grupo, junto con otros nueve animales por tratamiento se utilizaron para determinar el peso del estómago y el ciego, la concentración cecal de ácidos grasos volátiles (AGV) y amoniaco (NH3), así como las tasas de similitud de la microbiota intestinal. Además se estudiaron los rendimientos productivos (35 animales/tratamiento) de los gazapos durante todo el período de cebo, consumiendo los piensos experimentales desde el destete hasta los 35 días y posteriormente un pienso comercial hasta los 60 días de edad. Niveles crecientes de FNDS mejoraron la digestibilidad fecal de la materia seca (MS) y energía (P<0.001). La inclusión FNDS aumentó de manera lineal el peso del contenido cecal (P=0.001) y el peso del aparato digestivo completo (P=0.008), y en los días previos al sacrificio disminuyó de manera lineal el consumo medio diario (P=0.040). Se observó además, una disminución lineal (P≤0.041) del pH del estómago. No se encontró relación entre el pH, la concentración y proporciones molares de AGV y el nivel de FNDS. El pienso pareció tener un efecto, incluso superior al de la madre, sobre la tasa de similitud de la microbiota, y los efectos fueron mayores a nivel cecal que ileal. La eficacia alimenticia aumentó de manera lineal en un 12% entre piensos extremos tras el destete (25- 39d) y en un 3% en el período global de cebo con niveles mayores de NDFS. El consumo medio diario durante la fase post-destete y durante todo el período de cebo, tendió a aumen tar (P≤0.079) con niveles mayores de FNDS, sin embargo no se apreció efecto sobre la ganancia media diaria (P≥0.15). En conclusión, el incremento del nivel de fibra soluble en el pienso parece resultar beneficioso para la salud del animal ya que mejora la integridad de la mucosa, y reduce la frecuencia de detección de potenciales patógenos como C. perfringens y Campylobacter spp. 
Conforme a estos resultados, debería tenerse en cuenta el contenido en fibra soluble en la formulación de piensos de conejos en la fase post-destete. El objetivo del segundo experimento fue determinar el efecto de la fuente de almidón sobre la digestión, la microbiota intestinal y los rendimientos productivos en conejos destetados con 25 días de edad. Se formularon tres piensos isonutritivos en los que se modificaron las principales fuentes de almidón: trigo crudo, trigo cocido y una combinación de trigo y arroz cocido. Dos grupos de 99 y 193 animales se destetaron con 25 días de edad. El primero de ellos se utilizó para la determinación de los parámetros productivos conforme al mismo protocolo seguido en el experimento anterior. El segundo de los grupos se utilizó para la determinación de la digestibilidad fecal de 32 a 35 d, la digestibilidad ileal aparente (DIA) a 35 d, la morfología de la mucosa intestinal, los parámetros de fermentación cecal; así como, la caracterización de la microbiota intestinal. Se utilizaron además dos grupos adicionales de animales 384 (medicados) y 177 (no medicados) para estudiar el efecto de la suplementación con antibióticos en el agua de bebida sobre la mortalidad. El procesado térmico del trigo mejoró ligeramente la digestibilidad ileal del almidón (P=0.020) pero no modificó el flujo final de almidón que alcanzó el ciego, observándose una mayor frecuencia de detección de Campylobacter spp. y Ruminococcus spp. en ciego (P≤0.023), pero sin cambios a nivel ileal. El procesado térmico del trigo no afectó tampoco a los parámetros productivos, la mortalidad, la digestibilidad ileal y fecal o la morfología de la mucosa. La sustitución parcial del trigo cocido por arroz cocido, penalizó la digestibilidad ileal del almidón (P=0.020) e incrementó el flujo ileal de este nutriente al ciego (P=0.007). 
Sin embargo no afectó a la mortalidad, pese a que se detectaron cambios en la microbiota tanto a nivel ileal como cecal, disminuyendo la frecuencia de detección de Campylobacter spp. (en íleon y ciego), Helicobacter spp. (en íleon) y Ruminococcus spp (en ciego) e incrementando Bacteroides spp. (en ciego) (P≤0.046). El empleo de arroz cocido en las piensos post-destete no tuvieron efectos sobre los parámetros productivos, la mortalidad, la digestibilidad ileal y fecal a excepción del almidón, o la morfología de la mucosa. La suplementación con antibiótico redujo la fre cuencia de detección de la mayoría de las bacterias estudiadas (P≤0.048), sobre todo para Campylobacter spp., Clostridium perfringens y Propionibacterium spp. (P≤0.048), observándose un efecto mayor a nivel ileal que cecal, lo que se asoció a la bajada significativa (P<0.001) de la mortalidad. En conclusión, los resultados de este experimento indican que la fuente de almidón afecta a la microbiota intestinal pero no influiye sobre la salud del animal. En relación al procesado, el uso de trigo cocido junto con arroz cocido no mejora los resultados obtenidos con trigo duro, si bienserían necesarios más experimentos que confirmaran este punto. El último de los experimentos se centró en un aspecto metodológico. Dado que, los conejos destetados presentan un patrón digestivo diferente al de un animal adulto resultado de su inmadurez digestiva, el objetivo buscado era tratar de determinar el mejor procedimiento para la determinación de la digestibilidad fecal en los gazapos en la fase post-destete. Para tal fin se utilizaron 15 animales/tratamiento de tres camadas diferentes que se destetaron con 25 días, suministrándoles un pienso comercial de crecimiento-cebo. Se registró el consumo medio diario y la excreción diaria de heces desde el día 25 hasta el día 40 de edad para la determinación de la digestibilidad de la MS. 
La camada afectó al consumo medio diario y la excreción de heces (P=0.013 y 0.014, respectivamente), observándose una tendencia (P=0.061) en la digestibilidad. La edad afectó (P<0.001) a todos estos factores, incrementándose de manera más evidente la excreción que la ingestión de materia seca en la primera semana de vida, para aumentar de forma paralela a partir de la segunda. La correlación entre el consumo medio diario fue mayor con la excreción de heces del mismo día que con la del día siguiente, por lo que se utilizó el primero para la determinación de la digestibilidad de la MS (MSd). La MSd disminuyó de manera lineal hasta los 32 días de edad (2.17±0.25 unidades porcentuales por día), mientras que permaneció constante desde los 32 a los 40 días (69.4±0.47%). Por otro lado, la desviación estándar de la MSd se redujo cuando se incrementó el período de recogida de 2 a 6 días en un 54%. Conforme a los resultados obtenidos, se puede concluir que no es aconsejable comenzar las pruebas de digestibilidad antes de los 32 días de edad y que el número de animales necesario para detectar diferencias significativas entre tratamientos dependerá del período de recogida de heces. ABSTRACT The global aim of this thesis has been to study the effect of dietary carbohydrates on growth, performance, digestion and intestinal barrier in 25-d weaned rabbits. In addition there has also been studied which is the best period to determine the fecal digestibility after weaning. The first experiment focused on the effect of Neutral Detergent Soluble Fibre (NDSF) on gut barrier function, digestion, intestinal microbiota and growth performance n rabbits in the post-weaning period. 
Three isonutritive diets which only varied in the levels of soluble fiber were formulated such as it described as follows: a control diet (AH) containing 103 g of neutral detergent soluble fiber, including alfalfa as main source of fiber, was replaced by a mixture of beet and apple pulp (75-25) in the B-AP diet and, by a mix of oat hulls and soybean protein concentrate (88:12) in the OH diet, resulting 131 and 79 g of NDFS/kg of dry matter, respectively. Rabbits, weaned at 25 days of age, were fed the experimental diets up to 35 days of age, moment in which they were slaughtered for apparent ileal digestibility (AID) of dry matter (DM), crude protein (CP) and starch, mucosa morphology, sucrose activity, characterization of lamina propria lymphocytes and intestinal microbiota. To assess mucosal morphology, 19 suckling 35-d-old rabbits were also used. For mortality study, besides these animals, 118 additional rabbits per treatment were fed the experimental diets for two weeks period and thereafter received a commercial diet until 60 days of age. Rabbits were water medicated during the whole experimental period (100 ppm de apramicine sulphate and 120 ppm of tylosine tartrate). Level of soluble fiber improved all the parameters used for the characterization of the intestinal barrier condition. Villous height of the jejunal mucosa increased with dietary soluble fiber (P=0.001). Villous height of jejunal mucosa increased with dietary soluble fiber (P = 0.001). Rabbits fed the highest level of soluble fiber (BA-P diet) showed the highest villous height/crypth depth ratio (8.14; P = 0.001), sucrase specific activity (8671 μmol glucose/ g protein; P = 0.019), and the greatest ileal starch digestibility (96.8%; P = 0.002). The opposite effects were observed in rabbits fed decreased levels of soluble fiber (AH and OH diets; 4.70, 5,848 μmol of glucose/g of protein, as average, respectively). The lowest ileal starch digestibility was detected for animal fed OH diet (93.2%). 
Suckling rabbits of the same age showed a lower villous height/crypt depth ratio (6.70) compared with the B-AP diet group, but this ration was higher that the AH or OH diet groups. Lower levels of soluble fiber tended (P = 0.074) to increase the cellular immune response (CD8+ lymphocytes). Diet affected IL-2 production (CD25+, P = 0.029; CD5+CD25+, P = 0.057), with no clear relationship between soluble fiber and IL-2. The intestinal microbiota biodiversity was not affected by diets (P ≥ 0.38). Animals fed B-AP and AH diets had a reduced cecal frequency of detection compatible with Campylobacter spp. (20.3 vs. 37.8, P = 0.074), and Clostridium perfringens (4.3 vs. 17.6%, P = 0.047), compared with the OH diet group. Moreover, the mortality rates decreased from 14.4 (OH diet) to 5.1% (B-AP diet) with the increased presence of soluble fiber in the diet. Between 32 and 35 days of age, faecal apparent digestibility of dry matter (DM), gross energy (GE), crude protein (CP), neutral detergent fiber (NDF), acid detergent fiber (ADF) and starch was determined (14/diet). This group, plus another nine rabbits/diet were used to determine weight of stomach and caecum and their contents, cecal fermentation traits and similarity rate (SR) of intestinal microbiota. Besides, growth performance parameters (35 rabbits/diet) were studied during the whole fattening period, in animals consuming the experimental feed after the weaning up to 35 days of age and later on a commercial diet up animals reached 60 days of age. Increasing levels of Neutral Detergent Soluble Fiber improved faecal dry matter and energy digestibility (P<0.001). NDSF inclusion improved linearly weight of the caecal content (P=0.001) and the total gastrointestinal tract (P=0.008), and in the previous days to slaughter a linear decrease of daily feed intake in diet with highest level of soluble fiber was also observed. Stomach pH decreased linearly with increasing levels of NDFS (P≤0.041). 
No relationship was found between NDSF level and caecal pH or the concentration and molar proportions of VFA. Treatments appeared to influence the similarity rate of the intestinal microbiota, with an effect even greater than the maternal effect; these effects were stronger in the ileum than in the caecum. A positive linear effect on feed efficiency was observed: between the extreme diets, feed efficiency increased by around 12% in the two weeks post-weaning (25-39 d) and by 3% over the whole fattening period with the highest levels of soluble fiber. Average daily feed intake, both during the two weeks after weaning and over the whole fattening period, tended (P ≤ 0.079) to increase with the highest levels of NDSF, although there was no effect on daily weight gain (P ≥ 0.15). In conclusion, an increase of soluble fiber in the feed appears to be beneficial for animal health, as it improves mucosal integrity and reduces the detection frequency of potential pathogens such as C. perfringens and Campylobacter spp. According to these results, the level of soluble fiber should be taken into account in rabbit feed formulation for the post-weaning period. The objective of the second experiment was to determine the effect of the source of starch on digestion, intestinal microbiota and growth performance in twenty-five-day-old weaned rabbits. To this end, three isonutritive diets were formulated with different sources of starch: raw wheat, boiled wheat, and a combination of boiled wheat and boiled rice. Two groups of 99 and 193 rabbits were weaned at 25 days of age. The first group was used for growth performance determination, following the same protocol as in the previous experiment. The second group was used to determine faecal digestibility from 32 to 35 d, apparent ileal digestibility (AID) at 35 d, jejunal mucosa morphology, caecal fermentation traits, and characterization of the intestinal microbiota. For mortality, two additional groups of 384 (medicated) and 177 (non-medicated) rabbits were used to study the effect of antibiotic supplementation of the drinking water.
Heat processing of starch slightly improved ileal starch digestibility (P = 0.020) but did not modify the flow of starch to the caecum. An increase in the frequency of detection of Campylobacter spp. and Ruminococcus spp. was observed in the caecum (P ≤ 0.023), with no changes at the ileal level. Heat processing of wheat did not modify growth performance, mortality, ileal or faecal digestibility, or mucosa morphology. Partial substitution of boiled wheat with boiled rice in the diet impaired ileal starch digestibility (P = 0.020) and increased the ileal flow of this nutrient to the caecum (P = 0.007). It did not affect mortality, although changes in the ileal and caecal microbiota were detected: the frequency of detection of Campylobacter spp. (in both ileum and caecum), Helicobacter spp. (ileum) and Ruminococcus spp. (caecum) decreased, while that of Bacteroides spp. (caecum) increased (P ≤ 0.046). Boiled rice inclusion did not alter growth performance, mortality, mucosa morphology, or the ileal or faecal digestibility of nutrients other than starch. Medication of rabbits reduced the ileal frequency of detection of most of the bacteria studied (P ≤ 0.048), especially Campylobacter spp., Clostridium perfringens and Propionibacterium spp.; the effect was greater at the ileal than at the caecal level and was associated with a strong reduction in mortality (P < 0.001). In conclusion, the results of this experiment suggest that the source of starch affects the intestinal microbiota but does not seem to influence animal health. Regarding heat processing, the use of boiled wheat or boiled rice does not seem to improve the results obtained with raw wheat, although further experiments would be needed to confirm this point. The last experiment focused on a methodological aspect.
Considering that weaned rabbits have a digestive pattern different from that of older animals owing to their digestive immaturity, the objective was to determine the best procedure for faecal digestibility determination in young rabbits during the post-weaning period. Fifteen rabbits from 5 different litters were weaned at 25 days of age and fed a commercial feed. Feed intake and faeces excretion were recorded daily from 25 to 40 days of age for the determination of dry matter digestibility (DMd). Litter affected daily DM intake and excretion (P = 0.013 and 0.014, respectively) and tended to affect DMd (P = 0.061). Age affected all these factors (P < 0.001), but intake increased more slowly than dry matter excretion during the first week, whereas both evolved similarly in the second week. Daily feed intake correlated more strongly with same-day faeces excretion than with next-day excretion, so same-day values were used to determine daily DMd. DMd decreased linearly from weaning to 32 d of age (2.17 ± 0.25 percentage units per day), whereas from 32 to 40 d it remained constant (69.4 ± 0.47%). In addition, the average standard deviation of DMd decreased by 54% when the length of the collection period increased from 2 to 6 d. From these results, it can be concluded that digestibility trials should not start before 32 days of age and that the number of animals required to detect a significant difference among means depends on the length of the collection period.
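As a rough illustration of the apparent digestibility calculation underlying the procedure above, the sketch below pairs each day's DM intake with that same day's faecal DM excretion (the pairing the abstract reports correlated best) and applies the standard total-collection formula, DMd = (intake − faeces)/intake × 100. All numeric values are invented for illustration and are not data from the study.

```python
# Sketch: apparent dry matter digestibility (DMd) from total collection.
# Assumes same-day pairing of intake and faecal excretion; values are illustrative.

def dmd_percent(dm_intake_g, dm_faeces_g):
    """Apparent DM digestibility (%): (intake - faeces) / intake * 100."""
    return 100.0 * (dm_intake_g - dm_faeces_g) / dm_intake_g

# Hypothetical daily records for one rabbit over a 4-day collection period (g DM).
intake = [55.0, 60.0, 62.0, 65.0]
faeces = [16.5, 18.6, 19.2, 20.2]

# Per-day DMd, and DMd pooled over the whole collection period.
daily_dmd = [dmd_percent(i, f) for i, f in zip(intake, faeces)]
period_dmd = dmd_percent(sum(intake), sum(faeces))
```

Pooling intake and excretion over a longer collection period, as in the last experiment, averages out day-to-day variation, which is consistent with the reported 54% drop in the standard deviation of DMd when the period is extended from 2 to 6 days.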

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The aim of this study is to determine whether there are different types of leadership for men and women and within different job roles (Manager, Technical Director and Coordinator).