823 results for Nutrition--In adolescence
Abstract:
Objective: To examine the association of breakfast consumption with objectively measured and self-reported physical activity, sedentary time and physical fitness. Design: The HELENA (Healthy Lifestyle in Europe by Nutrition in Adolescence) Cross-Sectional Study. Breakfast consumption was assessed by two non-consecutive 24 h recalls and by a ‘Food Choices and Preferences’ questionnaire. Physical activity, sedentary time and physical fitness components (cardiorespiratory fitness, muscular fitness and speed/agility) were measured and self-reported. Socio-economic status was assessed by questionnaire. Setting: Ten European cities. Subjects: Adolescents (n 2148; aged 12.5–17.5 years). Results: Breakfast consumption was not associated with measured or self-reported physical activity. However, 24 h recall breakfast consumption was related to measured sedentary time in males and females, although results were not confirmed when using other methods to assess breakfast patterns or sedentary time. Breakfast consumption was not related to muscular fitness and speed/agility in males or females. However, male breakfast consumers had higher cardiorespiratory fitness compared with occasional breakfast consumers and breakfast skippers, while no differences were observed in females. Overall, results were consistent using different methods to assess breakfast consumption or cardiorespiratory fitness (all P≤0.005). In addition, both male and female breakfast skippers (assessed by 24 h recall) were less likely to have high measured cardiorespiratory fitness compared with breakfast consumers (OR=0.33; 95% CI 0.18, 0.59 and OR=0.56; 95% CI 0.32, 0.98, respectively). Results persisted across methods.
Conclusions: Skipping breakfast does not seem to be related to physical activity, sedentary time or muscular fitness and speed/agility as physical fitness components in European adolescents; yet it is associated with both measured and self-reported cardiorespiratory fitness, which extends previous findings.
Abstract:
The present study aimed to investigate the relationships between macronutrient intake and serum lipid profile in adolescents from eight European cities participating in the HELENA (Healthy Lifestyle in Europe by Nutrition in Adolescence) cross-sectional study (2006–7), and to assess the role of body fat-related variables in these associations. Weight, height, waist circumference, skinfold thicknesses, total cholesterol, HDL-cholesterol (HDL-C), LDL-cholesterol, TAG, apoB and apoA1 were measured in 454 adolescents (44% boys) aged 12·5–17·5 years. Macronutrient intake (g/4180 kJ per d (1000 kcal per d)) was assessed using two non-consecutive 24 h dietary recalls. Associations were evaluated by multi-level analysis and adjusted for sex, age, maternal education, centre, sum of four skinfolds, and moderate-to-vigorous physical activity.
Abstract:
ACKNOWLEDGEMENTS We thank Danijela Bacic, Hannah Jöckel, Lea Köhler, Katrin Molsen, Marie Landenberger, and Annika Plambeck for their assistance with participant recruitment and data collection, Dale Esliger and Lauren Sherar for processing the accelerometry data, and Matthew Riccio for his helpful comments on the manuscript.
Abstract:
Nutrition in bean plants and anthracnose intensity as a function of silicon and copper application. The objective of this work was to evaluate the effect of calcium silicate and copper sulfate on anthracnose intensity and nutrition of bean plants. The experiment was conducted in a randomized block design following a 4 x 4 factorial arrangement (four levels of calcium silicate and four levels of copper sulfate) with two additional treatments (plants without inoculation and plants sprayed with Benomyl). Four evaluations of the incidence and severity of anthracnose were done, in addition to measuring total leaf area. At the end of the evaluations, incidence and severity data were integrated over time, obtaining the area under the disease progress curve (AUDPC). Contents of N, P, K, Ca, Mg, B, Cu, Fe, Mn, Zn, Si and lignin were determined in the aerial part. A linear decrease of the intensity AUDPC was observed with increasing doses of calcium silicate. The severity AUDPC was influenced by the doses of copper, with a 35% reduction at the highest dosage. The supply of silicon and copper altered the contents of K, Mg, S, Zn, Ca and Si in the aerial part of the bean plants.
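The AUDPC referred to above is conventionally computed by trapezoidal integration of repeated disease ratings over the assessment dates. A minimal sketch, assuming hypothetical assessment days and severity values (not the study's data):

```python
def audpc(days, values):
    """Area under the disease progress curve via the trapezoidal rule."""
    if len(days) != len(values) or len(days) < 2:
        raise ValueError("need matching lists with at least two assessments")
    total = 0.0
    for i in range(len(days) - 1):
        # mean of two consecutive ratings times the interval between them
        total += (values[i] + values[i + 1]) / 2 * (days[i + 1] - days[i])
    return total

# Four hypothetical anthracnose severity ratings (%) at 7-day intervals
print(audpc([0, 7, 14, 21], [5.0, 12.0, 20.0, 30.0]))  # 346.5
```

The unit of the result is rating-days (e.g. %-days), so AUDPC values are only comparable across treatments assessed on the same schedule.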
Abstract:
In an overview of some of the central issues concerning the impact and effects of new technology in adolescence, this article questions the reality of the net generation before considering the interplay of new and old technologies, the internet as both communication and lifestyle resource, and newer technologies like text messaging and webcams.
Abstract:
Non-suicidal self-injury (NSSI) is the deliberate, self-inflicted destruction of body tissue without suicidal intent and an important clinical phenomenon. Rates of NSSI appear to be disproportionately high in adolescents and young adults, and NSSI is a risk factor for suicidal ideation and behavior. The present study reports the psychometric properties of the Impulse, Self-harm and Suicide Ideation Questionnaire for Adolescents (ISSIQ-A), a measure designed to comprehensively assess impulsivity, NSSI behaviors and suicide ideation. An additional module of this questionnaire assesses the functions of NSSI. Results of Confirmatory Factor Analysis (CFA) of the scale on 1722 youths showed the items' suitability and confirmed a model of four different dimensions (Impulse, Self-harm, Risk-behavior and Suicide ideation) with good fit and validity. Further analysis showed that youths' engagement in self-harm may exert two different functions: to create or alleviate emotional states, and to influence social relationships. Our findings contribute to research and assessment on non-suicidal self-injury, suggesting that the ISSIQ-A is a valid and reliable measure to assess impulse, self-harm and suicidal thoughts in adolescence.
Abstract:
STUDY OBJECTIVE: The main aim of this study is to evaluate the impact of adolescent pregnancy on future contraceptive choices. A secondary aim is to verify whether these choices differ from those made after an abortion. DESIGN: Retrospective study. SETTING: Adolescent Unit of a tertiary care center. PARTICIPANTS: 212 pregnant teenagers. INTERVENTIONS: Medical records review. MAIN OUTCOME MEASURES: Intended pregnancy rate and contraceptive methods used before and after pregnancy. For contraceptive choices after pregnancy we considered: Group 1 - teenagers who continued their pregnancy to delivery (n = 106) and Group 2 - the same number of adolescents who chose to terminate their pregnancy. RESULTS: The intended pregnancy rate was 14.2%. Prior to a pregnancy continued to delivery, the most widely used contraceptive method was the male condom (50.9%), followed by oral combined contraceptives (28.3%); 18.9% of adolescents were not using any contraceptive method. After pregnancy, the contraceptive implant was chosen by 70.8% of subjects (P < .001) and oral combined contraceptives remained the second most frequent option (17.9%, P = .058). Comparing these results with Group 2, we found that the outcome of the pregnancy was the main factor in the choices that were made. Thus, after a pregnancy continued to delivery, adolescents preferred LARC [78.4% vs 40.5%, OR: 5.958 - 95% CI (2.914-12.181), P < .001], especially contraceptive implants [70.8% vs 38.7%, OR: 4.371 - 95% CI (2.224-8.591), P < .001], to oral combined contraceptives [17.9% vs 57.5%, OR: 0.118 - 95% CI (0.054-0.258), P < .001]. CONCLUSION: Adolescent pregnancy and its outcome constitute a factor of change in future contraceptive choice.
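Odds ratios with log-normal (Woolf) 95% confidence intervals, like those reported above, can be computed from the underlying 2x2 counts. A hedged sketch; the counts used here are hypothetical and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table [[a, b], [c, d]] with a Woolf (log-based) 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: implant users vs non-users, 75/31 in one group
# and 41/65 in the comparison group
or_, lower, upper = odds_ratio_ci(75, 31, 41, 65)
print(round(or_, 2))  # 3.84
```

The Woolf interval breaks down when any cell is zero; a common workaround is adding 0.5 to every cell (Haldane-Anscombe correction).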
Abstract:
Ovarian pregnancy is one of the rarest types of extrauterine pregnancy. Its preoperative diagnosis remains a challenge since it presents quite similarly to tubal pregnancy and complicated ovarian cysts. Although in most cases, histology is necessary to confirm the diagnosis, we present an ovarian pregnancy in a teenager, correctly diagnosed during ultrasound examination.
Abstract:
PURPOSE: Enteral alimentation is the preferred modality of support in critical patients who have acceptable digestive function and are unable to eat orally, but the advantages of continuous versus intermittent administration are surrounded by controversy. With the purpose of identifying the benefits and complications of each technique, a prospective controlled study with matched subjects was conducted. PATIENTS AND METHODS: Twenty-eight consecutive candidates for enteral feeding were divided into 2 groups (n = 14 each) that were matched for diagnosis and APACHE II score. A commercial immune-stimulating polymeric diet was administered via nasogastric tube by electronic pump in the proportion of 25 kcal/kg/day, either as a 1-hour bolus every 3 hours (Group I), or continuously for 24 hours (Group II), over a 3-day period. Anthropometrics, biochemical measurements, recording of administered drugs and other therapies, thorax X-ray, measurement of abdominal circumference, monitoring of gastric residue, and clinical and nutritional assessments were performed at least once daily. The principal measured outcomes of this protocol were frequency of abdominal distention and pulmonary aspiration, and efficacy in supplying the desired amount of nutrients. RESULTS: Nearly half of the total population (46.4%) exhibited high gastric residues on at least 1 occasion, but only 1 confirmed episode of pulmonary aspiration occurred (3.6%). Both groups displayed a moderate number of complications, without differences. Food input during the first day was greater in Group II (approximately 20% difference), but by the third day, both groups displayed similarly small deficits in total furnished volume of about 10%, when compared with the prescribed diet. CONCLUSIONS: Both administration modalities permitted practical and effective administration of the diet with frequent registered abnormalities but few clinically significant problems. 
The two groups were similar in this regard, without statistical differences, probably because of meticulous technique, careful monitoring, strict patient matching, and conservative amounts of diet employed in both situations. Further studies with additional populations, diagnostic groups, and dietetic prescriptions should be performed in order to elucidate the differences between these commonly used feeding modalities.
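The two schedules compared in the study deliver the same daily target with very different hourly profiles. A back-of-the-envelope sketch, assuming a hypothetical 70 kg patient at the protocol's 25 kcal/kg/day:

```python
weight_kg = 70                      # hypothetical patient weight
target_kcal = 25 * weight_kg        # protocol prescription: 25 kcal/kg/day

# Group I: a 1-hour bolus every 3 hours -> 8 boluses per day
boluses_per_day = 24 // 3
kcal_per_bolus = target_kcal / boluses_per_day

# Group II: continuous infusion over 24 hours
continuous_kcal_per_hour = target_kcal / 24

print(target_kcal, kcal_per_bolus, round(continuous_kcal_per_hour, 1))
# 1750 218.75 72.9
```

The bolus arm thus concentrates roughly three hours' worth of the continuous rate into each one-hour feed, which is the kind of difference the monitoring of gastric residue and abdominal distention was designed to capture.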
Abstract:
Fruit tree production is gaining an increasing importance in the central Amazon and elsewhere in the humid tropics, but very little is known about the nutrient dynamics in the soil-plant system. The present study quantified the effects of fertilization and cover cropping with a legume (Pueraria phaseoloides (Roxb.) Benth.) on soil nitrogen (N) dynamics and plant nutrition in a young guarana plantation (Paullinia cupana Kunth. (H.B. and K.) var. sorbilis (Mart.) Ducke) on a highly weathered Xanthic Ferralsol. Large subsoil nitrate (NO3-) accumulation at 0.3-3 m below the guarana plantation indicated N leaching from the topsoil. The NO3- contents to a depth of 2 m were 2.4 times greater between the trees than underneath unfertilized trees (P<0.05). The legume cover crop between the trees increased soil N availability as shown by elevated aerobic N mineralization and lower N immobilization in microbial biomass. The guarana N nutrition and yield did not benefit from the N input by biological fixation of atmospheric N2 by the legume cover (P>0.05). Even without a legume intercrop, large amounts of NO3- were found in the subsoil between unfertilized trees. Subsoil NO3- between the trees could be utilized, however, by fertilized guarana. This can be explained by a more vigorous growth of fertilized trees which had a larger nutrient demand and exploited a larger soil volume. With a legume cover crop, however, more mineral N was available at the topsoil which was leached into the subsoil and consequently accumulated at 0.3-3 m depth. Fertilizer additions of P and K were needed to increase subsoil NO3- use between trees.
Abstract:
Critically ill patients depend on artificial nutrition for the maintenance of their metabolic functions and lean body mass, as well as for limiting underfeeding-related complications. Current guidelines recommend enteral nutrition (EN), possibly within the first 48 hours, as the best way to provide the nutrients and prevent infections. EN may be difficult to implement or may be contraindicated in some patients, such as those presenting anatomic intestinal continuity problems or splanchnic ischemia. A series of contradictory trials regarding the best route and timing for feeding have left the medical community with great uncertainty regarding the place of parenteral nutrition (PN) in critically ill patients. Many of the deleterious effects attributed to PN result from inadequate indications, or from overfeeding. The latter is due firstly to the easier delivery of nutrients by PN compared with EN, increasing the risk of overfeeding, and secondly to the use of approximate energy targets, generally based on predictive equations: these equations are static and inaccurate in about 70% of patients. Such high uncertainty about requirements compromises attempts at conducting nutrition trials without indirect calorimetry support because the results cannot be trusted; indeed, both underfeeding and overfeeding are equally deleterious. An individualized therapy is required. A pragmatic approach to feeding is proposed: at first to attempt EN whenever and as early as possible, then to use indirect calorimetry if available, and to monitor delivery and response to feeding, and finally to consider the option of combining EN with PN in case of insufficient EN from day 4 onwards.
Abstract:
BACKGROUND & AIMS: Since the publications of the ESPEN guidelines on enteral and parenteral nutrition in the ICU, numerous studies have added information to assist the nutritional management of critically ill patients regarding the recognition of the right population to feed, energy-protein targeting, and the route and timing to start. METHODS: We reviewed and discussed the literature related to nutrition in the ICU from 2006 until October 2013. RESULTS: It is necessary to identify safe minimal and maximal amounts for the different nutrients at the different stages of acute illness. These amounts might be specific to different phases in the time course of the patient's illness. The best approach is to target the energy goal defined by indirect calorimetry. High protein intake (1.5 g/kg/d) is recommended during the early phase of the ICU stay, regardless of the simultaneous calorie intake; this recommendation can reduce catabolism. Later on, high protein intake remains recommended, likely combined with a sufficient amount of energy to avoid proteolysis. CONCLUSIONS: Pragmatic recommendations are proposed to practically optimize nutritional therapy based on recent publications. However, on some issues, there is insufficient evidence to make expert recommendations.
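The per-kilogram targets discussed above translate directly into bedside numbers. A minimal sketch, assuming a hypothetical 80 kg patient; the 25 kcal/kg/day energy figure is an illustrative stand-in for when indirect calorimetry is unavailable, not a recommendation from the review:

```python
def icu_targets(weight_kg, protein_g_per_kg=1.5, kcal_per_kg=25):
    """Daily protein (g) and energy (kcal) targets from per-kg prescriptions."""
    return weight_kg * protein_g_per_kg, weight_kg * kcal_per_kg

protein_g, energy_kcal = icu_targets(80)  # hypothetical 80 kg patient
print(protein_g, energy_kcal)  # 120.0 2000
```

As the review stresses, a measured energy goal from indirect calorimetry should replace the per-kg energy default whenever it is available.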