990 results for no-choice feeding tests
Abstract:
Despite the development of many effective antihypertensive drugs, target blood pressures are reached in only a minority of patients in clinical practice. Poor adherence to drug therapy and the occurrence of side effects are among the main reasons commonly reported by patients and physicians to explain the poor results of current antihypertensive therapies. The development of new effective antihypertensive agents with an improved tolerability profile might help to partly overcome these problems. Lercanidipine is an effective third-generation dihydropyridine calcium channel blocker characterized by a long half-life and high lipophilicity. In contrast to first-generation dihydropyridines, lercanidipine does not induce reflex tachycardia and causes peripheral edema at a lower incidence. Recent data suggest that, in addition to lowering blood pressure, lercanidipine might have some renal protective properties. In this review we discuss the problems of drug adherence in the management of hypertension, with a special emphasis on lercanidipine.
Abstract:
BACKGROUND Current guidelines give recommendations for preferred combination antiretroviral therapy (cART). We investigated factors influencing the choice of initial cART in clinical practice and its outcome. METHODS We analyzed treatment-naive adults with human immunodeficiency virus (HIV) infection participating in the Swiss HIV Cohort Study and starting cART from January 1, 2005, through December 31, 2009. The primary end point was the choice of the initial antiretroviral regimen. Secondary end points were virologic suppression, the increase in CD4 cell counts from baseline, and treatment modification within 12 months after starting treatment. RESULTS A total of 1957 patients were analyzed. Tenofovir-emtricitabine (TDF-FTC)-efavirenz was the most frequently prescribed cART (29.9%), followed by TDF-FTC-lopinavir/r (16.9%), TDF-FTC-atazanavir/r (12.9%), zidovudine-lamivudine (ZDV-3TC)-lopinavir/r (12.8%), and abacavir/lamivudine (ABC-3TC)-efavirenz (5.7%). Differences in prescription were noted among different Swiss HIV Cohort Study sites (P < .001). In multivariate analysis, compared with TDF-FTC-efavirenz, starting TDF-FTC-lopinavir/r was associated with prior AIDS (relative risk ratio, 2.78; 95% CI, 1.78-4.35), HIV-RNA greater than 100 000 copies/mL (1.53; 1.07-2.18), and CD4 counts greater than 350 cells/μL (1.67; 1.04-2.70); TDF-FTC-atazanavir/r with a depressive disorder (1.77; 1.04-3.01), HIV-RNA greater than 100 000 copies/mL (1.54; 1.05-2.25), and an opiate substitution program (2.76; 1.09-7.00); and ZDV-3TC-lopinavir/r with female sex (3.89; 2.39-6.31) and CD4 cell counts greater than 350 cells/μL (4.50; 2.58-7.86). At 12 months, 1715 patients (87.6%) achieved a viral load of less than 50 copies/mL, and CD4 cell counts increased by a median (interquartile range) of 173 (89-269) cells/μL. Virologic suppression was more likely with TDF-FTC-efavirenz, and the CD4 increase was higher with ZDV-3TC-lopinavir/r. No differences in outcome were observed among Swiss HIV Cohort Study sites. CONCLUSIONS Large differences in prescription but not in outcome were observed among study sites. A trend toward individualized cART was noted, suggesting that initial cART is significantly influenced by physicians' preferences and patient characteristics. Our study highlights the need for evidence-based data for determining the best initial regimen for different HIV-infected persons.
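The relative risk ratios reported above come from a multivariate (multinomial) analysis of regimen choice. As a purely illustrative sketch, and not the authors' actual analysis, the following shows how such relative risk ratios can be obtained from a multinomial logistic regression; the file name and column names (regimen, prior_aids, rna_gt_100k, cd4_gt_350, female) are hypothetical.

```python
# Illustrative sketch: estimating relative risk ratios (RRRs) for the choice of
# initial cART regimen with a multinomial logistic regression.
# The CSV file and column names are hypothetical, not from the SHCS dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cohort.csv")          # hypothetical file: one row per patient

# Outcome: initial regimen, with TDF-FTC-efavirenz as the reference category.
regimen = pd.Categorical(
    df["regimen"],
    categories=["TDF-FTC-EFV", "TDF-FTC-LPV/r", "TDF-FTC-ATV/r", "ZDV-3TC-LPV/r"],
)
y = regimen.codes                        # 0 = reference category

# Covariates coded 0/1: prior AIDS, viral load > 100 000 copies/mL,
# CD4 > 350 cells/uL, female sex.
X = sm.add_constant(df[["prior_aids", "rna_gt_100k", "cd4_gt_350", "female"]])

fit = sm.MNLogit(y, X).fit(disp=False)
rrr = np.exp(fit.params)                 # exponentiated coefficients = RRRs
print(rrr)                               # one column per non-reference regimen
```

Exponentiating the coefficients yields, for each non-reference regimen, the relative risk ratio associated with each covariate, which is how a figure such as 2.78 for prior AIDS with TDF-FTC-lopinavir/r would be read.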
Abstract:
With the trend in molecular epidemiology towards both genome-wide association studies and complex modelling, the need for large sample sizes to detect small effects and to allow for the estimation of many parameters within a model continues to increase. Unfortunately, most methods of association analysis have been restricted to either a family-based or a case-control design, preventing the synthesis of data from multiple studies. Transmission disequilibrium-type methods for detecting linkage disequilibrium from family data were developed as an effective way of preventing the detection of association due to population stratification. Because these methods condition on parental genotype, however, they have precluded the joint analysis of family and case-control data, although methods for case-control data may not protect against population stratification and do not allow for familial correlations. We present here an extension of a family-based association analysis method for continuous traits that simultaneously tests for, and if necessary controls for, population stratification. We further extend this method to analyse binary traits (and therefore family and case-control data together) and to accurately estimate genetic effects in the population, even when using an ascertained family sample. Finally, we present the power of this binary extension for both family-only and joint family and case-control data, and demonstrate the accuracy of the association parameter and variance components in an ascertained family sample.
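The abstract describes testing for, and controlling for, population stratification within a family-based association framework. The sketch below is only a generic illustration of one widely used ingredient of such methods: splitting the genotype into between-family and within-family components in a linear mixed model with a random family effect. It is not the authors' extended method for binary traits or ascertained samples, and the input file and column names (trait, genotype, family_id) are hypothetical.

```python
# Generic sketch (not the authors' method): association between an additively
# coded genotype (0/1/2 copies of the tested allele) and a continuous trait,
# with a random family intercept to account for within-family correlation.
# The file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("families.csv")   # hypothetical file: one row per individual

# Split the genotype into a between-family component (family mean) and a
# within-family deviation; a difference between the two coefficients suggests
# population stratification, while the within-family coefficient provides a
# stratification-robust test of association.
df["g_between"] = df.groupby("family_id")["genotype"].transform("mean")
df["g_within"] = df["genotype"] - df["g_between"]

model = smf.mixedlm("trait ~ g_between + g_within", data=df,
                    groups=df["family_id"])
result = model.fit()
print(result.summary())
```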
Abstract:
Cyst-based ecotoxicological tests are simple and low-cost methods for assessing acute toxicity. Nevertheless, only a few comparative studies on their sensitivity are known. In the present study, the suitability of two freshwater anostracan species, Streptocephalus rubricaudatus and S. texanus, was assessed. The impact of 16 priority pollutants (4 heavy metals, 11 organic compounds, and 1 organometallic compound) on these two species, as well as on Artemia salina (Artoxkit M), Daphnia magna (International Organization for Standardization 6341), and S. proboscideus (Streptoxkit F), was assessed. For indicative comparison, bioassays using Brachionus calyciflorus (Rotoxkit F) and Photobacterium phosphoreum (Microtox) were also performed. For heavy metals (K2Cr2O7, Cd2+, Zn2+, Cu2+), the sensitivity of the two studied Streptocephalus species was slightly higher than that of D. magna and significantly higher than that of the marine A. salina. For organic and organometallic micropollutants [phenol, 3,5-dichlorophenol, pentachlorophenol (PCP), hydroquinone, linear alkylbenzene sulfonate, sodium dodecyl sulfate, tributylphosphate, dimethylphthalate, atrazine, lindane, malathion, tributyltin chloride (TBT-Cl)], the sensitivity of the 4 anostracan species was of the same order of magnitude as that of D. magna. Artemia salina was slightly less sensitive to some organic compounds (PCP, hydroquinone, TBT-Cl). The sensitivity of S. rubricaudatus to organic solvents was low. On the other hand, this anostracan was quite sensitive to NaCl, so its use is restricted to freshwater samples. The evaluation of the overall practicability of these two tests confirms that cyst-based freshwater anostracans may be used to perform low-cost tests with a sensitivity comparable to that of D. magna (24-h immobilization test).
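Sensitivity comparisons in acute tests of this kind are usually based on effect concentrations such as the 24-h EC50 or LC50; the abstract does not describe the fitting procedure, so the sketch below is only a generic example of estimating an EC50 from concentration-response data with a two-parameter log-logistic model, using invented numbers rather than data from the study.

```python
# Generic sketch: estimating a 24-h EC50 from concentration-response data with a
# two-parameter log-logistic model. The concentrations and immobilization
# fractions below are invented for illustration; they are not data from the study.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])        # mg/L (hypothetical)
frac_immobile = np.array([0.0, 0.05, 0.15, 0.55, 0.90, 1.0])

def log_logistic(c, ec50, slope):
    """Fraction of immobilized organisms as a function of concentration."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

(ec50, slope), _ = curve_fit(log_logistic, conc, frac_immobile, p0=[0.3, 2.0])
print(f"Estimated EC50 ~ {ec50:.2f} mg/L (slope {slope:.1f})")
```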
Abstract:
This study aimed to compare two maximal incremental tests of different durations [a maximal incremental ramp test of short duration (8-12 min) (STest) and a maximal incremental test of longer duration (20-25 min) (LTest)] to investigate whether an LTest accurately assesses aerobic fitness in class II and III obese men. Twenty obese men (BMI ≥ 35 kg·m-2) without secondary pathologies (mean±SE; 36.7±1.9 yr; 41.8±0.7 kg·m-2) completed an STest (warm-up: 40 W; increment: 20 W·min-1) and an LTest [warm-up: 20% of the peak power output (PPO) reached during the STest; increment: 10% PPO every 5 min until 70% PPO was reached or until the respiratory exchange ratio reached 1.0, followed by 15 W·min-1 until exhaustion] on a cycle ergometer to assess the peak oxygen uptake (VO2peak) and peak heart rate (HRpeak) of each test. There were no significant differences in VO2peak (STest: 3.1±0.1 L·min-1; LTest: 3.0±0.1 L·min-1) or HRpeak (STest: 174±4 bpm; LTest: 173±4 bpm) between the two tests. Bland-Altman plot analyses showed good agreement, and Pearson product-moment and intra-class correlation coefficients showed a strong correlation between the two tests for VO2peak (r=0.81 for both; p≤0.001) and HRpeak (r=0.95 for both; p≤0.001). VO2peak and HRpeak assessments were not compromised by test duration in class II and III obese men. Therefore, we suggest that the LTest is a feasible test that accurately assesses aerobic fitness and may allow for the exercise intensity prescription and individualization that will lead to improved therapeutic approaches in treating obesity and severe obesity.
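For readers unfamiliar with the agreement statistics cited above, the sketch below shows, on invented paired VO2peak values rather than the study data, how the Bland-Altman bias with its limits of agreement and a Pearson correlation are computed.

```python
# Generic sketch of the agreement statistics mentioned above: Bland-Altman bias
# and 95% limits of agreement, plus a Pearson correlation, for paired VO2peak
# values from two tests. The numbers are invented, not the study data.
import numpy as np
from scipy.stats import pearsonr

vo2_stest = np.array([3.2, 2.9, 3.4, 3.0, 2.8, 3.3])   # L/min (hypothetical)
vo2_ltest = np.array([3.1, 2.9, 3.3, 3.1, 2.7, 3.2])

diff = vo2_stest - vo2_ltest
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)            # half-width of the limits of agreement
r, p = pearsonr(vo2_stest, vo2_ltest)

print(f"Bland-Altman bias {bias:.2f} L/min, "
      f"limits {bias - loa:.2f} to {bias + loa:.2f} L/min")
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```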
Abstract:
The application of organic wastes to agricultural soils is not risk-free and can affect soil invertebrates. Ecotoxicological tests based on the behavioral avoidance of earthworms and springtails were performed to evaluate the effects of different fertilization strategies on soil quality and its habitat function for soil organisms. These tests were performed in soils treated with: i) slurry and chemical fertilizers, according to the conventional fertilization management of the region, ii) conventional fertilization + sludge, and iii) an unfertilized reference soil. Both fertilization strategies contributed to soil acidity mitigation and caused no increase in soil heavy metal content. Avoidance test results showed no negative effects of these strategies on soil organisms compared with the reference soil. However, the results of the two fertilization managements differed: springtails did not avoid soils fertilized with dairy sludge in any of the tested combinations, whereas earthworms avoided soils treated with sludge as of May 2004 (DS1) when compared with conventional fertilization. Possibly, the behavioral avoidance response of earthworms is more sensitive to soil properties (other than texture, organic matter, and heavy metal content) than that of springtails.
Abstract:
Although tissue engineering and cell therapies are becoming realistic approaches for medical therapeutics, it is likely that musculoskeletal applications will be among the first to benefit on a large scale. Cell sources for tissue engineering and cell therapies for tendon pathologies are reviewed, with an emphasis on small-defect tendon injuries as seen in the hand, which could adapt well to injectable cell administration. Specifically, cell sources including tenocytes, tendon sheath fibroblasts, bone marrow or adipose-derived stem cells, amniotic cells, placenta cells, and platelet derivatives have been proposed to enhance tendon regeneration. The advantages and disadvantages associated with these different strategies will be discussed, and evolving regulatory requirements for cellular therapies will also be addressed. Human progenitor tenocytes, along with their clinical cell banking potential, will be presented as an alternative cell source solution. Similar cell banking techniques were already described with other progenitor cell types in the 1950s for vaccine production, and these "old" cell types suggest potentially interesting therapeutic options that could be improved with modern innovation for tendon regeneration and repair.
Abstract:
Many diurnal bird species roost at night in holes, and as regular visitors of holes they are welcome hosts for several species of ectoparasites. The interactions of ectoparasites with the behaviour, life-history traits, and population demography of their hosts are largely unknown. In the present study, the effects of the haematophagous hen flea, Ceratophyllus gallinae, on the great tit's choice of winter roost site were investigated experimentally. Three experiments tested (1) whether great tits prefer a clean nestbox to one containing an old, but parasite-free, nest, (2) whether they prefer a parasite-free nestbox to one infested with the haematophagous hen flea, and (3) whether they prefer not to use a nestbox when only an infested box is available in their territory. In the first experiment there was no discrimination, and both kinds of boxes were used equally often. In the second experiment the great tits clearly preferred to roost in the box without ectoparasites. In the third experiment a significantly higher proportion of the infested nestboxes were not used for roosting compared with the parasite-free boxes. Recently, the validity of conclusions drawn from nestbox studies in which the naturally occurring detrimental ectoparasites are eliminated by the routine removal of old nests between breeding seasons has been questioned. This study shows that ectoparasites affect host behaviour and therefore lends support to that criticism.
Abstract:
Summary: The effect of concentrate feeding on milk production in field data collected from herd recording farms
Abstract:
Summary: Cation-anion balance in the silage feeding of dry dairy cows with a high calcium intake
Abstract:
Summary: Cation-anion balance and calcium intake in the silage feeding of dry dairy cows
Abstract:
Introduction: Patients requiring prolonged intensive care and presenting a complicated course develop an intense metabolic response, generally characterized by hypermetabolism and protein catabolism. The severity of their illness exposes these patients to malnutrition, due mainly to insufficient nutritional intake and leading to a negative energy balance. In a substantial number of intensive care units, patient nutrition is not regarded as a priority objective of care. By conducting a prospective observational study of the relationship between energy balance and clinical outcome in patients with prolonged intensive care stays, we aimed to change this attitude and to demonstrate the deleterious effect of malnutrition in this type of patient. Methods: Over a 2-year period, all patients whose intensive care stay lasted 5 days or more were enrolled. Energy requirements for each patient were determined either by indirect calorimetry or by a weight-based formula (30 kcal/kg/day); the patients who underwent indirect calorimetry also served to verify the accuracy of the formula. Age, sex, preoperative weight, height, and body mass index (an index recognized in clinical practice) were recorded. Energy was delivered either in nutritional form (enteral, parenteral, or combined nutrition) or in non-nutritional form (infusions: glucose solutions, non-nutritional lipid intake). Nutritional data (theoretical target, prescribed target, nutritional energy, non-nutritional energy, total energy, nutritional energy balance, total energy balance) and clinical course data (days of mechanical ventilation, number of infections, antibiotic use, length of stay, neurological, respiratory, gastrointestinal, cardiovascular, renal, and hepatic complications, intensive care severity scores, and haematological, serum, and microbiological values) were analyzed for each of the 669 intensive care days accumulated by a total of 48 patients. Results: 48 patients aged 57±16 years, with stays ranging from 5 to 49 days (reason for admission: multiple trauma 10; cardiac surgery 13; respiratory failure 7; gastrointestinal pathology 3; sepsis 3; transplantation 4; other 8), were included. Although we could not demonstrate a relationship between energy balance (and more specifically energy deficit) and mortality, there was a highly significant relationship between energy deficit and morbidity, namely complications and infections, which naturally prolong the length of stay. Moreover, although the study included no intervention and no cause-and-effect relationship can be claimed, multiple regression analysis showed that the most reliable prognostic factor was precisely the energy balance, outperforming the severity scores usually used in intensive care. Outcome was independent of age, sex, and preoperative nutritional status.
The study did not collect economic data; we therefore cannot assert that the increased costs generated by a prolonged intensive care stay are driven by the energy deficit, even though common sense suggests that a shorter stay incurs lower costs. This study also draws attention to the origin of the energy deficit: it builds up during the first week in intensive care and could therefore be prevented by early nutritional intervention, whereas current recommendations advocate energy intake in the form of artificial nutrition only from 48 hours of intensive care stay onwards. Conclusions: The study shows that, for the most severely ill intensive care patients, energy balance should be considered an important objective of care, requiring the application of an early nutrition protocol. Finally, since patients' course is often unpredictable at admission and the deficit develops during the first week, it is legitimate to ask whether such a protocol should be applied to all intensive care patients from the time of admission. Summary. Background and aims: Critically ill patients with a complicated course are frequently hypermetabolic, catabolic, and at risk of underfeeding. The study aimed at assessing the relationship between energy balance and outcome in critically ill patients. Methods: Prospective observational study conducted in consecutive patients staying 5 days or longer in the surgical ICU of a university hospital. Demographic data, time to feeding, route, energy delivery, and outcome were recorded. Energy balance was calculated as energy delivery minus target. Data are means±SD; linear regressions were computed between energy balance and outcome variables. Results: Forty-eight patients aged 57±16 years were investigated; complete data are available for 669 days. Mechanical ventilation lasted 11±8 days, ICU stay was 15±9 days, and 30-day mortality was 38%. Time to feeding was 3.1±2.2 days. Enteral nutrition was the most frequent route (433 days). Mean daily energy delivery was 1090±930 kcal. Combining enteral and parenteral nutrition achieved the highest energy delivery. The cumulated energy balance was -12,600±10,520 kcal and was correlated with complications (P<0.001), already after 1 week. Conclusion: Negative energy balances were correlated with an increasing number of complications, particularly infections. Energy debt appears to be a promising tool for nutritional follow-up and should be further tested. Delaying the initiation of nutritional support exposes patients to energy deficits that cannot be compensated for later.
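The central quantity in this abstract, the cumulative energy balance, is simply the running sum of daily energy delivered minus the daily target (here taken as the 30 kcal/kg/day formula mentioned in the Methods). The sketch below illustrates that arithmetic with an invented body weight and invented daily intakes, not patient data.

```python
# Illustrative sketch of the energy-balance arithmetic described above:
# daily balance = energy delivered - target (30 kcal/kg/day), cumulated over the
# ICU stay. The weight and daily intakes are invented, not patient data.
weight_kg = 75
target_kcal = 30 * weight_kg                      # 2250 kcal/day

# Hypothetical energy actually delivered over the first week (kcal/day).
delivered = [0, 400, 800, 1200, 1500, 1800, 2000]

cumulative_balance = 0
for day, kcal in enumerate(delivered, start=1):
    cumulative_balance += kcal - target_kcal
    print(f"Day {day}: delivered {kcal:4d} kcal, "
          f"cumulative balance {cumulative_balance:6d} kcal")
# After one week the deficit already exceeds -8000 kcal in this example,
# mirroring how the deficit in the study built up during the first week.
```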