713 results for SLA-ASL
Abstract:
Monthly measurements of pasture height, total biomass, variations in live and dead biomass, specific leaf area (SLA), and Leaf Area Index (LAI) from February 1999 to January 2005 at Fazenda Nossa Senhora (FNS), and at Rolim de Moura (RDM) from February to March 1999, Rondônia, Brazil. The predominant pasture grass is Urochloa brizantha (Hochst. ex A. Rich) R. D. Webster (99% at FNS and 76% at RDM), with small patches of Urochloa humidicula (Rendle). Mean annual grass height was ~0.16 m; under grazing, the monthly minimum was 0.09 m (dry season), and the maximum without grazing was 0.3 m (wet season). LAI, total biomass, dead material, live material, and SLA had mean values of 2.5 m² m⁻², 2202 kg ha⁻¹, 2916 kg ha⁻¹, and 19 m² kg⁻¹, respectively. Mean monthly biomass was 4224 kg ha⁻¹ in 2002 and 6667 kg ha⁻¹ in 2003. Live and dead material showed large seasonal variation: live material was higher during the wet season (3229 vs. 2529 kg ha⁻¹), while dead material was higher during the dry season (2542 vs. 1894 kg ha⁻¹). The soil water table varied from -3.1 to -6.5 m across the seasons. Annual mean LAI ranged from 1.4 in 2000 to 2.8 in 2003, and SLA from 16.3 m² kg⁻¹ in 1999 to 20.4 m² kg⁻¹ in 2001. Observed albedo varied from 0.18 to 0.16 in relation to the higher LAI values.
Abstract:
Among the most significant commercial problems facing Argentine peanuts in the international market is the wide variation in the volumes offered annually, due to fluctuations in production and yield, which makes it difficult to fully satisfy the growing international demand for confectionery peanuts. Argentina has an export capacity of around 400,000 tonnes per year. However, given declining supply from the main competitors in the international market, the USA and China, demand remains unsatisfied, so Argentina could increase its exports by no less than 25%, representing no less than 150 million dollars in currently foregone revenue. Production problems and the advance of new cultivation technologies are changing the landscape of peanut production in Córdoba and nationwide. On the one hand, the main Argentine producing region is expanding toward the south of the Province of Córdoba, and the crop is also being developed in the Argentine northwest (NOA), occupying environments substantially different from the traditional core zone. Soave (1997) points out that obtaining new varieties, which broaden the narrow existing varietal range and incorporate traits suited to local conditions, must be a permanent, priority objective. The literature provides clear evidence that wild species of the genus Arachis are potential sources of high levels of resistance/tolerance to disease and drought. Furthermore, Fávero (2004) has demonstrated the real possibility of transferring genetic attributes present in these species to cultivated peanut by constructing synthetic amphidiploids from crosses of a wild species carrying the A genome with another carrying the B genome.
Quantification of leaf MDA content is a good parameter for estimating the resistance or susceptibility of a genotype to water stress. Likewise, recent studies have indicated that specific leaf area (SLA), chlorophyll content (SCMR), and harvest index, all of which are simple to measure, correlate significantly with transpiration efficiency and show considerable genetic variation in peanut. These attributes make it possible to select drought-resistant/tolerant genotypes. Innovating in the field of biotechnology through alternative genetic constructs is a strategy that would allow resistant/tolerant plants to be obtained by introducing genes that are directly involved in the events of interest and have already been studied in other organisms. Obtaining plants with greater tolerance to drought and/or fungal diseases would not only ensure yield stability in years of water scarcity and of climatic conditions favorable to fungal disease, but would also allow the productive frontier to be extended into currently marginal regions. The general objective is to characterize and evaluate sources of resistance/tolerance to biotic and abiotic factors, particularly Sclerotinia minor, Sclerotinia sclerotiorum, and drought, in cultivated and wild genotypes, in order to transfer the genes of interest through conventional breeding techniques, and to fine-tune the conditions for in vitro regeneration and genetic transformation of elite peanut varieties grown in Argentina.
Abstract:
Background: Organ injury occurs not only during periods of ischemia but also during reperfusion. It is known that ischemia-reperfusion (IR) causes both remote organ and local injuries. Objective: This study evaluated the effects of tramadol on the heart as a remote organ after acute hindlimb IR. Methods: Thirty healthy mature male Wistar rats were allocated randomly into three groups: Group I (sham), Group II (IR), and Group III (IR + tramadol). Ischemia was induced in anesthetized rats by left femoral artery clamping for 3 h, followed by 3 h of reperfusion. Tramadol (20 mg/kg, intravenous) was administered immediately prior to reperfusion. At the end of the reperfusion, animals were euthanized, and hearts were harvested for histological and biochemical examination. Results: The levels of superoxide dismutase (SOD), catalase (CAT), and glutathione peroxidase (GPx) were higher in Groups I and III than those in Group II (p < 0.05). In comparison with other groups, tissue malondialdehyde (MDA) levels in Group II were significantly increased (p < 0.05), and this increase was prevented by tramadol. Histopathological changes, including microscopic bleeding, edema, neutrophil infiltration, and necrosis, were scored. The total injury score in Group III was significantly decreased (p < 0.05) compared with Group II. Conclusion: From the histological and biochemical perspectives, treatment with tramadol alleviated the myocardial injuries induced by skeletal muscle IR in this experimental model.
Abstract:
Background: Sleep deprivation (SD) is strongly associated with elevated risk for cardiovascular disease. Objective: To determine the effect of SD on basal hemodynamic functions and tolerance to myocardial ischemia-reperfusion (IR) injury in male rats. Method: SD was induced by using the flowerpot method for 4 days. Isolated hearts were perfused with a Langendorff setup, and the following parameters were measured at baseline and after IR: left ventricular developed pressure (LVDP); heart rate (HR); and the maximum rate of increase and decrease of left ventricular pressure (±dp/dt). Heart NOx level, infarct size, coronary flow, CK-MB, and LDH were measured after IR. Systolic blood pressure (SBP) was measured at the start and end of the study. Results: In the SD group, the baseline levels of LVDP (19%), +dp/dt (18%), and -dp/dt (21%) were significantly (p < 0.05) lower, and HR (32%) was significantly higher, compared to the controls. After ischemia, hearts from the SD group displayed a significant increase in HR together with a low hemodynamic function recovery compared to the controls. In the SD group, heart NOx level, coronary flow, CK-MB, LDH, and infarct size significantly increased after IR; SD rats also had higher SBP after 4 days. Conclusion: Hearts from SD rats had lower basal cardiac function and less tolerance to IR injury, which may be linked to an increase in NO production following IR.
Abstract:
OBJECTIVE: To assess the impact of liver hypertrophy of the future liver remnant volume (FLR) induced by preoperative portal vein embolization (PVE) on the immediate postoperative complications after a standardized major liver resection. SUMMARY BACKGROUND DATA: PVE is usually indicated when the FLR is estimated to be too small for major liver resection. However, few data exist regarding the exact quantification of the minimal functional hepatic volume required to avoid postoperative complications in patients with or without chronic liver disease. METHODS: All consecutive patients in whom an elective right hepatectomy was feasible and who fulfilled the inclusion and exclusion criteria between 1998 and 2000 were assigned to have alternatively either immediate surgery or surgery after PVE. Among 55 patients (25 liver metastases, 2 cholangiocarcinoma, and 28 hepatocellular carcinoma), 28 underwent right hepatectomy after PVE and 27 underwent immediate surgery. Twenty-eight patients had chronic liver disease. FLR and estimated rate of functional future liver remnant (%FFLR) volumes were assessed by computed tomography. RESULTS: The mean increases of FLR and %FFLR 4 to 8 weeks after PVE were respectively 44 +/- 19% and 16 +/- 7% for patients with normal liver and 35 +/- 28% and 9 +/- 3% for those with chronic liver disease. All patients with normal liver and 86% with chronic liver disease experienced hypertrophy after PVE. The postoperative course of patients with normal liver who underwent PVE before right hepatectomy was similar to that of those with immediate surgery. In contrast, PVE in patients with chronic liver disease significantly decreased the incidence of postoperative complications as well as the intensive care unit stay and total hospital stay after right hepatectomy. CONCLUSIONS: Before elective right hepatectomy, the hypertrophy of the FLR induced by PVE had no beneficial effect on the postoperative course in patients with normal liver.
In contrast, in patients with chronic liver disease, the hypertrophy of the FLR induced by PVE decreased significantly the rate of postoperative complications.
Abstract:
OBJECTIVES: To evaluate the outcome after Hartmann's procedure (HP) versus primary anastomosis (PA) with diverting ileostomy for perforated left-sided diverticulitis. BACKGROUND: The surgical management of left-sided colonic perforation with purulent or fecal peritonitis remains controversial. PA with ileostomy seems to be superior to HP; however, results in the literature are affected by a significant selection bias. No randomized clinical trial has yet compared the 2 procedures. METHODS: Sixty-two patients with acute left-sided colonic perforation (Hinchey III and IV) from 4 centers were randomized to HP (n = 30) and to PA (with diverting ileostomy, n = 32), with a planned stoma reversal operation after 3 months in both groups. Data were analyzed on an intention-to-treat basis. The primary end point was the overall complication rate. The study was discontinued following an interim analysis that found significant differences in relevant secondary end points as well as a decreasing accrual rate (NCT01233713). RESULTS: Patient demographics were equally distributed in both groups (Hinchey III: 76% vs 75% and Hinchey IV: 24% vs 25%, for HP vs PA, respectively). The overall complication rate for both resection and stoma reversal operations was comparable (80% vs 84%, P = 0.813). Although the outcome after the initial colon resection did not show any significant differences (mortality 13% vs 9% and morbidity 67% vs 75% in HP vs PA), the stoma reversal rate after PA with diverting ileostomy was higher (90% vs 57%, P = 0.005), and serious complications (Grades IIIb-IV: 0% vs 20%, P = 0.046), operating time (73 minutes vs 183 minutes, P < 0.001), hospital stay (6 days vs 9 days, P = 0.016), and in-hospital costs (US $16,717 vs US $24,014) were significantly reduced in the PA group. CONCLUSIONS: This is the first randomized clinical trial favoring PA with diverting ileostomy over HP in patients with perforated diverticulitis.
Abstract:
Our study analyzed the mechanisms of L1 transfer in two groups of speakers, one with Romanian and the other with Tagalog as their mother tongue, in the process of learning Catalan as an L2. Specifically, we focused on the use of pronominal clitics, one of the most fascinating parts of Catalan grammar owing to its idiosyncrasies and formal diversity. The use of this type of pronoun is an aspect that is difficult to acquire for speakers of languages in which such pronouns do not exist or are used differently. In our study, we began by describing the fundamental characteristics of the pronominal clitics of Catalan, Romanian, and Tagalog, drawing on both specialized and more general literature and, in the case of Catalan, also on our own competence. After this more descriptive section, we used a corpus of interviews conducted in Catalan with two groups of speakers, the first comprising learners of Catalan whose L1 is Romanian and the second comprising learners whose L1 is Tagalog, to analyze the use of pronominal clitics quantitatively and qualitatively. Another group of interviews, this time with native speakers (Catalan L1), served as a control group. Our study supports the hypothesis that attributes special importance to L1 transfer in second language acquisition in general, and in the acquisition of the pronominal clitics of a Romance language in particular. The results show statistically significant differences between the two learner groups (analyzed with SPSS) that can be attributed to the characteristics of the L1.
Abstract:
The study tested three analytic tools applied in SLA research (T-unit, AS-unit, and Idea-unit) against FL learners' monologic oral data. The objective was to analyse their effectiveness for assessing the complexity of learners' academic production in English. The data were learners' individual productions gathered during the implementation of a CLIL teaching sequence on Natural Sciences in a Catalan state secondary school. The analysis showed that only the AS-unit was easily applicable and highly effective in segmenting the data and taking complexity measures.
Abstract:
BACKGROUND: A central question for understanding the evolutionary responses of plant species to rapidly changing environments is the assessment of their potential for short-term (in one or a few generations) genetic change. In our study, we consider the case of Pinus pinaster Aiton (maritime pine), a widespread Mediterranean tree, and (i) test, under different experimental conditions (growth chamber and semi-natural), whether higher recruitment in the wild from the most successful mothers is due to better performance of their offspring; and (ii) evaluate genetic change in quantitative traits across generations at two different life stages (mature trees and seedlings) that are known to be under strong selection pressure in forest trees. RESULTS: Genetic control was high for most traits (h2 = 0.137-0.876) under the milder conditions of the growth chamber, but only for ontogenetic change (0.276), total height (0.415) and survival (0.719) under the more stressful semi-natural conditions. Significant phenotypic selection gradients were found in mature trees for traits related to seed quality (germination rate and number of empty seeds). Moreover, female relative reproductive success was significantly correlated with offspring performance for specific leaf area (SLA) in the growth chamber experiment, and stem mass fraction (SMF) in the experiment under semi-natural conditions, two adaptive traits related to abiotic stress-response in pines. Selection gradients based on genetic covariance of seedling traits and responses to selection at this stage involved traits related to biomass allocation (SMF) and growth (as decomposed by a Gompertz model) or delayed ontogenetic change, depending also on the testing environment. CONCLUSIONS: Despite the evidence of microevolutionary change in adaptive traits in maritime pine, directional or disruptive changes are difficult to predict due to variable selection at different life stages and environments. 
At mature-tree stages, higher female effective reproductive success can be explained by differences in their production of offspring (due to seed quality) and, to a lesser extent, by seemingly better adapted seedlings. Selection gradients and responses to selection for seedlings also differed across experimental conditions. The distinct processes involved at the two life stages (mature trees or seedlings), together with environment-specific responses, advise caution when predicting likely evolutionary responses to environmental change in Mediterranean forest trees.
Abstract:
OBJECTIVES: To assess the impact of neoadjuvant chemoradiotherapy (NCRT) on anastomotic leakage (AL) and other postoperative outcomes after esophageal cancer (EC) resection. BACKGROUND: Conflicting data have emerged from randomized studies regarding the impact of NCRT on AL. METHODS: Among 2944 consecutive patients operated on for EC between 2000 and 2010 in 30 European centers, patients treated by NCRT followed by surgery (n = 593) were compared with those treated by primary surgery (n = 1487). Multivariable analyses and propensity score matching were used to compensate for the differences in some baseline characteristics. RESULTS: Patients in the NCRT group were younger, with a higher prevalence of male sex, malnutrition, advanced tumor stage, squamous cell carcinoma, and surgery after 2005 when compared with the primary surgery group. Postoperative AL rates were 8.8% versus 10.6% (P = 0.220), and 90-day postoperative mortality and morbidity rates were 9.3% versus 7.2% (P = 0.110) and 33.4% versus 32.1% (P = 0.564), respectively. Pulmonary complication rates did not differ between groups (24.6% vs 22.5%; P = 0.291), whereas chylothorax (2.5% vs 1.2%; P = 0.020), cardiovascular complications (8.6% vs 0.1%; P = 0.037), and thromboembolic events (8.6% vs 6.0%; P = 0.037) were higher in the NCRT group. After propensity score matching, AL rates were 8.8% versus 11.3% (P = 0.228), with more chylothorax (2.5% vs 0.7%; P = 0.030) and a trend toward more cardiovascular and thromboembolic events in the NCRT group (P = 0.069). Predictors of AL were high American Society of Anesthesiologists scores, supracarinal tumoral location, and cervical anastomosis, but not NCRT. CONCLUSIONS: Neoadjuvant chemoradiotherapy does not have an impact on the AL rate after EC resection (NCT 01927016).
Abstract:
The aim of this study is to describe interictal perfusion alterations on MRI using arterial spin labeling (ASL) and diffusion techniques in patients with focal epilepsy, and to analyze their possible lateralizing/localizing value for the epileptogenic focus. This is a cross-sectional study of 53 adult patients with focal epilepsy diagnosed by semiology, MRI, and EEG. All underwent 3-Tesla MRI with an epilepsy protocol that included ASL sequences. The images underwent visual analysis by a neuroradiologist, who classified them as hemispheric or focal perfusion alterations. The sample was 51% male, with a mean age of 42.9 years (±16.5). Sixty percent had symptomatic epilepsies, and 64% were drug-resistant. The most frequent etiologies were vascular (15%), malformations of cortical development (15%), and tumoral (13%). Forty-five percent were classified as temporal epilepsy, 32% frontal, 13% posterior temporal, 8% occipital, and 2% parietal. Forty-five percent presented complex partial seizures, among which automotor semiology was the most frequent (36%). ASL showed interhemispheric perfusion alterations in 74% of the patients; these had lateralizing value, especially when hyperperfusion was observed at the location of the focus and in cases of symptomatic epilepsy. Focal alterations were observed on ASL which, despite occurring in a low percentage of cases, could have localizing value for the epileptogenic area.
Abstract:
Health Act 2007 AN ACT TO ESTABLISH A BODY TO BE KNOWN AS AN tÚDARÁS UM FHAISNÉIS AGUS CÁILÍOCHT SLÁINTE OR, IN THE ENGLISH LANGUAGE, AS THE HEALTH INFORMATION AND QUALITY AUTHORITY AND OIFIG AN PHRÍOMH-CHIGIRE SEIRBHÍSÍ SÓISIALACHA OR, IN THE ENGLISH LANGUAGE, THE OFFICE OF THE CHIEF INSPECTOR OF SOCIAL SERVICES AND TO PROVIDE FOR THE DISSOLUTION OF CERTAIN BODIES; TO PROVIDE FOR THE TRANSFER OF THE FUNCTIONS OF THE DISSOLVED BODIES AND THEIR EMPLOYEES TO THE HEALTH INFORMATION AND QUALITY AUTHORITY;
Abstract:
OBJECTIVE: To compare surgical site infection (SSI) rates in open or laparoscopic appendectomy, cholecystectomy, and colon surgery. To investigate the effect of laparoscopy on SSI in these interventions. BACKGROUND: Lower rates of SSI have been reported among various advantages associated with laparoscopy when compared with open surgery, particularly in cholecystectomy. However, biases such as the lack of postdischarge follow-up and confounding factors might have contributed to the observed differences between the 2 techniques. METHODS: This observational study was based on prospectively collected data from an SSI surveillance program in 8 Swiss hospitals between March 1998 and December 2004, including a standardized postdischarge follow-up. SSI rates were compared between laparoscopic and open interventions. Factors associated with SSI were identified by using logistic regression models to adjust for potential confounding factors. RESULTS: SSI rates in laparoscopic and open interventions were respectively 59/1051 (5.6%) versus 117/1417 (8.3%) in appendectomy (P = 0.01), 46/2606 (1.7%) versus 35/444 (7.9%) in cholecystectomy (P < 0.0001), and 35/311 (11.3%) versus 400/1781 (22.5%) in colon surgery (P < 0.0001). After adjustment, laparoscopic interventions were associated with a decreased risk for SSI: OR = 0.61 (95% CI 0.43-0.87) in appendectomy, 0.27 (0.16-0.43) in cholecystectomy, and 0.43 (0.29-0.63) in colon surgery. The observed effect of laparoscopic techniques was due to a reduction in the rates of incisional infections, rather than in those of organ/space infections. CONCLUSION: When feasible, a laparoscopic approach should be preferred over open surgery to lower the risks of SSI.
Abstract:
The specificity of human antileishmanial IgG and IgE antibodies to glycosylated antigens of Leishmania chagasi was evaluated. An ELISA was performed with soluble leishmanial antigen (SLA) and a panel of 95 sera, including samples from patients with subclinical infection (SC) and visceral leishmaniasis (VL), subjects cured of visceral leishmaniasis (CVL), and healthy individuals from endemic areas (HIEA). Antileishmanial IgG was verified for 18 (40%) of 45 SC subjects (mean absorbance of 0.49 ± 0.17). All nine sera from VL patients had such antibody (0.99 ± 0.21), while 11 (65%) of 17 CVL individuals were seropositive (0.46 ± 0.05). Only three (12%) of 24 HIEA controls reacted in IgG-ELISA. Antileishmanial IgE was detected in 26 (58%) of 45 SC patients (0.35 ± 0.14) and in all VL patients (0.65 ± 0.29). These antibodies were also detected in 13 (76%) of 17 CVL subjects (0.42 ± 0.14), while all HIEA controls were seronegative. There was no correlation between antileishmanial IgG and IgE antibody absorbances. Mild periodate oxidation at acid pH of SLA carbohydrates drastically diminished its antigenicity in both IgG- and IgE-ELISA, affecting mainly the antigens of 125, 102, 94, and 63 kDa as demonstrated by western immunoblotting.
Abstract:
OBJECTIVE: The purpose of this study was to assess outcomes and indications in a large cohort of patients who underwent liver transplantation (LT) for liver metastases (LM) from neuroendocrine tumors (NET) over a 27-year period. BACKGROUND: LT for NET remains controversial due to the absence of clear selection criteria and the scarcity and heterogeneity of reported cases. METHODS: This retrospective multicentric study included 213 patients who underwent LT for NET performed in 35 centers in 11 European countries between 1982 and 2009. One hundred seven patients underwent transplantation before 2000 and 106 after 2000. Mean age at the time of LT was 46 years. Half of the patients presented hormone secretion and 55% had hepatomegaly. Before LT, 83% of patients had undergone surgical treatment of the primary tumor and/or LM and 76% had received chemotherapy. The median interval between diagnosis of LM and LT was 25 months (range, 1-149 months). In addition to LT, 24 patients underwent major resection procedures and 30 patients underwent minor resection procedures. RESULTS: Three-month postoperative mortality was 10%. At 5 years after LT, overall survival (OS) was 52% and disease-free survival was 30%. At 5 years from diagnosis of LM, OS was 73%. Multivariate analysis identified 3 predictors of poor outcome, that is, major resection in addition to LT, poor tumor differentiation, and hepatomegaly. Since 2000, 5-year OS has increased to 59% in relation with fewer patients presenting poor prognostic factors. Multivariate analysis of the 106 cases treated since 2000 identified the following predictors of poor outcome: hepatomegaly, age more than 45 years, and any amount of resection concurrent with LT. CONCLUSIONS: LT is an effective treatment of unresectable LM from NET. Patient selection based on the aforementioned predictors can achieve a 5-year OS between 60% and 80%. However, use of overly restrictive criteria may deny LT to some patients who could benefit.
Optimal timing for LT in patients with stable versus progressive disease remains unclear.