882 results for "analysis to synthesis"
Abstract:
Falciparum malaria represents a serious and growing world public health problem owing to the parasite's acquired resistance to most available drugs. In some endemic areas, quinidine, a diastereoisomer of the antimalarial quinine, has been employed to replace the latter. To evaluate the use of quinidine as an alternative in the face of the increasing loss of quinine effectiveness in Brazilian P. falciparum strains, as observed in the Amazon region, we assayed quinidine, quinine and chloroquine using the in vitro microtechnique. All isolates proved highly resistant to chloroquine. Resistance to quinine was not noted, although high MIC (minimal inhibitory concentration) values were observed. These data corroborate the decreasing sensitivity to quinine of strains from Brazil. Quinidine showed IC50 values from 0.053 to 4.577 μmol/L of blood, while IC50 values from 0.053 to 8.132 μmol/L of blood were estimated for quinine. Moreover, clearance of parasitemia was observed at concentrations lower than those used for quinidine in antiarrhythmic therapy, confirming our previous data. The results were similar to those of an African isolate.
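The IC50 values above are the drug concentrations that halve parasite growth in vitro. A minimal way to estimate one from a dose-response series is linear interpolation on a log-concentration scale; this is a sketch with an invented dose-response curve, not the study's data or method:

```python
import numpy as np

def ic50_interpolate(conc, inhibition):
    """Estimate the IC50 (concentration giving 50% growth inhibition)
    by linear interpolation on a log10 concentration scale."""
    conc = np.asarray(conc, dtype=float)
    inhibition = np.asarray(inhibition, dtype=float)
    order = np.argsort(inhibition)          # np.interp needs increasing x
    return 10 ** np.interp(0.5, inhibition[order], np.log10(conc)[order])

# Hypothetical dose-response series with a true IC50 of 1.0 umol/L
conc = [0.01, 0.1, 0.5, 1.0, 2.0, 10.0, 100.0]
inhib = [c / (c + 1.0) for c in conc]       # simple one-site binding model
```

In practice a four-parameter logistic fit is more robust than interpolation, but the interpolated estimate is often used as a first pass on sparse dilution series.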
Abstract:
The purpose of this study was to identify parents and obtain segregating populations of cowpea (Vigna unguiculata (L.) Walp.) with potential tolerance to water deficit. A full diallel was performed with six cowpea genotypes, and two experiments were conducted in Teresina, PI, Brazil in 2011 to evaluate 30 F2 populations and their parents, one under water deficit and the other under full irrigation.
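The count of 30 F2 populations follows from the diallel arithmetic: six parents crossed in all ordered pairs give 6 × 5 = 30 combinations. A tiny sketch of that bookkeeping (the function name is ours, not the paper's):

```python
def diallel_cross_count(n_parents, reciprocals=True):
    """Number of crosses in a diallel mating design: n(n-1) when
    reciprocal crosses are kept, n(n-1)//2 when they are pooled."""
    full = n_parents * (n_parents - 1)
    return full if reciprocals else full // 2
```

With six parents this gives 30 crosses with reciprocals, matching the 30 F2 populations evaluated above, or 15 if reciprocals are pooled (a half diallel).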
Abstract:
Body composition analysis is relevant for characterizing the nutritional requirements and finishing phase of fish. The aim of this study was to investigate the relationships between ichthyometric (weight, total and standard length, density and yields), bromatological (fat, protein, ash and water content) and bioelectrical impedance analysis (BIA) (resistance, reactance, phase angle and composition indexes) variables in the hybrid tambatinga (Colossoma macropomum × Piaractus brachypomus). In a non-fertilized vivarium, 520 juveniles were housed and fed commercial rations. Then, 136 days after hatching (DAH), 15 fish with an average weight of 37.69 g and an average total length of 12.96 cm were randomly chosen, anesthetized (eugenol) and subjected to the first of fourteen fortnightly assessments (BIA and biometry). After euthanasia, the following parts were weighed: whole carcass with the head, fillet, and skin (WC); fillet with skin (FS); and the remainder of the carcass with the head (CH). Together, FS and CH were ground and homogenized for the bromatological analyses. Estimates of the body composition and yields of tambatinga, with models including ichthyometric and BIA variables, showed correlation coefficients ranging from 0.81 (for the FS yield) to 1.00 (for total ash). Similarly, models that included only BIA variables had correlation coefficients ranging from 0.81 (FS and CH yields) to 0.98 (for total ash). Therefore, in tambatinga, the BIA technique allows estimation of the yield of fillet with skin and of the body composition (water content, fat, ash, and protein). The best models combine ichthyometric and BIA variables.
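The prediction models in the abstract are regressions of a composition variable on ichthyometric and/or BIA predictors, judged by the correlation between observed and fitted values. A least-squares sketch of such a fit, on synthetic numbers rather than the tambatinga data:

```python
import numpy as np

def fit_composition_model(X, y):
    """Ordinary least squares of a body-composition variable on predictor
    columns (e.g. weight, length, resistance, reactance); returns the
    coefficients and the correlation between observed and fitted values."""
    A = np.column_stack([np.ones(len(X)), X])      # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = np.corrcoef(y, A @ coef)[0, 1]
    return coef, r

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))                  # synthetic predictors
y = 2.0 + X @ np.array([0.5, -1.0, 0.3])      # exact linear relationship
coef, r = fit_composition_model(X, y)
```

On real data the correlation would fall below 1, as in the 0.81 to 1.00 range reported above; model choice then amounts to comparing these coefficients across predictor sets.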
Abstract:
Executive functioning (EF), which is considered to govern complex cognition, and verbal memory (VM) are constructs assumed to be related. However, the magnitude of the association between EF and VM is not known, nor how sociodemographic and psychological factors may affect this relationship, including in normal aging. In this study, we assessed different EF and VM parameters via a battery of neurocognitive/psychological tests and performed a Canonical Correlation Analysis (CCA) to explore the connection between these constructs in a sample of middle-aged and older healthy individuals without cognitive impairment (N = 563, 50+ years of age). The analysis revealed a positive and moderate association between EF and VM independently of gender, age, education, global cognitive performance level, and mood. These results confirm that EF has a significant association with VM performance.
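Canonical Correlation Analysis finds linear combinations of the EF scores and of the VM scores that correlate maximally. One standard computation obtains the canonical correlations as singular values of the product of orthonormal bases of the two centred score matrices; this sketch uses random data, not the study's test battery:

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two multivariate score matrices
    (rows = subjects): singular values of Qx'Qy, where Qx and Qy are
    orthonormal bases of the column-centred X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))        # e.g. three EF measures
Y = X @ rng.normal(size=(3, 2))      # two VM measures, here a function of X
s = canonical_correlations(X, Y)
```

Because the toy Y is an exact linear function of X, both canonical correlations come out at 1; a "positive and moderate" association, as reported above, corresponds to a leading canonical correlation well between 0 and 1.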
Abstract:
Magdeburg, University, Faculty of Computer Science (Fakultät für Informatik), Dissertation, 2013
Abstract:
BACKGROUND: Excision and primary midline closure for pilonidal disease (PD) is a simple procedure; however, it is frequently complicated by infection and prolonged healing. The aim of this study was to analyze risk factors for surgical site infection (SSI) in this context. METHODS: All consecutive patients undergoing excision and primary closure for PD from January 2002 through October 2008 were retrospectively assessed. The end points were SSI, as defined by the Centers for Disease Control and Prevention, and time to healing. Univariable and multivariable risk factor analyses were performed. RESULTS: One hundred thirty-one patients were included [97 men (74%), median age = 24 (range 15-66) years]. SSI occurred in 41 (31%) patients. Median time to healing was 20 days (range 12-76) in patients without SSI and 62 days (range 20-176) in patients with SSI (P < 0.0001). In univariable and multivariable analyses, smoking [OR = 2.6 (95% CI 1.02, 6.8), P = 0.046] and lack of antibiotic prophylaxis [OR = 5.6 (95% CI 2.5, 14.3), P = 0.001] were significant predictors of SSI. Adjusted for SSI, age over 25 was a significant predictor of prolonged healing. CONCLUSION: This study suggests that the rate of SSI after excision and primary closure of PD is higher in smokers and could be reduced by antibiotic prophylaxis. SSI significantly prolongs healing time, particularly in patients over 25 years.
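The reported ORs come from regression models, but the univariable version of an odds ratio can be sketched directly from a 2×2 table with a Woolf-type confidence interval. The counts below are invented for illustration, not the study's:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a = exposed cases, b = exposed
    non-cases, c = unexposed cases, d = unexposed non-cases) with a
    Woolf-method 95% confidence interval on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi
```

An interval excluding 1, like the smoking OR of 2.6 (95% CI 1.02, 6.8) above, is what marks a predictor as significant at the 5% level.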
Abstract:
OBJECTIVES: Determining perioperative fluid accumulation is a challenge for the clinician. Bioelectrical impedance analysis (BIA) is a noninvasive method, based on the electrical properties of tissues, that can assess body fluid compartments. The study aimed to assess changes in these compartments in three types of surgery (thoracic, abdominal, and intracranial) requiring various regimens of fluid administration. DESIGN: Prospective descriptive trial. PATIENTS: A total of 26 patients scheduled for elective surgery were separated into three groups according to site of surgery: thoracic (n = 8), abdominal aortic (n = 8), and brain surgery (n = 10). SETTING: University teaching hospital. INTERVENTION: None. MEASUREMENTS: Whole-body and segmental (arm, trunk, and legs) BIA at multiple frequencies (0.5, 50, and 100 kHz) was used to assess perioperative fluid accumulation after surgery. Fluid balances were calculated from the charts. RESULTS: The patients were aged 62 +/- 4 years. Fluid balances were 4.8 +/- 1.0 L, 4.1 +/- 0.5 L, and 1.9 +/- 0.3 L, respectively, in the three groups. In trunk surgery patients, fluid accumulation was detected as a drop in impedance in the operated area at all frequencies. In the operated area, there was an expansion of both intra- and extracellular compartments. A reduction in high-frequency impedance in the legs was detected only after aortic surgery. Fluid accumulation and trunk impedance changes were strongly correlated. Neurosurgery induced only minor body fluid changes. CONCLUSIONS: Segmental BIA is able to detect and localize perioperative fluid accumulation. It may become a bedside tool to quantify and localize fluid accumulation.
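BIA localizes fluid because, under the usual cylinder model, a segment's fluid volume is proportional to length squared over impedance, so at fixed segment length a drop in impedance maps to a relative volume expansion. A sketch of that proportionality with illustrative numbers (not a validated clinical equation):

```python
def relative_volume_change(z_pre, z_post):
    """Under the cylinder model V = rho * L**2 / Z, with segment length L
    and resistivity rho fixed, the relative fluid-volume change from
    pre- to post-operative impedance is Z_pre / Z_post - 1."""
    return z_pre / z_post - 1.0
```

For example, an impedance falling from 500 to 400 ohms corresponds to roughly a 25% volume expansion under this model, which is the sense in which the impedance drops above track fluid accumulation.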
Abstract:
In an earlier investigation (Burger et al., 2000), five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied by applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used with the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made, and many case studies were published using new tools for exploratory analysis of these data. It therefore makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of earlier results and to answers to open questions. In this paper we follow the lines of a paper by R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors are plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep sea sediments
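The transformations referred to above open the D-part simplex into unconstrained real space. The paper uses the ilr transform; its simpler relative, the centred log-ratio (clr), is easy to sketch and shares the key idea of working with log-ratios rather than raw parts (clr coordinates always sum to zero):

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform of a D-part composition: the log of
    each part relative to the geometric mean of all parts."""
    x = np.asarray(composition, dtype=float)
    return np.log(x) - np.log(x).mean()

z = clr([0.2, 0.3, 0.5])   # a hypothetical 3-part composition
```

The ilr transform differs in mapping to D-1 orthonormal coordinates, which avoids the singular covariance that clr produces; both are standard tools of compositional data analysis.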
Abstract:
In this research, we analyse the contact-specific mean of the final cooperation probability, distinguishing on the one hand between contacts with household reference persons and contacts with other eligible household members, and on the other hand between first and later contacts. Data come from two Swiss Household Panel surveys. The interviewer-specific variance is higher for first contacts, especially in the case of the reference person. For later contacts with the reference person, the contact-specific variance dominates, which means that interaction effects and situational factors are decisive. The contact number has negative effects on the performance of contacts with the reference person and positive effects in the case of other persons. Time elapsed since the previous contact also has negative effects in the case of reference persons. The result of the previous contact has strong effects, especially in the case of the reference person. These findings call for a quick completion of the household grid questionnaire and for assigning the best interviewers to the first contact. While obtaining refusals has negative effects, obtaining other contact results has only weak effects on the interviewer's subsequent contact outcomes. Using the same interviewer across contacts has no positive effects.
Abstract:
BACKGROUND AND PURPOSE: A right-to-left shunt can be identified by contrast transcranial Doppler ultrasonography (c-TCD) at rest and/or after a Valsalva maneuver (VM), or by arterial blood gas (ABG) measurement. We assessed the influence of controlled strain pressures and durations during VM on the right-to-left passage of microbubbles, on which the c-TCD shunt classification depends, and correlated it with the right-to-left shunt evaluation by ABG measurements in stroke patients with patent foramen ovale (PFO). METHODS: We evaluated 40 stroke patients with transesophageal echocardiography-documented PFO. The microbubbles were recorded with TCD at rest and after 4 different VM conditions with controlled duration and target strain pressure (duration in seconds and pressure in cm H2O, respectively): V5-20, V10-20, V5-40, and V10-40. The ABG analysis was performed after pure oxygen breathing in 34 patients, and the shunt was calculated as a percentage of cardiac output. RESULTS: Among all VM conditions, V5-40 and V10-40 yielded the greatest median numbers of microbubbles (84 and 95, respectively; P<0.01). A significantly larger number of microbubbles was detected in V5-40 than in V5-20 (P<0.001) and in V10-40 than in V10-20 (P<0.01). ABG was not sensitive enough to detect a shunt in 31 patients. CONCLUSIONS: Increasing VM expiratory pressure magnifies the number of microbubbles irrespective of strain duration. Because the right-to-left shunt classification in PFO is based on the number of microbubbles, a controlled VM pressure is advised for a reproducible shunt assessment. The ABG measurement is not sensitive enough for shunt assessment in stroke patients with PFO.
Abstract:
The growing multilingual trend in movie production comes with a challenge for dubbing translators, since they are increasingly confronted with more than one source language. The main purpose of this master's thesis is to provide a case study on how these third languages (see CORRIUS and ZABALBEASCOA 2011) are rendered. Another aim is to put a particular focus on their textual and narrative functions and to detect possible shifts that might occur in translations. By applying a theoretical model for translation analysis (CORRIUS and ZABALBEASCOA 2011), this study describes how third languages are rendered in the German, Spanish, and Italian dubbed versions of the 2009 Tarantino movie Inglourious Basterds. A broad range of solution types is thereby revealed, and prevalent restrictions of the translation process are identified. The target texts are placed in the context of some sociohistorical aspects of dubbing in order to detect prevalent norms of the respective cultures and to discuss the acceptability of the translations (TOURY 1995). The translatability potential of even highly complex multilingual audiovisual texts is demonstrated in this study. Moreover, proposals for further studies in multilingual audiovisual translation are outlined and the potential for future investigations in this field thereby emphasised.
Abstract:
General Introduction. This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA, it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access in a statistically significant and quantitatively large proportion. Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs. The methodology for this exercise is as follows: in step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be a potential model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "Single List" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs.
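Step 3 above is a plain trade-weighted mean over tariff lines. A sketch with invented line-level values (the HS codes and numbers are illustrative, not the thesis data):

```python
def trade_weighted_average(values, weights):
    """Trade-weighted average of line-level values (e.g. simulated MFCs):
    each tariff line's value is weighted by its share of total trade."""
    total = sum(weights.values())
    return sum(values[k] * weights[k] for k in values) / total

mfc = {"HS6201": 0.20, "HS8703": 0.30}    # hypothetical simulated MFCs
trade = {"HS6201": 3.0, "HS8703": 1.0}    # hypothetical import values
```

With these toy numbers the aggregate MFC is (0.20·3 + 0.30·1) / 4 = 0.225, i.e. the heavily traded line dominates, which is exactly why a trade-weighted rather than simple average is used to summarize the system.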
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures must be reviewed no later than five years after their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases.
Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
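The survival analysis of AD-measure lifetimes rests on estimators such as Kaplan-Meier; a minimal implementation over (duration, event) pairs, where event = 1 marks a revoked measure and 0 a measure still in force at observation (censored). All numbers are invented:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: at each distinct event time t, S(t)
    is multiplied by (1 - events / at-risk). events[i] = 1 means the AD
    measure was revoked at times[i]; 0 means it was censored then."""
    data = sorted(zip(times, events))
    at_risk, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t, d, removed = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]      # revocations at time t
            removed += 1         # everyone leaving the risk set at t
            i += 1
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= removed
    return curve

# Hypothetical lifetimes (years) of six AD measures, one censored at 7
curve = kaplan_meier([5, 5, 6, 7, 8, 9], [1, 1, 1, 0, 1, 1])
```

A post-agreement "downward shift in the survival function", as described above, would appear as a curve that drops to low survival probabilities at earlier durations; the Cox regression then formalizes that comparison with covariates.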
Abstract:
The generalization of simple (two-variable) correspondence analysis to more than two categorical variables, commonly referred to as multiple correspondence analysis, is neither obvious nor well-defined. We present two alternative ways of generalizing correspondence analysis, one based on the quantification of the variables and intercorrelation relationships, and the other based on the geometric ideas of simple correspondence analysis. We propose a version of multiple correspondence analysis, with adjusted principal inertias, as the method of choice for the geometric definition, since it contains simple correspondence analysis as an exact special case, which is not the case for the standard generalizations. We also clarify the issue of supplementary point representation and the properties of joint correspondence analysis, a method that visualizes all two-way relationships between the variables. The methodology is illustrated using data on attitudes to science from the International Social Survey Program on the Environment in 1993.
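In simple correspondence analysis, the principal inertias that the adjusted MCA variant rescales are the squared singular values of the matrix of standardized residuals of the contingency table. A sketch on a toy table (not the ISSP data):

```python
import numpy as np

def principal_inertias(N):
    """Simple correspondence analysis of a two-way contingency table N:
    SVD of the standardized residuals (P - rc') / sqrt(rc'), whose
    squared singular values are the principal inertias."""
    P = np.asarray(N, dtype=float)
    P /= P.sum()                                # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)         # row and column masses
    E = np.outer(r, c)                          # expected under independence
    S = (P - E) / np.sqrt(E)
    return np.linalg.svd(S, compute_uv=False) ** 2

# For a table exhibiting exact independence (rank one), all inertias are ~0
indep = np.outer([10, 20, 30], [1, 2, 3])
```

The total of these inertias equals the table's chi-squared statistic divided by its grand total, which is the sense in which CA decomposes association between the two variables.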