968 results for 4-component gaussian basis sets


Relevance: 30.00%

Abstract:

Bioanalytical data from a bioequivalence study were used to develop limited-sampling strategy (LSS) models for estimating the area under the plasma concentration versus time curve (AUC) and the peak plasma concentration (Cmax) of 4-methylaminoantipyrine (MAA), an active metabolite of dipyrone. Twelve healthy adult male volunteers received single 600 mg oral doses of dipyrone in two formulations at a 7-day interval in a randomized, crossover protocol. Plasma concentrations of MAA (N = 336), measured by HPLC, were used to develop LSS models. Linear regression analysis and a "jack-knife" validation procedure revealed that the AUC0-∞ and the Cmax of MAA can be accurately predicted (R²>0.95, bias <1.5%, precision between 3.1 and 8.3%) by LSS models based on two sampling times. Validation tests indicate that the most informative 2-point LSS models developed for one formulation provide good estimates (R²>0.85) of the AUC0-∞ or Cmax for the other formulation. LSS models based on three sampling points (1.5, 4 and 24 h), but using different coefficients for AUC0-∞ and Cmax, predicted the individual values of both parameters for the enrolled volunteers (R²>0.88, bias = -0.65 and -0.37%, precision = 4.3 and 7.4%) as well as for plasma concentration data sets generated by simulation (R²>0.88, bias = -1.9 and 8.5%, precision = 5.2 and 8.7%). Bioequivalence assessment of the dipyrone formulations based on the 90% confidence interval of log-transformed AUC0-∞ and Cmax provided similar results when either the best-estimated or the LSS-derived metrics were used.
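
The two-point models above are ordinary least-squares regressions of AUC on the concentrations measured at two fixed sampling times. A minimal sketch of fitting such a model, using invented concentrations and an invented generating rule rather than the study's data or coefficients:

```python
# Sketch of a two-point limited-sampling strategy (LSS) model:
# AUC ~ b0 + b1*C(t1) + b2*C(t2), fit by ordinary least squares.
# All numbers below are synthetic illustrations, not values from the study.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_two_point_lss(c1, c2, auc):
    """Least-squares coefficients (b0, b1, b2) for AUC ~ b0 + b1*c1 + b2*c2."""
    n = len(auc)
    S1, S2, Sy = sum(c1), sum(c2), sum(auc)
    S11 = sum(a * a for a in c1)
    S22 = sum(b * b for b in c2)
    S12 = sum(a * b for a, b in zip(c1, c2))
    S1y = sum(a * y for a, y in zip(c1, auc))
    S2y = sum(b * y for b, y in zip(c2, auc))
    # Normal equations X'X b = X'y for the design matrix [1, c1, c2]
    return solve3([[n, S1, S2], [S1, S11, S12], [S2, S12, S22]],
                  [Sy, S1y, S2y])

# Hypothetical concentrations at 1.5 h and 4 h, and AUC values generated
# exactly by a made-up rule AUC = 5 + 2*C(1.5h) + 3*C(4h):
c15 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
c4 = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]
auc = [5 + 2 * a + 3 * b for a, b in zip(c15, c4)]
b0, b1, b2 = fit_two_point_lss(c15, c4, auc)
```

In practice the coefficients would be fitted against AUCs computed from full concentration-time profiles and then validated (e.g., by jack-knife) before the reduced sampling schedule is adopted.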

Relevance: 30.00%

Abstract:

The phonological loop is a component of the working memory system specifically involved in the processing and manipulation of limited amounts of information of a sound-based phonological nature. Phonological memory can be assessed by the Children's Test of Nonword Repetition (CNRep) in English speakers but not in Portuguese speakers due to phonotactic differences between the two languages. The objectives of the present study were: 1) to develop the Brazilian Children's Test of Pseudoword Repetition (BCPR), a Portuguese version of the CNRep, and 2) to validate the BCPR by correlation with the Auditory Digit Span Test from the Stanford-Binet Intelligence Scale. The BCPR and Digit Span were assessed in 182 children aged 4-10 years, 84 from Minas Gerais State (42 from a rural region) and 98 from the city of São Paulo. There were subject age and word length effects, with repetition accuracy declining as a function of the number of syllables of the pseudowords. Correlations between BCPR and Digit Span forward (r = 0.50; P <= 0.01) and backward (r = 0.43; P <= 0.01) were found, and partial correlation indicated that higher BCPR scores were associated with higher Digit Span scores. BCPR appears to depend more on schooling, while Digit Span was more related to development. The results demonstrate that the BCPR is a reliable measure of phonological working memory, similar to the CNRep.
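
The validation above rests on Pearson correlations, plus a partial correlation that controls for a third variable. A minimal sketch of both computations (the scores below are invented, not BCPR data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def partial_r(xs, ys, zs):
    """Correlation of x and y with a third variable z (e.g. age) partialled out."""
    rxy, rxz, ryz = pearson_r(xs, ys), pearson_r(xs, zs), pearson_r(ys, zs)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Invented scores for five children (not study data):
bcpr = [10.0, 12.0, 15.0, 18.0, 20.0]   # pseudoword repetition score
span = [3.0, 4.0, 4.0, 5.0, 6.0]        # digit span
age = [4.0, 5.0, 7.0, 8.0, 10.0]        # years

r_raw = pearson_r(bcpr, span)
r_partial = partial_r(bcpr, span, age)  # BCPR vs span, controlling for age
```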

Relevance: 30.00%

Abstract:

The purpose of the present study was to determine the vulnerability of women in prison to HIV infection. The study was carried out from August to October 2000 in a São Paulo State Penitentiary, where 299 female prisoners were serving time. We interviewed and obtained a blood sample from 290 females who agreed to enter the study. Sera were tested for the presence of antibodies to HIV, hepatitis C virus (HCV) and syphilis and the odds ratio (OR) was calculated for variables related to HIV positivity on the basis of a questionnaire. The overall prevalence data were: 13.9% for HIV (37 of 267), 22.8% for syphilis (66 of 290), and 16.2% for HCV (47 of 290). Sexual partnership variables were significantly related to HIV infection. These included HIV-positive partners (OR = 7.36, P = 0.0001), casual partners (OR = 8.96, P = 0.009), injectable drug user partners (OR = 4.7, P = 0.0001), and history of sexually transmitted disease (OR = 2.07, P = 0.05). In addition, a relationship was detected between HIV infection and drug use (OR = 2.48, P = 0.04) and injectable drug use (OR = 4.2, P = 0.002). Even women with only one partner presented a significant OR for HIV infection (OR = 2.57, P = 0.009), reflecting their vulnerability due to their trust in their partner, who did not use a condom. Although the use of injectable substances is associated with HIV infection, our results point to sexual behavior as the most important component of HIV transmission in the female prisoner population.
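
Each odds ratio above summarizes a 2×2 exposure-by-infection table. A minimal sketch of the computation, with a Wald-style 95% confidence interval; the counts are invented for illustration and are not the study's:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Wald 95% CI for a 2x2 table:
         a = exposed & infected,   b = exposed & not infected,
         c = unexposed & infected, d = unexposed & not infected.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR); all four cells must be non-zero.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Invented counts for illustration only:
or_, lo, hi = odds_ratio(10, 20, 5, 40)
```

A confidence interval that excludes 1 corresponds to a statistically significant association at roughly P < 0.05.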

Relevance: 30.00%

Abstract:

Coronary artery disease (CAD) is a worldwide leading cause of death. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique which is invasive, time consuming, and costly. There are noninvasive approaches for the early detection of CAD. The basis for the noninvasive diagnosis of CAD has been laid in a sequential analysis of the risk factors, and the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially utilized in clinical practice due to the difficulty of properly classifying the patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model to select patients for MPS based on fuzzy sets theory. A group of 1053 patients was used to develop the model and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinions, and showed that the performance of the fuzzy model was equal or superior to that of the physicians. Therefore, we conclude that the fuzzy model could be a useful tool to assist the general practitioner in the selection of patients for MPS.
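
The abstract does not specify the fuzzy model itself, so the sketch below only illustrates the general fuzzy-sets idea: hypothetical triangular membership functions over a patient's pretest likelihood of CAD, with MPS indicated to the degree the patient belongs to the "intermediate" set. All set boundaries are invented:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Invented fuzzy sets over pretest probability of CAD (0..1); the tiny
# offsets keep the endpoints 0.0 and 1.0 inside their sets.
def low(p):
    return tri(p, -1e-9, 0.0, 0.35)

def intermediate(p):
    return tri(p, 0.15, 0.50, 0.85)

def high(p):
    return tri(p, 0.65, 1.0, 1.0 + 1e-9)

def mps_indication(p):
    """Degree (0..1) to which MPS is indicated for pretest likelihood p."""
    return intermediate(p)
```

A full Mamdani-style model would combine several such inputs through fuzzy rules and defuzzify the aggregate; this sketch shows only the membership step.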

Relevance: 30.00%

Abstract:

A new scientometric indicator, the h-index, has been recently proposed (Hirsch JE. Proc Natl Acad Sci 2005; 102: 16569-16572). The index avoids some shortcomings of the calculation of the total number of citations as a parameter to evaluate scientific performance. Although it has become known only recently, it has had widespread acceptance. A comparison of the average h-index of members of the Brazilian Academy of Sciences (BAS) and of the National Academy of Sciences of the USA (NAS-USA) was carried out for 10 different areas of science. Although, as expected, the comparison was unfavorable to the members of the BAS, the imbalance was distinct in different areas. Since these two academies represent, to a significant extent, the science of top quality produced in each country, the comparison allows the identification of the areas in Brazil that are closer to the international stakeholders of scientific excellence. The areas of Physics and Mathematics stand out in this context. The heterogeneity of the h-index in the different areas, estimated by the median dispersion of the index, is significantly higher in the BAS than in the NAS-USA. No elements have been collected in the present study to provide an explanation for this fact.
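
For reference, the h-index of Hirsch (2005) is simple to compute: a researcher has index h if h of their papers have at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h
```

For example, a citation record of [25, 8, 5, 3, 3] gives h = 3: three papers have at least 3 citations each, but there is no fourth paper with at least 4 citations.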

Relevance: 30.00%

Abstract:

Genes encoding lipoproteins LipL32, LipL41 and the outer-membrane protein OmpL1 of leptospira were recombined and cloned into a pVAX1 plasmid. BALB/c mice were immunized with LipL32 and recombined LipL32-41-OmpL1 using DNA-DNA, DNA-protein and protein-protein strategies, respectively. Prime immunization was on day 1; boost immunizations were on days 11 and 21. Sera were collected from each mouse on day 35 for antibody and cytokine detection and for the microscopic agglutination test, while spleen cells were collected for a splenocyte proliferation assay. All experimental groups (N = 10 mice per group) showed statistically significant increases in antigen-specific antibodies, in cytokines IL-4 and IL-10, as well as in the microscopic agglutination test and splenocyte proliferation compared with the pVAX1 control group. The groups receiving the recombined LipL32-41-OmpL1 vaccine developed anti-LipL41 and anti-OmpL1 antibodies and yielded better splenocyte proliferation values than the groups receiving LipL32. DNA prime and protein boost immune strategies stimulated more antibodies than a DNA-DNA immune strategy and yielded greater cytokine and splenocyte proliferation than a protein-protein immune strategy. It is clear from these results that recombination of the protective antigen genes lipL32, lipL41, and ompL1 and a DNA-protein immune strategy resulted in better immune responses against leptospira than single-component, LipL32, or single DNA or protein immunization.

Relevance: 30.00%

Abstract:

Oxygen therapy is essential for the treatment of some neonatal critical care conditions but its extrapulmonary effects have not been adequately investigated. We therefore studied the effects of various oxygen concentrations on intestinal epithelial cell function. In order to assess the effects of hyperoxia on the intestinal immunological barrier, we studied two physiological changes in neonatal rats exposed to hyperoxia: the change in intestinal IgA secretory component (SC, an important component of SIgA) and changes in intestinal epithelial cells. Immunohistochemistry and Western blot were used to detect changes in the intestinal tissue SC of neonatal rats. To detect intestinal epithelial cell growth, cells were counted, and 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and Giemsa staining were used to assess cell survival. Immunohistochemistry was used to determine SC expression. The expression of intestinal SC in neonatal rats under hyperoxic conditions was notably increased compared with rats inhaling room air (P < 0.01). In vitro, 40% O2 was beneficial for cell growth. However, 60% O2 and 90% O2 induced rapid cell death. Also, 40% O2 induced expression of SC by intestinal epithelial cells, whereas 60% O2 did not; however, 90% O2 limited the ability of intestinal epithelial cells to express SC. In vivo and in vitro, moderate hyperoxia brought about increases in intestinal SC. This would be expected to bring about an increase in intestinal SIgA. High levels of SC and SIgA would serve to benefit hyperoxia-exposed individuals by helping to maintain optimal conditions in the intestinal tract.

Relevance: 30.00%

Abstract:

Traditionally, metacognition has been theorised, methodologically studied and empirically tested mainly from the standpoint of individuals and their learning contexts. In this dissertation the emergence of metacognition is analysed more broadly. The aim of the dissertation was to explore socially shared metacognitive regulation (SSMR) as part of collaborative learning processes taking place in student dyads and small learning groups. The specific aims were to extend the concept of individual metacognition to SSMR, to develop methods to capture and analyse SSMR and to validate the usefulness of the concept of SSMR in two different learning contexts: in face-to-face student dyads solving mathematical word problems and in small groups taking part in inquiry-based science learning in an asynchronous computer-supported collaborative learning (CSCL) environment. This dissertation is comprised of four studies. In Study I, the main aim was to explore if and how metacognition emerges during problem solving in student dyads and then to develop a method for analysing the social level of awareness, monitoring, and regulatory processes emerging during the problem solving. Two dyads of 10-year-old students who were high-achieving, especially in mathematical word problem solving and reading comprehension, were involved in the study. An in-depth case analysis was conducted. Data consisted of over 16 videotaped and transcribed face-to-face sessions (30–45 minutes each). The dyads solved altogether 151 mathematical word problems of different difficulty levels in a game-format learning environment. The interaction flowchart was used in the analysis to uncover socially shared metacognition. Interviews (including stimulated recall interviews) were conducted in order to obtain further information about socially shared metacognition. The findings showed the emergence of metacognition in a collaborative learning context in a way that cannot solely be explained by individual conception. 
The concept of socially shared metacognitive regulation (SSMR) was proposed. The results highlighted the emergence of socially shared metacognition specifically in problems where dyads encountered challenges. Small verbal and nonverbal signals between students also triggered the emergence of socially shared metacognition. Additionally, one dyad implemented a system whereby they shared metacognitive regulation based on their strengths in learning. Overall, the findings suggested that in order to discover patterns of socially shared metacognition, it is important to investigate metacognition over time. However, it was concluded that more research on socially shared metacognition, from larger data sets, is needed. These findings formed the basis of the second study. In Study II, the specific aim was to investigate whether socially shared metacognition can be reliably identified from a large dataset of collaborative face-to-face mathematical word problem solving sessions by student dyads. We specifically examined different difficulty levels of tasks as well as the function and focus of socially shared metacognition. Furthermore, the presence of observable metacognitive experiences at the beginning of socially shared metacognition was explored. Four dyads participated in the study. Each dyad was comprised of high-achieving 10-year-old students, ranked in the top 11% of their fourth grade peers (n=393). Dyads were from the same data set as in Study I. The dyads worked face-to-face in a computer-supported, game-format learning environment. Problem-solving processes for 251 tasks at three difficulty levels, taking place during 56 lessons (30–45 minutes each), were video-taped and analysed. Baseline data for this study were 14,675 turns of transcribed verbal and nonverbal behaviours observed in the four study dyads. The micro-level analysis illustrated how participants moved between different channels of communication (individual and interpersonal). 
The unit of analysis was a set of turns, referred to as an ‘episode’. The results indicated that socially shared metacognition and its function and focus, as well as the appearance of metacognitive experiences, can be defined in a reliable way from a larger data set by independent coders. A comparison of the different difficulty levels of the problems suggested that in order to trigger socially shared metacognition in small groups, the problems should be more difficult, as opposed to moderately difficult or easy. Although socially shared metacognition was found in collaborative face-to-face problem solving among high-achieving student dyads, more research is needed in different contexts. This consideration created the basis of the research on socially shared metacognition in Studies III and IV. In Study III, the aim was to expand the research on SSMR from face-to-face mathematical problem solving in student dyads to inquiry-based science learning among small groups in an asynchronous computer-supported collaborative learning (CSCL) environment. The specific aims were to investigate SSMR’s evolvement and functions in a CSCL environment and to explore how SSMR emerges at different phases of the inquiry process. Finally, individual student participation in SSMR during the process was studied. An in-depth explanatory case study of one small group of four girls aged 12 years was carried out. The girls attended a class that has an entrance examination and conducts a language-enriched curriculum. The small group solved complex science problems in an asynchronous CSCL environment, participating in research-like processes of inquiry during 22 lessons (45 minutes each). Students’ network discussions were recorded as written notes (N=640), which were used as study data. A set of notes, referred to here as a ‘thread’, was used as the unit of analysis. The inter-coder agreement was regarded as substantial. 
The results indicated that SSMR emerges in a small group’s asynchronous CSCL inquiry process in the science domain. Hence, the results of Study III were in line with the previous Study I and Study II and revealed that metacognition cannot be reduced to the individual level alone. The findings also confirm that SSMR should be examined as a process, since SSMR can evolve during different phases and that different SSMR threads overlapped and intertwined. Although the classification of SSMR’s functions was applicable in the context of CSCL in a small group, the dominant function was different in the asynchronous CSCL inquiry in the small group in a science activity than in mathematical word problem solving among student dyads (Study II). Further, the use of different analytical methods provided complementary findings about students’ participation in SSMR. The findings suggest that it is not enough to code just a single written note or simply to examine who has the largest number of notes in the SSMR thread but also to examine the connections between the notes. As the findings of the present study are based on an in-depth analysis of a single small group, further cases were examined in Study IV, as well as looking at the SSMR’s focus, which was also studied in a face-to-face context. In Study IV, the general aim was to investigate the emergence of SSMR with a larger data set from an asynchronous CSCL inquiry process in small student groups carrying out science activities. The specific aims were to study the emergence of SSMR in the different phases of the process, students’ participation in SSMR, and the relation of SSMR’s focus to the quality of outcomes, which was not explored in previous studies. The participants were 12-year-old students from the same class as in Study III. Five small groups consisting of four students and one of five students (N=25) were involved in the study. 
The small groups solved ill-defined science problems in an asynchronous CSCL environment, participating in research-like processes of inquiry over a total period of 22 hours. Written notes (N=4088) detailed the network discussions of the small groups and these constituted the study data. With these notes, SSMR threads were explored. As in Study III, the thread was used as the unit of analysis. In total, 332 notes were classified as forming 41 SSMR threads. Inter-coder agreement was assessed by three coders in the different phases of the analysis and found to be reliable. Multiple methods of analysis were used. Results showed that SSMR emerged in all the asynchronous CSCL inquiry processes in the small groups. However, the findings did not reveal any significantly changing trend in the emergence of SSMR during the process. As a main trend, the number of notes included in SSMR threads differed significantly in different phases of the process and small groups differed from each other. Although student participation was seen as highly dispersed between the students, there were differences between students and small groups. Furthermore, the findings indicated that the amount of SSMR during the process or participation structure did not explain the differences in the quality of outcomes for the groups. Rather, when SSMRs were focused on understanding and procedural matters, it was associated with achieving high quality learning outcomes. In turn, when SSMRs were focused on incidental and procedural matters, it was associated with low level learning outcomes. Hence, the findings imply that the focus of any emerging SSMR is crucial to the quality of the learning outcomes. Moreover, the findings encourage the use of multiple research methods for studying SSMR. In total, the four studies convincingly indicate that a phenomenon of socially shared metacognitive regulation also exists. 
This means that it was possible to define the concept of SSMR theoretically, to investigate it methodologically and to validate it empirically in two different learning contexts across dyads and small groups. The in-depth micro-level case analyses in Studies I and III showed that it is possible to capture and analyse SSMR in detail during the collaborative process, while in Studies II and IV the analysis validated the emergence of SSMR in larger data sets. Hence, validation was tested both between two environments and within the same environments with further cases. As a part of this dissertation, SSMR’s detailed functions and foci were revealed. Moreover, the findings showed the important role of observable metacognitive experiences as the starting point of SSMRs. It was apparent that problems dealt with by the groups should be rather difficult if SSMR is to be made clearly visible. Further, individual students’ participation was found to differ between students and groups. The multiple research methods employed revealed supplementary findings regarding SSMR. Finally, when SSMR was focused on understanding and procedural matters, this was seen to lead to higher quality learning outcomes. Socially shared metacognitive regulation should therefore be taken into consideration in students’ collaborative learning at school, similarly to how an individual’s metacognition is taken into account in individual learning.
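
Inter-coder agreement of the kind reported in Studies III and IV is often quantified with Cohen's kappa; the abstract does not name the statistic used, so the sketch below is only an illustration of chance-corrected agreement between two coders:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Chance-corrected agreement between two coders' categorical labels."""
    n = len(coder1)
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # Expected agreement if both coders labelled at their marginal rates:
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented codings of eight discussion notes as SSMR ('S') / not-SSMR ('N'):
k = cohens_kappa(list("SSNNSSNS"), list("SSNNSNNS"))
```

A kappa of 1 means perfect agreement and 0 chance-level agreement; values in the 0.61–0.80 range are conventionally read as "substantial".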

Relevance: 30.00%

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting on them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of the system reconfiguration mechanisms is a challenging and error prone engineering task. In this thesis, we propose and validate a formal framework for development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. 
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
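
The quantitative side combines Event-B models with probabilistic model checking (PRISM) and discrete-event simulation (SimPy). As a hedged illustration of the latter technique only, here is a minimal event-queue simulation in plain Python (not SimPy) that estimates the availability of a single component under fixed, invented failure and repair times:

```python
import heapq

def simulate_availability(mttf, mttr, horizon):
    """Fraction of time a component is up, with deterministic alternating
    failures (after mttf hours up) and repairs (after mttr hours down).
    A plain-Python event-queue sketch of discrete-event simulation;
    all timing constants are invented, not values from the thesis."""
    events = [(mttf, "fail")]          # priority queue of (time, kind)
    state_up, up_time, last = True, 0.0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        t = min(t, horizon)            # clip events past the horizon
        if state_up:
            up_time += t - last        # accumulate uptime since last event
        last = t
        if t >= horizon:
            break
        if kind == "fail":
            state_up = False
            heapq.heappush(events, (t + mttr, "repair"))
        else:
            state_up = True
            heapq.heappush(events, (t + mttf, "fail"))
    return up_time / horizon
```

With a 9-hour time-to-failure and 1-hour repair over a 100-hour horizon, availability comes out at 0.9; a stochastic variant would draw the failure and repair times from distributions and average over runs, which is closer to how reconfiguration strategies would actually be compared.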

Relevance: 30.00%

Abstract:

Molecular oxygen (O2) is a key component in cellular respiration and aerobic life. Through the redox potential of O2, the amount of free energy available to organisms that utilize it is greatly increased. Yet, due to the nature of the O2 electron configuration, it is non-reactive to most organic molecules in the ground state. For O2 to react with most organic compounds it must be activated. By activating O2, oxygenases can catalyze reactions involving oxygen incorporation into organic compounds. The oxygen activation mechanisms employed by many oxygenases have been studied, and they often involve transition metals and selected organic compounds. Despite the diversity of mechanisms for O2 activation explored in this thesis, all of the monooxygenases studied in the experimental part activate O2 through a transient carbanion intermediate. One of these enzymes is the small cofactorless monooxygenase SnoaB. Cofactorless monooxygenases are unusual oxygenases that require neither transition metals nor cofactors to activate oxygen. Based on our biochemical characterization and the crystal structure of this enzyme, the mechanism most likely employed by SnoaB relies on a carbanion intermediate to activate oxygen, which is consistent with the proposed substrate-assisted mechanism for this family of enzymes. From the studies conducted on the two-component system AlnT and AlnH, the functions of both the NADH-dependent flavin reductase, AlnH, and the reduced-flavin-dependent monooxygenase, AlnT, were confirmed. The unusual regiochemistry proposed for AlnT was also confirmed on the basis of the structure of a reaction product. The mechanism of AlnT, as with other flavin-dependent monooxygenases, is likely to involve a caged radical pair consisting of a superoxide anion and a neutral flavin radical formed from an initial carbanion intermediate. 
In the studies concerning the engineering of the S-adenosyl-L-methionine (SAM) dependent 4-O-methylase DnrK and the homologous atypical 10-hydroxylase RdmB, our data suggest that an initial decarboxylation of the substrate is catalyzed by both of these enzymes, which results in the generation of a carbanion intermediate. This intermediate is not essential for the 4-O-methylation reaction, but it is important for the 10-hydroxylation reaction, since it enables substrate-assisted activation of molecular oxygen involving a single electron transfer to O2 from a carbanion intermediate. The only role for SAM in the hydroxylation reaction is likely to be stabilization of the carbanion through the positive charge of the cofactor. Based on the DnrK variant crystal structure and the characterizations of several DnrK variants, the insertion of a single amino acid in DnrK (S297) is sufficient for gaining a hydroxylation function, which is likely caused by carbanion stabilization through active site solvent restriction. Despite large differences in the three-dimensional structures of the oxygenases and the potential for multiple oxygen activation mechanisms, all the enzymes in my studies rely on carbanion intermediates to activate oxygen from either flavins or their substrates. This thesis provides interesting examples of divergent evolution and the prevalence of carbanion intermediates within polyketide biosynthesis. This mechanism appears to be recurrent in aromatic polyketide biosynthesis and may reflect the acidic nature of these compounds, propensity towards hydrogen bonding and their ability to delocalize π-electrons.

Relevance: 30.00%

Abstract:

Electrochromism, the phenomenon of reversible color change induced by a small electric charge, forms the basis for the operation of several devices including mirrors, displays and smart windows. Although the history of electrochromism dates back to the 19th century, its considerable scientific and technological impact came only in the last quarter of the 20th century. The commercial applications of electrochromics (ECs) are rather limited, apart from the top-selling EC anti-glare mirrors by Gentex Corporation and airplane windows by Boeing, which have been a huge commercial success and exposed the potential of EC materials for the future glass industry. It is evident from their patents that viologens (salts of 4,4ʹ-bipyridilium) were the major active EC component in most of these marketed devices, which motivates this thesis's focus on EC viologens. Among the family of electrochromes, viologens have long been utilized in electrochromic devices (ECDs) due to the intensely colored radical cation they form when a small cathodic potential is applied. Viologens can be synthesized as oligomers, in polymeric form, or as functional groups on conjugated polymers. In this thesis, polyviologens (PVs) were synthesized starting from cyanopyridinium (CNP) based monomer precursors. Reductive coupling of cross-connected cyano groups yields viologen and polyviologen under successive electropolymerization using, for example, the cyclic voltammetry (CV) technique. For further development, a polyviologen-graphene composite system was fabricated, focusing on the stability of the PV electrochrome without sacrificing its excellent EC properties. The high electrical conductivity and high surface area offered by graphene sheets, together with their non-covalent interactions and synergism with PV, significantly improved the electrochrome durability in the composite matrix. 
The work thereby continued in developing a CNP functionalized thiophene derivative and its copolymer for possible utilization of viologen in the copolymer blend. Furthermore, the viologen functionalized thiophene derivative was synthesized and electropolymerized in order to explore enhancement in the EC contrast and overall EC performance. The findings suggest that such electroactive viologen/polyviologen systems and their nanostructured composite films as well as viologen functionalized conjugated polymers, can be potentially applied as an active EC material in future ECDs aiming at durable device performances.

Relevance: 30.00%

Abstract:

In "Nietzsche, Genealogy, History," Foucault suggests that genealogy is a sort of "curative science." The genealogist must be a physiologist and a pathologist as well as an historian, for his task is to decipher the marks that power relations and historical events leave on the subjugated body; "he must be able to diagnose the illnesses of the body, its conditions of weakness and strength, its breakdowns and resistances, to be in a position to judge philosophical discourse." But this claim seems to be incongruent with another major task of genealogy. After all, genealogy is supposed to show us that the things we take to be absolute are in fact discontinuous and historically situated: "Nothing in man – not even his body – is sufficiently stable to serve as the basis for self-recognition or for understanding other men." If this is true, then the subjugated body can never be restored to a healthy state because it has no essential or original nature. There are no universal standards by which we can even distinguish between healthy and unhealthy bodies. So in what sense is genealogy to be a "curative science"? In my thesis, I try to elucidate the complex relationship between genealogy and the body. I argue that genealogy can be a curative science even while it "multiplies our body and sets it against itself." If we place a special emphasis on the role that transgression plays in Foucault's genealogical works, then the healthy body is precisely the body that resists universal standards and classifications. If genealogy is to be a curative science, then it must restore to the subjugated body an "identity" that transgresses its own limits and that constitutes itself, paradoxically, in the very effacement of identity. In the first chapter of my thesis, I examine the body's role as "surface of the inscription of events." 
Power relations inscribe on and around the body an identity or subjectivity that appears to be unified and universal, but which is in fact disparate and historically situated. The "subjected" body is the sick and pathologically weak body. In Chapters 2 and 3, I describe how it is possible for the unhealthy body to become healthy by resisting the subjectivity that has been inscribed upon it. Chapter 4 explains how Foucault's later works fit into this characterization of genealogy


Resumo:

Addition of L-glutamate caused alkalinization of the medium surrounding Asparagus sprengeri mesophyll cells. This suggests a H+/L-glutamate symport uptake system for L-glutamate. However, stoichiometries of H+/L-glutamate symport into Asparagus cells were much higher than those in other plant systems. Medium alkalinization may also result from a metabolic decarboxylation process. Since L-glutamate is decarboxylated to γ-aminobutyric acid (GABA) in this system, the origin of medium alkalinization was reconsidered. Suspensions of mechanically isolated and photosynthetically competent Asparagus sprengeri mesophyll cells were used to investigate the H+/L-glutamate symport system, GABA production, GABA transport, and the origin of L-glutamate-dependent medium alkalinization. The major results obtained are summarized as follows:

1. L-Glutamate and GABA were the second or third most abundant amino acids in these cells. Cellular concentrations of L-glutamate were 1.09 mM and 1.31 mM in the light and dark, respectively. Those of GABA were 1.23 mM and 1.17 mM in the light and dark, respectively.
2. Asparagine was the most abundant amino acid in xylem sap and comprised 54 to 68% of the amino acid pool on a molar basis. GABA was the second most abundant amino acid and represented 10 to 11% of the amino acid pool. L-Glutamate was a minor component.
3. A 10-minute incubation with 1 mM L-glutamate increased the production of GABA in the medium by 2,743% and 2,241% in the light and dark, respectively.
4. L-Glutamate entered the cells prior to decarboxylation.
5. There was no evidence for a H+/GABA symport process.
6. GABA was produced by loss of carbon-1 of L-glutamate.
7. The specific activity of newly synthesized labeled GABA suggests that it is not equilibrated with a storage pool of GABA.
8. The mechanism of GABA efflux appears to be a passive process.
9. The evidence indicates that the origin of L-glutamate-dependent medium alkalinization is a H+/L-glutamate symport, not an extracellular decarboxylation.

The possible role of GABA production in regulating cytoplasmic pH and L-glutamate levels during rapid electrogenic H+/L-glutamate symport is discussed.


Resumo:

The initial timing of face-specific effects in event-related potentials (ERPs) is a point of contention in face processing research. Although effects during the time of the N170 are robust in the literature, inconsistent effects during the time of the P100 challenge the interpretation of the N170 as being the initial face-specific ERP effect. Early P100 effects are often attributed to low-level differences between face stimuli and a host of other image categories. Research using sophisticated controls for low-level stimulus characteristics (Rousselet, Husk, Bennett, & Sekuler, 2008) reports robust face effects starting at around 130 ms following stimulus onset. The present study examines the independent components (ICs) of the P100 and N170 complex in the context of a minimally controlled low-level stimulus set and a clear P100 effect for faces versus houses at the scalp. Results indicate that four ICs account for the ERPs to faces and houses in the first 200 ms following stimulus onset. The IC that accounts for the majority of the scalp N170 (icN1a) begins dissociating stimulus conditions at approximately 130 ms, closely replicating the scalp results of Rousselet et al. (2008). The scalp effects at the time of the P100 are accounted for by two constituent ICs (icP1a and icP1b). The IC that projects the greatest voltage at the scalp during the P100 (icP1a) shows a face-minus-house effect over the period of the P100 that is less robust than the N170 effect of icN1a when measured as the average of single-subject differential activation robustness. The second constituent process of the P100 (icP1b), although projecting a smaller voltage to the scalp than icP1a, shows a more robust effect for the face-minus-house contrast starting prior to 100 ms following stimulus onset.
Further, the effect expressed by icP1b takes the form of a larger negative projection to medial occipital sites for houses over faces, partially canceling the larger projection of icP1a and thereby enhancing the face positivity at this time. These findings have three main implications for ERP research on face processing: First, the ICs that constitute the face-minus-house P100 effect are independent from the ICs that constitute the N170 effect. This suggests that the P100 effect and the N170 effect are anatomically independent. Second, the timing of the N170 effect can be recovered from scalp ERPs that have spatio-temporally overlapping effects possibly associated with low-level stimulus characteristics. This unmixing of the EEG signals may reduce the need for highly constrained stimulus sets, a characteristic that is not always desirable for a topic that is highly coupled to ecological validity. Third, by unmixing the constituent processes of the EEG signals, new analysis strategies are made available. In particular, the exploration of the relationship between cortical processes over the period of the P100 and N170 ERP complex (and beyond) may provide previously inaccessible answers to questions such as: Is the face effect a special relationship between low-level and high-level processes along the visual stream?
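The "unmixing" described in this abstract can be illustrated with a minimal, hypothetical sketch: independent component analysis (ICA) recovering two overlapping latent time courses from their mixture across electrodes. The synthetic P100-like and N170-like waveforms, the mixing matrix, and the three-channel montage below are illustrative assumptions, not data or methods from the study.

```python
# Sketch: separating temporally overlapping components from multichannel
# signals with ICA, analogous to recovering IC time courses (e.g. icP1a,
# icN1a) from scalp ERPs. All signal shapes here are invented for
# illustration.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.2, 200)  # a 0-200 ms epoch, 1 kHz sampling

# Two latent "cortical" sources with overlapping time courses
s1 = np.exp(-((t - 0.10) / 0.02) ** 2)   # P100-like positivity peaking ~100 ms
s2 = -np.exp(-((t - 0.17) / 0.02) ** 2)  # N170-like negativity peaking ~170 ms
S = np.column_stack([s1, s2]) + 0.02 * rng.standard_normal((200, 2))

# Each "electrode" records a different weighted sum of the sources,
# so the scalp waveforms spatio-temporally overlap
A = np.array([[1.0, 0.5],
              [0.4, 1.0],
              [0.8, -0.6]])
X = S @ A.T  # observed 3-channel mixture, shape (200, 3)

# ICA estimates the unmixing, returning component time courses
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)  # shape (200, 2)
print(S_est.shape)
```

Note that ICA recovers components only up to sign and scale, and in arbitrary order, which is why IC polarities and scalp projections must be interpreted jointly, as the abstract does for icP1a and icP1b.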


Resumo:

Four questions dominate normative contemporary constitutional theory: What is the purpose of a constitution? What makes a constitution legitimate? What kinds of arguments are legitimate within the process of constitutional interpretation? What can make judicial review of legislation legitimate in principle? The main purpose of this text is to provide one general answer to the last question. The secondary purpose is to show how this answer may bear upon our understanding of the fundamental basis of constitutional law. These two purposes should suggest particular answers to the first three questions.