949 results for event based


Relevance: 30.00%

Abstract:

On December 4th, 2007, a 3-Mm3 landslide occurred along the northwestern shore of Chehalis Lake. The initiation zone is located at the intersection of the main valley slope and the northern sidewall of a prominent gully. The slope failure caused a displacement wave that ran up to 38 m on the opposite shore of the lake. The landslide is temporally associated with a rain-on-snow meteorological event, which is thought to have triggered it. This paper describes the Chehalis Lake landslide and presents a comparison of discontinuity orientation datasets obtained using three techniques (field measurements, terrestrial photogrammetric 3D models and an airborne LiDAR digital elevation model) to describe the orientation and characteristics of the five discontinuity sets present. The discontinuity orientation data are used to perform kinematic, surface wedge limit equilibrium and three-dimensional distinct element analyses. The kinematic and surface wedge analyses suggest that the location of the slope failure (intersection of the valley slope and a gully wall) has facilitated the development of the unstable rock mass which initiated as a planar sliding failure. Results from the three-dimensional distinct element analyses suggest that the presence, orientation and high persistence of a discontinuity set dipping obliquely to the slope were critical to the development of the landslide and led to a failure mechanism dominated by planar sliding. The three-dimensional distinct element modelling also suggests that the presence of a steeply dipping discontinuity set striking perpendicular to the slope and associated with a fault exerted a significant control on the volume and extent of the failed rock mass but not on the overall stability of the slope.
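The kinematic and limit-equilibrium reasoning above can be illustrated with a minimal dry, cohesionless planar-sliding check: sliding is feasible only if the discontinuity daylights in the slope face, and the factor of safety reduces to tan(phi)/tan(psi). The angles below are invented for illustration and are not the paper's measured values.

```python
import math

def planar_sliding_factor_of_safety(slope_dip, plane_dip, friction_angle):
    """Dry, cohesionless limit-equilibrium check for planar sliding.

    The plane must daylight in the slope face (plane_dip < slope_dip);
    then FoS = tan(friction_angle) / tan(plane_dip). Angles in degrees.
    """
    if plane_dip >= slope_dip:
        return float("inf")  # plane does not daylight; planar sliding impossible
    return math.tan(math.radians(friction_angle)) / math.tan(math.radians(plane_dip))

# Hypothetical numbers: a 60 deg slope cut by a set dipping 45 deg, phi = 35 deg.
fos = planar_sliding_factor_of_safety(60.0, 45.0, 35.0)  # < 1, i.e. unstable
```

A real analysis would add cohesion, water pressure and the lateral release surfaces the paper attributes to the gully wall; this sketch only captures the daylighting condition and friction balance.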

Relevance: 30.00%

Abstract:

The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to treat those images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. The earlier steps of the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step build on those of the previous one. The methodology is not strictly linear, however, but a cyclic, iterative progression towards knowledge about an event. The preliminary analysis is a pre-evaluation phase in which the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising the images and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses.
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes.
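The preliminary analysis plus the four-step sequence can be sketched as a simple pipeline. Every name, threshold and data structure below is invented for illustration; the published methodology defines the steps conceptually, not as code.

```python
# Toy sketch of the cycle: preliminary relevance screening, then
# collection, organisation/assessment, reconstruction, and evaluation.
from dataclasses import dataclass

@dataclass
class Image:
    source: str
    quality: float          # 0..1, assessed in step 2 (hypothetical scale)
    relevant: bool = False  # set during the preliminary analysis

def preliminary_analysis(images):            # pre-evaluation of potential relevance
    for img in images:
        img.relevant = img.quality > 0.0
    return [img for img in images if img.relevant]

def detect_and_collect(images):              # step 1: images as pertinent trace material
    return list(images)

def organise_and_assess(images, min_q=0.5):  # step 2: quality / informative potential
    return [img for img in images if img.quality >= min_q]

def reconstruct(images):                     # step 3: clues about space, time, actions
    return {img.source: "clue" for img in images}

def evaluate_as_evidence(clues):             # step 4: selection as evidence
    return sorted(clues)

pool = [Image("cctv_cam1", 0.8), Image("phone_a", 0.3)]
evidence = evaluate_as_evidence(
    reconstruct(organise_and_assess(detect_and_collect(preliminary_analysis(pool)))))
```

In practice the cycle would be re-entered whenever a reconstruction hypothesis suggests new images to collect, which is the iterative character the abstract emphasises.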

Relevance: 30.00%

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is therefore a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been developed, we let the different agents interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy.
To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both the theoretical expectations and the empirical evidence.
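The neighbour-conditional adoption at the heart of the learning mechanism can be sketched deterministically: seed one adopter on a ring of agents and let adoption spread to any agent with an adopting neighbour. The topology, seeding and update rule are invented simplifications, not the thesis's actual model.

```python
# Minimal, deterministic sketch of neighbour-conditional policy adoption
# on a ring of agents; returns the cumulative adoption curve over time.
def diffuse(n_agents=50, seeds=(0,), steps=25):
    adopted = [False] * n_agents
    for s in seeds:
        adopted[s] = True
    curve = [sum(adopted)]
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n_agents):
            left, right = adopted[i - 1], adopted[(i + 1) % n_agents]
            if left or right:          # choice conditional on neighbours' choices
                nxt[i] = True
        adopted = nxt
        curve.append(sum(adopted))
    return curve

curve = diffuse()  # monotonically rising, then saturating once all have adopted
```

Even this toy version reproduces the qualitative point: adoption grows, then saturates as the population converges locally, which is where the S-curve shape of the full learning model comes from.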

Relevance: 30.00%

Abstract:

The Quantitative Methodologies in Policy and Practice for Child Health and Wellbeing Summer School is organised by the Children’s Research Network for Ireland and Northern Ireland in conjunction with the TCD School of Nursing & Midwifery. The Children’s Research Network for Ireland and Northern Ireland is a not-for-profit, membership-based organisation which supports the research community to better understand and improve the lives of children and young people by creating and maintaining an inclusive, independent network through which information, knowledge, experience, learning and skills can be shared. Membership of the network facilitates access to workshops, summer schools, and events focused on children’s research. The Summer School is funded by the Department of Children and Youth Affairs.

Relevance: 30.00%

Abstract:

PURPOSE: Studies of diffuse large B-cell lymphoma (DLBCL) are typically evaluated by using a time-to-event approach with relapse, re-treatment, and death commonly used as the events. We evaluated the timing and type of events in newly diagnosed DLBCL and compared patient outcome with reference population data. PATIENTS AND METHODS: Patients with newly diagnosed DLBCL treated with immunochemotherapy were prospectively enrolled onto the University of Iowa/Mayo Clinic Specialized Program of Research Excellence Molecular Epidemiology Resource (MER) and the North Central Cancer Treatment Group NCCTG-N0489 clinical trial from 2002 to 2009. Patient outcomes were evaluated at diagnosis and in the subsets of patients achieving event-free status at 12 months (EFS12) and 24 months (EFS24) from diagnosis. Overall survival was compared with age- and sex-matched population data. Results were replicated in an external validation cohort from the Groupe d'Etude des Lymphomes de l'Adulte (GELA) Lymphome Non Hodgkinien 2003 (LNH2003) program and a registry based in Lyon, France. RESULTS: In all, 767 patients with newly diagnosed DLBCL who had a median age of 63 years were enrolled onto the MER and NCCTG studies. At a median follow-up of 60 months (range, 8 to 116 months), 299 patients had an event and 210 patients had died. Patients achieving EFS24 had an overall survival equivalent to that of the age- and sex-matched general population (standardized mortality ratio [SMR], 1.18; P = .25). This result was confirmed in 820 patients from the GELA study and registry in Lyon (SMR, 1.09; P = .71). Simulation studies showed that EFS24 has comparable power to continuous EFS when evaluating clinical trials in DLBCL. CONCLUSION: Patients with DLBCL who achieve EFS24 have a subsequent overall survival equivalent to that of the age- and sex-matched general population. EFS24 will be useful in patient counseling and should be considered as an end point for future studies of newly diagnosed DLBCL.
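The standardized mortality ratio (SMR) reported above is observed deaths divided by the deaths expected from age- and sex-matched population rates. A minimal stratified computation, with invented person-years and rates rather than the study's data:

```python
# SMR = observed deaths / expected deaths, where expected deaths sum
# person-years in each stratum times the matched population mortality rate.
def smr(observed_deaths, person_years_by_stratum, population_rate_by_stratum):
    expected = sum(person_years_by_stratum[k] * population_rate_by_stratum[k]
                   for k in person_years_by_stratum)
    return observed_deaths / expected

# Two hypothetical age strata: follow-up person-years and matched annual
# mortality rates for the reference population.
py = {"60-69": 1500.0, "70-79": 700.0}
rates = {"60-69": 0.012, "70-79": 0.030}
ratio = smr(39, py, rates)  # ~1.0, i.e. mortality equal to the matched population
```

An SMR near 1 with a non-significant p-value is exactly the pattern the study reports for patients achieving EFS24 (SMR 1.18, P = .25).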

Relevance: 30.00%

Abstract:

During the Early Toarcian, major paleoenvironmental and paleoceanographical changes occurred, leading to an oceanic anoxic event (OAE) and to a perturbation of the carbon isotope cycle. Although the standard biochronology of the Lower Jurassic is essentially based upon ammonites, in recent years biostratigraphy based on calcareous nannofossils and dinoflagellate cysts has been increasingly used to date Jurassic rocks. However, the precise dating and correlation of the Early Toarcian OAE, and of the associated delta C-13 anomaly in different settings of the western Tethys, are still partly problematic, and it is still unclear whether these events are synchronous or not. In order to allow more accurate correlations of the organic-rich levels recorded in the Lower Toarcian OAE, this account proposes a new biozonation based on a quantitative biochronology approach, the Unitary Associations (UA), applied to calcareous nannofossils. This study represents the first attempt to apply the UA method to Jurassic nannofossils. The study incorporates eighteen sections distributed across the western Tethys and ranging from the Pliensbachian to the Aalenian, comprising 1220 samples and 72 calcareous nannofossil taxa. The BioGraph [Savary, J., Guex, J., 1999. Discrete biochronological scales and unitary associations: description of the Biograph Computer program. Memoires de Geologie de Lausanne 34, 282 pp] and UA-Graph (Copyright Hammer O., Guex and Savary, 2002) software packages provide a discrete biochronological framework based upon multi-taxa concurrent range zones in the different sections. The optimized dataset generates nine UAs using the co-occurrences of 56 taxa. These UAs are grouped into six Unitary Association Zones (UA-Z), which constitute a robust biostratigraphic synthesis of all the observed or deduced biostratigraphic relationships between the analysed taxa.
The UA zonation proposed here is compared to "classic" calcareous nannofossil biozonations, which are commonly used for the southern and the northern sides of Tethys. The biostratigraphic resolution of the UA-Zones varies from one nannofossil subzone, or part of it, to several subzones, and can be related to the pattern of calcareous nannoplankton originations and extinctions during the studied time interval. The Late Pliensbachian - Early Toarcian interval (corresponding to the UA-Z II) represents a major step in the Jurassic nannoplankton radiation. The recognized UA-Zones are also compared to the carbon isotopic negative excursion and TOC maximum in five sections of central Italy, Germany and England, with the aim of providing a more reliable correlation tool for the Early Toarcian OAE, and for the associated isotopic anomaly, between the southern and northern parts of the western Tethys. The results of this work show that the TOC maximum and delta C-13 negative excursion correspond to the upper part of the UA-Z II (i.e., UA 3) in the sections analysed. This suggests that the Early Toarcian OAE was a synchronous event within the western Tethys.
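At its core, the Unitary Association method groups taxa into maximal sets of mutually co-occurring taxa. A toy sketch of that core idea: build a co-occurrence graph from samples and enumerate its maximal cliques as candidate UAs. The taxa and samples are invented, and the real method adds superpositional ordering and contradiction handling that this sketch omits.

```python
# Candidate unitary associations as maximal cliques of the taxon
# co-occurrence graph (naive Bron-Kerbosch; fine for toy-sized data).
from itertools import combinations

def cooccurrence_graph(samples):
    edges = set()
    for taxa in samples:
        edges.update(frozenset(p) for p in combinations(sorted(taxa), 2))
    return edges

def maximal_cliques(nodes, edges):
    cliques = []
    def bk(r, p, x):
        if not p and not x:
            cliques.append(frozenset(r))
            return
        for v in sorted(p):
            nv = {u for u in nodes if frozenset({u, v}) in edges}
            bk(r | {v}, p & nv, x & nv)
            p = p - {v}
            x = x | {v}
    bk(set(), set(nodes), set())
    return cliques

# Hypothetical samples listing which taxa co-occur in each bed.
samples = [{"A", "B"}, {"B", "C"}, {"A", "B", "C"}, {"C", "D"}]
nodes = {t for s in samples for t in s}
uas = maximal_cliques(nodes, cooccurrence_graph(samples))  # {A,B,C} and {C,D}
```

BioGraph and UA-Graph then order such associations stratigraphically and merge them into the zones the abstract describes.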

Relevance: 30.00%

Abstract:

BACKGROUND The study of the attentional system remains a challenge for current neuroscience. The "Attention Network Test" (ANT) was designed to study three different attentional networks (alerting, orienting, and executive) simultaneously, based on the subtraction of different experimental conditions. However, some studies recommend caution with these calculations due to the interactions between the attentional networks. This is highly relevant because several interpretations of attentional impairment in diverse pathologies have arisen from these calculations. Event-related potentials (ERPs) and neural source analysis can be applied to disentangle the relationships between these attentional networks that are not specifically shown by behavioral measures. RESULTS This study shows that there is a basic level of alerting (tonic alerting) in the no cue (NC) condition, represented by a slow negative trend in the ERP trace prior to the onset of the target stimuli. A progressive increase in the CNV amplitude related to the amount of information provided by the cue conditions is also shown. Neural source analysis reveals specific modulations of the CNV related to a task-related expectancy present in the NC condition; a late modulation triggered by the central cue (CC) condition, probably representing a generic motor preparation; and an early and a late modulation for the spatial cue (SC) condition, suggesting specific motor and sensory preactivation. Finally, the first component in the information processing of the target stimuli modulated by the interaction between the orienting network and the executive system can be represented by N1. CONCLUSIONS The ANT is useful as a paradigm to study specific attentional mechanisms and their interactions. However, the calculation of network effects is based on subtractions of non-comparable experimental conditions, as evidenced by the present data, which can induce misinterpretations in the study of attentional capacity in human subjects.
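The condition subtractions the abstract urges caution about are the standard ANT network scores: mean reaction-time differences between cue and flanker conditions. The mean RTs below are invented for illustration.

```python
# Standard ANT network effects as mean-RT subtractions between conditions.
# These are exactly the calculations that interacting networks can distort.
def ant_scores(mean_rt):
    return {
        "alerting":  mean_rt["no_cue"] - mean_rt["double_cue"],
        "orienting": mean_rt["central_cue"] - mean_rt["spatial_cue"],
        "executive": mean_rt["incongruent"] - mean_rt["congruent"],
    }

# Hypothetical mean RTs in milliseconds.
rts = {"no_cue": 560.0, "double_cue": 520.0, "central_cue": 540.0,
       "spatial_cue": 495.0, "incongruent": 610.0, "congruent": 510.0}
scores = ant_scores(rts)
```

The study's point is that each subtraction assumes the two conditions differ only in the targeted network, an assumption the ERP and source-analysis data contradict.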

Relevance: 30.00%

Abstract:

There are various methods to collect adverse events (AEs) in clinical trials. How AEs are collected in vaccine trials is of special interest: solicited reporting can lead to over-reporting of events that have little or no biological relationship to the vaccine. We assessed the rate of AEs listed in the package insert for the virosomal hepatitis A vaccine Epaxal(®), comparing data collected by solicited or unsolicited self-reporting. In an open, multi-centre post-marketing study, 2675 healthy travellers received single doses of vaccine administered intramuscularly. AEs were recorded based on solicited and unsolicited questioning during a four-day period after vaccination. A total of 2541 questionnaires could be evaluated (95.0% return rate). Solicited self-reporting resulted in significantly higher (p<0.0001) rates of subjects with AEs than unsolicited reporting, both at baseline (18.9% solicited versus 2.1% unsolicited systemic AEs) and following immunization (29.6% versus 19.3% local AEs; 33.8% versus 18.2% systemic AEs). This could indicate that actual reporting rates of AEs with Epaxal(®) may be substantially lower than described in the package insert. The distribution of AEs differed significantly between the applied methods of collecting AEs. The most common AEs listed in the package insert were reported almost exclusively with solicited questioning. The reporting of local AEs was more likely than that of systemic AEs to be influenced by subjects' sex, age and study centre. Women reported higher rates of AEs than men. The results highlight the need to detail how vaccine tolerability was reported and assessed.
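Differences in AE rates between the solicited and unsolicited groups can be assessed with a two-proportion z-test. The counts below are illustrative back-calculations from the reported percentages, not the study's exact data, and the study may well have used a different test.

```python
import math

# Pooled two-proportion z-test, written from scratch to stay dependency-free.
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2)) # pooled standard error
    return (p1 - p2) / se

# Roughly 33.8% vs 18.2% systemic AEs in two hypothetical groups of 1270 each.
z = two_proportion_z(429, 1270, 231, 1270)  # far beyond the 1.96 threshold
```

A z-statistic near 9 corresponds to p well below 0.0001, consistent with the significance level the abstract reports.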

Relevance: 30.00%

Abstract:

Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University’s Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.

Relevance: 30.00%

Abstract:

Sexual selection theory posits that ornaments can signal the genetic quality of an individual. Eumelanin-based coloration is such an ornament and can signal the ability to cope with a physiological stress response, because the melanocortin system regulates eumelanogenesis as well as physiological stress responses. In the present article, we experimentally investigated whether the stronger stress sensitivity of light than of dark eumelanic individuals stems from differential regulation of stress hormones. Our study shows that darker eumelanic barn owl nestlings have a lower corticosterone release after a stressful event, an association that was transmitted from the mother (but not the father) to the offspring. Additionally, nestlings sired by darker eumelanic mothers more quickly reduced experimentally elevated corticosterone levels. This provides a solution as to how ornamented individuals can be more resistant to various sources of stress than drab conspecifics. Our study suggests that eumelanin-based coloration can be a sexually selected signal of resistance to stressful events.

Relevance: 30.00%

Abstract:

Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals that are related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al, 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be interpreted neurophysiologically, because a difference in scalp topography indicates a different configuration of brain generators. An above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al, submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al, 2004). Four subjects per experiment were analyzed, using approx. 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis, we estimated the statistical distribution of voltage topographies with a Mixture of Gaussians (MofGs), which reduces our original dataset to a small number of representative voltage topographies.
We then evaluated statistically the degree of presence of these template maps across trials, and whether and when this differed across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for the upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 for the upper quadrants and 0.85 for the lower quadrants, and was in all cases significantly higher than chance (unpaired t-test, p<0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. The posterior probabilities revealed significant differences, which started at 250 ms post-stimulus onset. The classification accuracy rates with single-trial test data were at chance level. We therefore considered sub-averages based on five single trials. We found that for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p<0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiological lines. As these maps were previously normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset for ten shuffles of the data.
Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level providing a novel tool to compare normal electrophysiological responses versus single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
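The ROC curve area used above to summarise classification performance can be computed without any plotting, via its rank (Mann-Whitney) formulation. The "posterior probability" scores below are invented toy values, not data from the study.

```python
# AUC as the probability that a randomly chosen positive-class score
# outranks a randomly chosen negative-class score (ties count half).
def roc_auc(pos_scores, neg_scores):
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

# Toy single-trial scores for two experimental conditions.
auc = roc_auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.5, 0.3, 0.2])  # 0.875
```

An AUC of 0.5 is chance; the study's per-subject values of 0.80 and 0.85 indicate reliable single-trial separability for the checkerboard contrasts.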

Relevance: 30.00%

Abstract:

In this work we present the results of experimental work on the development of lexical class-based lexica by automatic means. Our purpose is to assess the use of linguistic lexical-class information as a feature selection methodology for classifiers in quick lexical development. The results show that the approach can significantly reduce the human effort required in the development of language resources.

Relevance: 30.00%

Abstract:

Sitting between your past and your future doesn't mean you are in the present. - Dakota Skye

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending what is expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure, starting from either a regular or a random one.
The outcome is remarkable, as the resulting topologies find themselves sharing properties of both regular and random networks, and display similarities to the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and the tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time-evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to that of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
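A minimal Kauffman-style random Boolean network step shows the baseline the thesis modifies: each gene reads its inputs and applies its own Boolean function, all genes updating synchronously. The three-gene wiring and rules below are invented for illustration; the thesis replaces this synchronous scheme with a cascading one and activation-aware update functions.

```python
# One synchronous update of a Boolean network: state is a tuple of 0/1
# gene values; inputs[g] lists which genes feed gene g; functions[g] is
# gene g's Boolean update rule.
def rbn_step(state, inputs, functions):
    return tuple(functions[g](*(state[i] for i in inputs[g]))
                 for g in range(len(state)))

inputs = {0: (1, 2), 1: (0,), 2: (0, 1)}
functions = {
    0: lambda a, b: a and b,  # gene 0 needs both of its inputs active
    1: lambda a: 1 - a,       # gene 1 is repressed by gene 0
    2: lambda a, b: a or b,   # gene 2 is promoted by either input
}

trajectory = [(1, 0, 1)]
for _ in range(4):
    trajectory.append(rbn_step(trajectory[-1], inputs, functions))
# The state sequence eventually revisits a previous state: an attractor,
# the model's stand-in for a stable gene-expression pattern.
```

Because the state space is finite and the update deterministic, every trajectory must fall onto a cycle; studying those attractors under perturbation is what the fault-tolerance comparisons in the thesis are about.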

Relevance: 30.00%

Abstract:

Photosystem II (PSII) of oxygenic photosynthesis is susceptible to photoinhibition. Photoinhibition is defined as light-induced damage resulting in turnover of the D1 protein subunit of the reaction center of PSII. Both visible and ultraviolet (UV) light cause photoinhibition. Photoinhibition induced by UV light damages the oxygen evolving complex (OEC) via absorption of UV photons by the Mn ion(s) of the OEC. Under visible light, most of the earlier hypotheses assume that photoinhibition occurs when the rate of photon absorption by the PSII antenna exceeds the use of the absorbed energy in photosynthesis. However, photoinhibition occurs at all light intensities with the same efficiency per photon. The aim of my thesis work was to build a model of photoinhibition that fits the experimental features of photoinhibition. I studied the role of the electron transfer reactions of PSII in photoinhibition and found that changing the electron transfer rate had only a minor influence on photoinhibition if light intensity was kept constant. Furthermore, quenching of antenna excitations protected less efficiently than it would if antenna chlorophylls were the only photoreceptors of photoinhibition. To identify the photoreceptors of photoinhibition, I measured the action spectrum of photoinhibition. The action spectrum showed resemblance to the absorption spectra of Mn model compounds, suggesting that the Mn cluster of the OEC acts as a photoreceptor of photoinhibition under visible light, too. The role of Mn in photoinhibition was further supported by experiments showing that during photoinhibition the OEC is damaged before electron transfer activity at the acceptor side of PSII is lost. Mn enzymes were found to be photosensitive under visible and UV light, indicating that Mn-containing compounds, including the OEC, are capable of functioning as photosensitizers both in visible and UV light.
The experimental results above led to the Mn hypothesis of the mechanism of continuous-light-induced photoinhibition. According to the Mn hypothesis, excitation of Mn of OEC results in inhibition of electron donation from OEC to the oxidized primary donor P680+ both under UV and visible light. P680 is oxidized by photons absorbed by chlorophyll, and if not reduced by OEC, P680+ may cause harmful oxidation of other PSII components. Photoinhibition was also induced with intense laser pulses and it was found that the photoinhibitory efficiency increased in proportion to the square of pulse intensity suggesting that laser-pulse-induced photoinhibition is a two-photon reaction. I further developed the Mn hypothesis suggesting that the initial event in photoinhibition under both continuous and pulsed light is the same: Mn excitation that leads to the inhibition of electron donation from OEC to P680+. Under laser-pulse-illumination, another Mn-mediated inhibitory photoreaction occurs within the duration of the same pulse, whereas under continuous light, secondary damage is chlorophyll mediated. A mathematical model based on the Mn hypothesis was found to explain photoinhibition under continuous light, under flash illumination and under the combination of these two.
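The intensity dependence that distinguishes the two regimes can be written down directly: continuous-light damage scales linearly with intensity (a one-photon process), while laser-pulse damage scales with the square of pulse intensity (two photons within one pulse). The rate constants below are arbitrary illustrative values, not fitted parameters from the thesis.

```python
# One-photon (continuous light) vs. two-photon (laser pulse) damage scaling.
def damage_rate(intensity, k1=1.0e-3, k2=2.0e-7, pulsed=False):
    """Return a relative damage rate; k1 and k2 are arbitrary constants."""
    return k2 * intensity ** 2 if pulsed else k1 * intensity

# Doubling intensity doubles continuous-light damage but quadruples
# pulsed damage, the square-law signature reported in the thesis.
ratio_pulsed = damage_rate(200.0, pulsed=True) / damage_rate(100.0, pulsed=True)
ratio_cont = damage_rate(200.0) / damage_rate(100.0)
```

Fitting measured inhibition rates against these two scalings is one way to test whether a given illumination regime behaves as a one- or two-photon process.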

Relevance: 30.00%

Abstract:

Whether different brain networks are involved in generating unimanual responses to a simple visual stimulus presented in the ipsilateral versus contralateral hemifield remains a controversial issue. Visuo-motor routing was investigated with event-related functional magnetic resonance imaging (fMRI) using the Poffenberger reaction time task. A 2 hemifield x 2 response hand design generated the "crossed" and "uncrossed" conditions, describing the spatial relation between these factors. Both conditions, with responses executed by the left or right hand, showed a similar spatial pattern of activated areas, including striate and extrastriate areas bilaterally, SMA, and M1 contralateral to the responding hand. These results demonstrated that visual information is processed bilaterally in striate and extrastriate visual areas, even in the "uncrossed" condition. Additional analyses based on sorting data according to subjects' reaction times revealed differential crossed versus uncrossed activity only for the slowest trials, with response strength in infero-temporal cortices significantly correlating with crossed-uncrossed differences (CUD) in reaction times. Collectively, the data favor a parallel, distributed model of brain activation. The presence of interhemispheric interactions and its consequent bilateral activity is not determined by the crossed anatomic projections of the primary visual and motor pathways. Distinct visuo-motor networks need not be engaged to mediate behavioral responses for the crossed visual field/response hand condition. While anatomical connectivity heavily influences the spatial pattern of activated visuo-motor pathways, behavioral and functional parameters appear to also affect the strength and dynamics of responses within these pathways.
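The crossed-uncrossed difference (CUD) correlated with infero-temporal response strength above is simply the mean reaction-time gap between crossed and uncrossed hemifield/hand pairings. The RT values below are invented for illustration.

```python
# CUD = mean RT for crossed pairings minus mean RT for uncrossed pairings,
# classically read as an estimate of interhemispheric transfer time.
def cud(crossed_rts, uncrossed_rts):
    mean = lambda xs: sum(xs) / len(xs)
    return mean(crossed_rts) - mean(uncrossed_rts)

# Hypothetical per-trial reaction times in milliseconds.
difference = cud([252.0, 248.0, 250.0], [247.0, 249.0, 248.0])  # small positive CUD
```

CUDs of only a few milliseconds are typical for the Poffenberger paradigm, which is why the study could detect condition differences only in the slowest trials.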