959 results for Time Domain Reflectometry
Abstract:
Abstract concepts like numbers or time are thought to be represented in the more concrete domain of space and the sensorimotor system. For example, thinking of past or future events has a physical manifestation in backward or forward body sway, respectively. In the present study, we investigated the reverse effect: can passive whole-body motion influence the processing of temporal information? Participants were asked to categorize verbal stimuli as referring to the future or the past while they were displaced forward and backward (Experiment 1), or upward and downward (Experiment 2). The results showed that future-related verbal stimuli were categorized faster during forward than during backward motion. This finding supports the view that temporal events are represented along a mental time line and that the sensorimotor system is linked to that representation. We showed that body motion is not just an epiphenomenon of temporal thoughts: passive whole-body motion can influence higher-order temporal cognition.
Phosphorylation of the proline-rich domain of Xp95 modulates Xp95 interaction with partner proteins.
Abstract:
The mammalian adaptor protein Alix [ALG-2 (apoptosis-linked-gene-2 product)-interacting protein X] belongs to a conserved family of proteins that have in common an N-terminal Bro1 domain and a C-terminal PRD (proline-rich domain), both of which mediate partner protein interactions. Following our previous finding that Xp95, the Xenopus orthologue of Alix, undergoes a phosphorylation-dependent gel mobility shift during progesterone-induced oocyte meiotic maturation, we explored the potential regulation of Xp95/Alix by protein phosphorylation during hormone-induced cell cycle re-entry or M-phase induction. By MALDI-TOF (matrix-assisted laser-desorption ionization-time-of-flight) MS analyses and gel mobility-shift assays, Xp95 is phosphorylated at multiple sites within the N-terminal half of the PRD during Xenopus oocyte maturation, and a similar region in Alix is phosphorylated in mitotically arrested but not serum-stimulated mammalian cells. By tandem MS, Thr745 within this region, which lies within a conserved binding site for the adaptor protein SETA [SH3 (Src homology 3) domain-containing, expressed in tumorigenic astrocytes]/CIN85 (Cbl-interacting protein of 85 kDa)/SH3KBP1 (SH3-domain kinase-binding protein 1), is one of the phosphorylation sites in Xp95. Results from GST (glutathione S-transferase) pull-down and peptide binding/competition assays further demonstrate that Thr745 phosphorylation inhibits Xp95 interaction with the second SH3 domain of SETA. However, immunoprecipitates of Xp95 from extracts of M-phase-arrested mature oocytes contained additional partner proteins compared with immunoprecipitates from extracts of G2-arrested immature oocytes. The deubiquitinase AMSH (associated molecule with the SH3 domain of signal transducing adaptor molecule) specifically interacts with phosphorylated Xp95 in M-phase cell lysates. These findings establish that Xp95/Alix is phosphorylated within the PRD during M-phase induction, and indicate that this phosphorylation may both positively and negatively modulate its interactions with partner proteins.
Abstract:
Answering run-time questions in object-oriented systems involves reasoning about and exploring connections between multiple objects. Developer questions exercise various aspects of an object and require multiple kinds of interactions depending on the relationships between objects, the application domain and the differing developer needs. Nevertheless, traditional object inspectors, the essential tools often used to reason about objects, favor a generic view that focuses on the low-level details of the state of individual objects. This leads to an inefficient exploration effort and increases the time spent in the inspector. To improve the inspection process, we propose the Moldable Inspector, a novel approach for an extensible object inspector. The Moldable Inspector allows developers to look at objects using multiple interchangeable presentations and supports a workflow in which multiple levels of connected objects can be seen together. Both aspects can be tailored to the domain of the objects and to the question at hand. We further exemplify how the proposed solution improves the inspection process, introduce a prototype implementation and discuss new directions for extending the Moldable Inspector.
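To make the idea concrete, here is a minimal sketch, in Python rather than the Pharo implementation behind the paper, of an extensible inspector in which domain-specific presentations are registered per kind of object and matched at inspection time; the class and field names are hypothetical.

```python
# Minimal sketch of an extensible inspector: presentations are registered and
# matched against the inspected object instead of a single generic state dump.
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class Presentation:
    title: str
    applies_to: Callable[[Any], bool]   # predicate: does this view fit the object?
    render: Callable[[Any], str]        # produce a domain-specific summary

class MoldableInspector:
    def __init__(self):
        self._presentations: List[Presentation] = []

    def register(self, presentation: Presentation) -> None:
        self._presentations.append(presentation)

    def inspect(self, obj: Any) -> List[str]:
        # Offer every presentation that matches; a raw fallback view is always there.
        views = [p for p in self._presentations if p.applies_to(obj)]
        rendered = [f"{p.title}: {p.render(obj)}" for p in views]
        rendered.append(f"Raw: {obj!r}")
        return rendered

# Usage: a dictionary gets a key-oriented view in addition to the raw one.
inspector = MoldableInspector()
inspector.register(Presentation(
    title="Keys",
    applies_to=lambda o: isinstance(o, dict),
    render=lambda o: ", ".join(map(str, o.keys())),
))
print(inspector.inspect({"host": "localhost", "port": 8080}))
```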
Abstract:
Debuggers are crucial tools for developing object-oriented software systems, as they give developers direct access to the running system. Nevertheless, traditional debuggers rely on generic mechanisms to explore and present the execution stack and system state, while developers reason about and formulate domain-specific questions using concepts and abstractions from their application domains. This creates an abstraction gap between the debugging needs and the debugging support, leading to an inefficient and error-prone debugging effort. To reduce this gap, we propose a framework for developing domain-specific debuggers called the Moldable Debugger. The Moldable Debugger is adapted to a domain by creating and combining domain-specific debugging operations with domain-specific debugging views, and adapts itself to a domain by selecting, at run time, appropriate debugging operations and views. We motivate the need for domain-specific debugging, identify a set of key requirements and show how our approach improves debugging by adapting the debugger to several domains.
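A hedged sketch of the run-time selection idea, again in Python and with hypothetical names rather than the paper's actual framework: each debugging extension declares an activation predicate over the current execution context together with its domain-specific operations, and only matching extensions are offered.

```python
# Sketch of run-time selection of domain-specific debugging extensions.
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

@dataclass
class DebuggerExtension:
    name: str
    activates_on: Callable[[Dict[str, Any]], bool]           # predicate over the execution context
    operations: Dict[str, Callable[[Dict[str, Any]], str]]   # domain-specific debugging operations

extensions: List[DebuggerExtension] = [
    DebuggerExtension(
        name="HTTP request debugger",
        activates_on=lambda ctx: ctx.get("frame", "").startswith("http."),
        operations={"step to response": lambda ctx: f"run until {ctx['request']} gets a response"},
    ),
]

def select_extensions(context: Dict[str, Any]) -> List[DebuggerExtension]:
    """Only extensions whose activation predicate matches the context apply."""
    return [e for e in extensions if e.activates_on(context)]

ctx = {"frame": "http.Client.send", "request": "GET /users"}
for ext in select_extensions(ctx):
    for label, op in ext.operations.items():
        print(f"[{ext.name}] {label}: {op(ctx)}")
```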
Abstract:
The marine laboratories in Plymouth have sampled at two principal sites in the Western English Channel for over a century, in open-shelf (station E1; 50° 02'N, 4° 22'W) and coastal (station L4; 50° 15'N, 4° 13'W) waters. These stations are seasonally stratified from late April until September, and the variable biological response is regulated by subtle variations in temperature, light, nutrients and meteorology. Station L4 is characterized by summer nutrient depletion, although intense summer precipitation, increasing riverine input to the system, results in pulses of increased nitrate concentration and surface freshening. The winter nutrient concentrations at E1 are consistent with an open-shelf site. Both stations have a spring and an autumn phytoplankton bloom; at station E1, the autumn bloom tends to dominate in terms of chlorophyll concentration. The last two decades have seen a warming of around 0.6°C per decade, superimposed on several periods of warming and cooling over the past century. In general, over the Western English Channel domain, the end of the 20th century was around 0.5°C warmer than the first half of the century. The warming magnitude and trend are consistent with other stations across the north-west European Shelf and occurred during a period of reduced wind stress and increased insolation (+20%); both are correlated with the larger-scale climatic forcing of the North Atlantic Oscillation.
Abstract:
The evolution of water content in a sandy soil during the sprinkler irrigation campaign of a sugar beet field located at Valladolid (Spain) in the summer of 2010 was assessed with a capacitive FDR (Frequency Domain Reflectometry) EnviroScan probe. This field is one of the experimental sites of the Spanish research center for sugar beet development (AIMCRA). The work focuses on monitoring the soil water content evolution over consecutive irrigations during the second half of July (from the 12th to the 28th). These measurements were then used to simulate water movement with Hydrus-2D. The probe logged water content readings (m3/m3) at 10, 20, 40 and 60 cm depth every 30 minutes and was placed between two rows within the typical 12 x 15 m sprinkler spacing. Furthermore, a texture analysis of the soil profile was also conducted. The irrigation frequency on this farm was set by the farmer's own criteria: aiming to minimize electricity pumping costs, he used to irrigate at night and during the weekend, i.e. less frequently than expected. However, the high evapotranspiration rates and the weekly sugar beet water consumption, up to 50 mm/week, clearly indicated the need for more frequent irrigation. Moreover, the farmer used to irrigate for five or six hours, whereas the EnviroScan readings showed the soil profile reaching saturation after the first three hours. It must be noted that AIMCRA provides its members with an SMS service on the weekly sugar beet water requirement; using data from several meteorological stations and evapotranspiration pans, farmers have an estimate of the weekly irrigation needs. Nevertheless, how to irrigate remains the farmer's decision. Thus, in order to minimize water stress and pumping costs, a suitable irrigation time and irrigation frequency were modeled with Hydrus-2D. Results for the period mentioned above showed water contents ranging from 0.35 and 0.30 m3/m3 at 10 and 20 cm depth (two hours after irrigation) down to minima of 0.14 and 0.13 m3/m3 (two hours before irrigation). At 40 and 60 cm depth, water content varied steadily over the dates: the greater the root activity, the greater the water content variation. According to the EnviroScan results and the Hydrus-2D modeling, shorter irrigation intervals and shorter irrigation times are suggested.
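As a simple illustration of the arithmetic behind such readings (not AIMCRA's or Hydrus-2D's code), volumetric water content at the four sensor depths can be converted into millimetres of water stored in the profile, the same unit as the quoted 50 mm/week consumption; the layer boundaries and readings below are assumptions made for the example.

```python
# Illustrative arithmetic only: convert volumetric water content readings
# (m3/m3) at the sensor depths into mm of water stored in the profile.
# Layer boundaries (midpoints between sensors, profile bottom at 0.70 m)
# and the reading values are assumptions, not data from the study.
readings = {0.10: 0.35, 0.20: 0.30, 0.40: 0.22, 0.60: 0.20}  # depth (m) -> theta (m3/m3)

def profile_storage_mm(readings, bottom=0.70):
    """Each sensor represents the layer down to the midpoint with its neighbour;
    the deepest sensor extends to `bottom`."""
    depths = sorted(readings)
    storage, top = 0.0, 0.0
    for i, d in enumerate(depths):
        lower = (d + depths[i + 1]) / 2 if i + 1 < len(depths) else bottom
        thickness_mm = (lower - top) * 1000.0   # layer thickness in mm
        storage += readings[d] * thickness_mm   # theta (m3/m3) times mm gives mm of water
        top = lower
    return storage

print(f"Water stored in the profile: {profile_storage_mm(readings):.1f} mm")
```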
Abstract:
Traditional schemes for abstract interpretation-based global analysis of logic programs generally focus on obtaining procedure argument mode and type information. Variable sharing information is often given only the attention needed to preserve the correctness of the analysis. However, such sharing information can be very useful. In particular, it can be used for predicting run-time goal independence, which can eliminate costly run-time checks in and-parallel execution. In this paper, a new algorithm for performing abstract interpretation of logic programs is described which concentrates on inferring the dependencies of the terms bound to program variables with increased precision and at all points in the execution of the program, rather than just at the procedure level. Algorithms are presented for computing abstract entry and success substitutions which extensively keep track of variable aliasing and term dependence information. In addition, a new, abstract-domain-independent fixpoint algorithm is presented and described in detail. The algorithms are illustrated with examples. Finally, results from an implementation of the abstract interpreter are presented.
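For illustration only, the following Python sketch implements abstract unification in a set-sharing domain in the style of Jacobs and Langen; it is not the paper's algorithm, but it shows the kind of aliasing information such abstract substitutions track.

```python
# Set-sharing abstract domain sketch: an abstract substitution is a set of
# sharing groups (frozensets of variables). Abstract unification of x = t
# combines the groups relevant to x with those relevant to t, closing each
# side under union first.
from itertools import combinations

def star(groups):
    """Closure under (non-empty) union of a set of sharing groups."""
    closed = set(groups)
    changed = True
    while changed:
        changed = False
        for a, b in combinations(list(closed), 2):
            u = a | b
            if u not in closed:
                closed.add(u)
                changed = True
    return closed

def amgu(sharing, x, t_vars):
    """Abstract unification for the binding x = t, where t_vars = vars(t)."""
    rel_x = {g for g in sharing if x in g}
    rel_t = {g for g in sharing if g & t_vars}
    irrelevant = sharing - (rel_x | rel_t)
    combined = {a | b for a in star(rel_x) for b in star(rel_t)}
    return irrelevant | combined

# Example: X, Y, Z initially independent; after X = f(Y, Z), X shares with both.
sharing = {frozenset({"X"}), frozenset({"Y"}), frozenset({"Z"})}
print(amgu(sharing, "X", frozenset({"Y", "Z"})))
# -> {frozenset({'X','Y'}), frozenset({'X','Z'}), frozenset({'X','Y','Z'})}
```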
Abstract:
This paper discusses some issues which arise in the dataflow analysis of constraint logic programming (CLP) languages. The basic technique applied is that of abstract interpretation. First, some types of optimizations possible in a number of CLP systems (including efficient parallelization) are presented, and the information that has to be obtained at compile time in order to implement such optimizations is considered. Two approaches are then proposed and discussed for obtaining this information for a CLP program: one based on an analysis of a CLP metainterpreter using standard Prolog analysis tools, and a second based on direct analysis of the CLP program. For the second approach, an abstract domain which approximates groundness (also referred to as "definiteness") information (i.e. being constrained to a single value) and the related abstraction functions are presented.
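The following toy Python sketch, which is not the paper's abstract domain, illustrates the notion of definiteness being propagated: in a linear constraint, a variable becomes definite once all the other variables in the constraint are.

```python
# Toy definiteness ("groundness") propagation over a store of linear constraints,
# run to a fixpoint: if all but one variable of a constraint are definite
# (constrained to a single value), the remaining one is definite too.
def definite_vars(constraints, initially_definite):
    """constraints: list of variable sets, one per linear equation."""
    definite = set(initially_definite)
    changed = True
    while changed:
        changed = False
        for vars_in_c in constraints:
            unknown = vars_in_c - definite
            if len(unknown) == 1:          # all but one variable fixed, so it is too
                definite |= unknown
                changed = True
    return definite

# Store: {X = Y + Z, Z = 2*W}, with Y and W known to be definite on entry.
store = [{"X", "Y", "Z"}, {"Z", "W"}]
print(definite_vars(store, {"Y", "W"}))    # -> {'W', 'Y', 'Z', 'X'}
```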
Abstract:
We study the first passage statistics to absorbing boundaries of a Brownian motion in bounded two-dimensional domains of different shapes and configurations of the absorbing and reflecting boundaries. From extensive numerical analysis we obtain the probability distribution P(ω) of the random variable ω = τ1/(τ1+τ2), which measures how similar the first passage times τ1 and τ2 of two independent realizations of a Brownian walk starting at the same location are. We construct a chart for each domain, determining whether P(ω) has a unimodal, bell-shaped form or a bimodal, M-shaped behavior. While in the former case the mean first passage time (MFPT) is a valid characteristic of the first passage behavior, in the latter case it is an insufficient measure of the process. Strikingly, we find a distinct turnover between the two modes of P(ω), characteristic of the domain shape and the respective location of the absorbing and reflecting boundaries. Our results demonstrate that large fluctuations of the first passage times may occur frequently in two-dimensional domains, rendering quite vague the general use of the MFPT as a robust measure of the actual behavior even in bounded domains, in which all moments of the first passage distribution exist.
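A minimal Monte Carlo sketch of the measure ω, assuming the simplest possible set-up of a unit disc whose entire boundary absorbs (the paper studies richer shapes and mixed absorbing/reflecting boundaries): a bell-shaped histogram around 0.5 means the MFPT is representative, while an M shape means single trajectories scatter too much for the mean to be meaningful.

```python
# Monte Carlo estimate of P(omega) with omega = tau1 / (tau1 + tau2) for pairs
# of independent 2D Brownian walks started at the same point inside a unit disc
# with a fully absorbing boundary (an assumed geometry for illustration).
import numpy as np

rng = np.random.default_rng(0)

def first_passage_time(start, dt=1e-4, sigma=1.0):
    """Time for a 2D Brownian path started at `start` to leave the unit disc."""
    pos = np.array(start, dtype=float)
    t = 0.0
    while pos @ pos < 1.0:
        pos += rng.normal(scale=sigma * np.sqrt(dt), size=2)
        t += dt
    return t

def omega_samples(start, n_pairs=200):
    taus = np.array([first_passage_time(start) for _ in range(2 * n_pairs)])
    t1, t2 = taus[:n_pairs], taus[n_pairs:]
    return t1 / (t1 + t2)

omega = omega_samples(start=(0.9, 0.0))
hist, _ = np.histogram(omega, bins=10, range=(0.0, 1.0), density=True)
print("P(omega) histogram:", np.round(hist, 2))
```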
Abstract:
The design of a nuclear power plant has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent, and to limit the consequences of, any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, and it incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular the accidents, that are considered plausible have been taken into account, and that the monitoring systems and engineered safety and safeguard systems will be capable of ensuring the safety goals. The probabilistic approach, on the other hand, tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective for a comprehensive assessment of the measures needed to prevent accidents with small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) demanded a more extended use of risk assessment techniques, with a significant need to further extend the scope and quality of PSA. Here is where the theory of stimulated dynamics (TSD) intervenes, as it is the mathematical foundation of the integrated safety assessment (ISA) methodology developed by the Modelling and Simulation (MOSI) branch of the CSN (Consejo de Seguridad Nuclear). This methodology attempts to extend classical PSA to include accident dynamic analysis, an assessment of the damage associated with the transients and a computation of the damage frequency. The application of the ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS supports accident dynamic analysis through simulation of nuclear accident sequences and operating procedures; furthermore, it includes probabilistic quantification of fault trees and sequences, and integration and statistical treatment of risk metrics. SCAIS relies on intensive use of code-coupling techniques to join thermal-hydraulic analysis, severe-accident and probability-calculation codes. The integration of accident simulation, and hence of complex nuclear plant models, into the risk assessment process is what makes the methodology so powerful, yet at the cost of an enormous increase in complexity. As this complexity is concentrated in the accident simulation codes, the question arises of whether it is possible to reduce the number of required simulations, which is the focus of the present work. This document presents the work done on investigating more efficient techniques applied to the risk assessment process within the ISA methodology. Such techniques therefore have the primary goal of decreasing the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task, and because of time limitations the scope of the work had to be reduced. Therefore, some assumptions were made in order to work in simplified scenarios best suited to an initial approximation to the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of the TSD theory, which are at the core of the risk assessment process. Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques, with full detail of their mathematical background and procedures. Later, the test case used is described and the results from applying the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
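As a generic illustration of the estimation problem described above (not SCAIS or ISA code), a damage probability can be estimated by Monte Carlo over uncertain parameters, with each sample standing in for an expensive transient simulation; stopping once the relative standard error is small enough is the simplest way of limiting the number of simulations. The transient model below is a hypothetical placeholder.

```python
# Generic Monte Carlo estimate of a damage probability with an early-stopping
# rule on the relative standard error. `simulate_transient` is a hypothetical
# stand-in for a coupled plant simulation, not a real code.
import math
import random

def simulate_transient(params):
    """Placeholder: returns True if the sampled transient ends in damage."""
    return params["failure_time"] < params["demand_time"]

def estimate_damage_probability(sample_params, rel_err_target=0.1, max_runs=10_000):
    damage, runs = 0, 0
    while runs < max_runs:
        runs += 1
        damage += simulate_transient(sample_params())
        p = damage / runs
        if damage >= 10:                 # need a few hits before the error estimate is meaningful
            std_err = math.sqrt(p * (1 - p) / runs)
            if std_err / p < rel_err_target:
                break
    return p, runs

sample = lambda: {"failure_time": random.expovariate(1 / 800.0),
                  "demand_time": random.expovariate(1 / 100.0)}
p, n = estimate_damage_probability(sample)
print(f"damage probability ~ {p:.3f} after {n} simulations")
```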
Abstract:
Partitioning is a common approach to developing mixed-criticality systems, where partitions are isolated from each other in both the temporal and the spatial domain in order to prevent low-criticality subsystems from compromising subsystems with a higher level of criticality in case of misbehaviour. The advent of many-core processors, on the other hand, opens the way to highly parallel systems in which all partitions can be allocated to dedicated processor cores. This trend will simplify processor scheduling, although other issues such as mutual interference in the temporal domain may arise as a consequence of memory and device sharing. The paper describes an architecture for multi-core partitioned systems including critical subsystems built with the Ada Ravenscar profile. Some implementation issues are discussed, and experience on implementing the ORK kernel on the XtratuM partitioning hypervisor is presented.
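For readers unfamiliar with temporal partitioning, the following generic Python sketch (not the XtratuM configuration format) shows the idea of a static major frame in which each partition owns pre-assigned execution windows; the partition names and durations are made up for the example.

```python
# Generic illustration of temporal partitioning: within a fixed major frame,
# each partition owns pre-assigned time windows, so a misbehaving
# low-criticality partition cannot steal CPU time from a critical one.
MAJOR_FRAME_MS = 20
# (partition, offset_ms, duration_ms); windows must not overlap.
schedule = [("flight_control", 0, 8), ("monitoring", 8, 4), ("logging", 12, 6)]

def partition_at(t_ms: float) -> str:
    """Which partition owns the CPU at absolute time t_ms?"""
    phase = t_ms % MAJOR_FRAME_MS
    for name, offset, duration in schedule:
        if offset <= phase < offset + duration:
            return name
    return "idle"   # slack at the end of the frame (18 to 20 ms here)

for t in (0, 9, 13, 19, 25):
    print(t, "->", partition_at(t))
```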
Abstract:
Knowledge resource reuse has become a popular approach within the ontology engineering field, mainly because it can speed up the ontology development process, saving time and money and promoting the application of good practices. The NeOn Methodology provides guidelines for reuse, including the selection of the most appropriate knowledge resources for reuse in ontology development. This is a complex decision-making problem in which different conflicting objectives, such as reuse cost, understandability, integration workload and reliability, have to be taken into account simultaneously. GMAA is a PC-based decision support system based on an additive multi-attribute utility model that is intended to allay the operational difficulties involved in the Decision Analysis methodology. The paper illustrates how it can be applied to select multimedia ontologies for reuse when developing a new ontology in the multimedia domain. It also demonstrates that the sensitivity analyses provided by GMAA are useful tools for making a final recommendation.
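The additive multi-attribute utility model can be made concrete with a small Python sketch; the attributes follow the objectives named above, while the weights and candidate scores are invented for the example and are exactly the kind of inputs a sensitivity analysis would vary.

```python
# Additive multi-attribute utility: U(a) = sum_i w_i * u_i(x_i(a)); the
# candidate ontology with the highest utility is recommended for reuse.
# Weights and single-attribute utilities below are made-up example values.
weights = {"reuse_cost": 0.30, "understandability": 0.25,
           "integration_workload": 0.25, "reliability": 0.20}

# Single-attribute utilities u_i already normalised to [0, 1] (1 = best).
candidates = {
    "OntologyA": {"reuse_cost": 0.6, "understandability": 0.8,
                  "integration_workload": 0.5, "reliability": 0.9},
    "OntologyB": {"reuse_cost": 0.9, "understandability": 0.5,
                  "integration_workload": 0.7, "reliability": 0.6},
}

def utility(scores, weights):
    return sum(weights[attr] * scores[attr] for attr in weights)

ranked = sorted(candidates, key=lambda c: utility(candidates[c], weights), reverse=True)
for c in ranked:
    print(f"{c}: U = {utility(candidates[c], weights):.3f}")
```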