845 results for Event-based timing


Relevance:

80.00%

Abstract:

The cognitive approach to obsessive-compulsive disorder (OCD) proposes a bidirectional link between emotions and cognitions. However, although studies show an association between emotions and OCD, no study has examined the relationship between emotions, cognitions and behaviors over the course of cognitive therapy. The present study aims to examine the relationship between cognitive, behavioral and emotional processes during inference-based therapy (IBT) in people suffering from OCD. More specifically, we observed how emotions and OCD symptoms influence one another over time. Patients kept a logbook throughout the therapeutic process, rating (from 0 to 100) key emotions as well as the beliefs and behaviors targeted during therapy. Repeated-measures analyses were used to maximize the potential of the longitudinal data. The results show that anxiety, sadness and joy follow trajectories similar to those of beliefs and behaviors over the course of therapy. The strengths and limitations of the study are discussed, as are the implications of the results for addressing emotions and thoughts at different points in therapy.

Relevance:

80.00%

Abstract:

A multi-scale framework for decision support is presented that uses a combination of experiments, models, communication, education and decision support tools to arrive at a realistic strategy to minimise diffuse pollution. Effective partnerships between researchers and stakeholders play a key part in the successful implementation of this strategy. The Decision Support Matrix (DSM) is introduced as a set of visualisations that can be used at all scales, both to inform decision making and as a communication tool in stakeholder workshops. A demonstration farm is presented and one of its fields is taken as a case study. Hydrological and nutrient flow path models are used for event-based simulation (TOPCAT), catchment-scale modelling (INCA) and field-scale flow visualisation (TopManage). One of the DSMs, the Phosphorus Export Risk Matrix (PERM), is discussed in detail. The PERM was developed iteratively as a point of discussion in stakeholder workshops, serving as a decision support and education tool. The resulting interactive PERM contains a set of questions and proposed remediation measures that reflect both expert and local knowledge. Education and visualisation tools such as GIS, risk indicators, TopManage and the PERM are found to be invaluable in communicating improved farming practice to stakeholders. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Abstract:

Research on arable sandy loam and silty clay loam soils on 4° slopes in England has shown that tramlines (i.e. the unseeded wheeling areas used to facilitate spraying operations in cereal crops) can represent the most important pathway for phosphorus and sediment loss from moderately sloping fields. Detailed monitoring over the October–March period in winters 2005–2006 and 2006–2007 included event-based sampling of surface runoff, suspended and particulate sediment, and dissolved and particulate phosphorus from hillslope segments (each ∼300–800 m2) established in a randomized block design with four replicates of each treatment at each of two sites on lighter and heavier soils. Experimental treatments assessed losses from the cropped area without tramlines, and from the uncropped tramline area, and were compared to losses from tramlines which had been disrupted once in the autumn with a shallow tine. On the lighter soil, the effects of removal or shallow incorporation of straw residues were also determined. Research on both sandy and silty clay loam soils across two winters showed that tramline wheelings represented the dominant pathway for surface runoff and transport of sediment, phosphorus and nitrogen from cereal crops on moderate slopes. Results indicated 5.5–15.8% of rainfall lost as runoff, and losses of 0.8–2.9 kg TP ha−1 and 0.3–4.8 t ha−1 sediment in tramline treatments, compared to only 0.2–1.7% of rainfall lost as runoff, and losses of 0.0–0.2 kg TP ha−1 and 0.003–0.3 t ha−1 sediment from treatments without tramlines or those where tramlines had been disrupted. The novel shallow disruption of tramline wheelings using a tine once following the autumn spray operation consistently and dramatically reduced (p < 0.001) surface runoff and loads of sediment, total nitrogen and total phosphorus to levels similar to those measured in cropped areas between tramlines.
Results suggest that options for managing tramline wheelings warrant further refinement and evaluation with a view to incorporating them into spatially-targeted farm-level management planning using national or catchment-based agri-environment policy instruments aimed at reducing diffuse pollution from land to surface water systems. Copyright © 2010 John Wiley & Sons, Ltd.

Relevance:

80.00%

Abstract:

The current study investigated the influence of encoding modality and cue-action relatedness on prospective memory (PM) performance in young and older adults using a modified version of the Virtual Week task. Participants encoded regular and irregular intentions either verbally or by physically performing the action during encoding. For half of the intentions there was a close semantic relation between the retrieval cue and the intended action, while for the remaining intentions the cue and action were semantically unrelated. For irregular tasks, both age groups showed superior PM for related intentions compared to unrelated intentions in both encoding conditions. While older adults retrieved fewer irregular intentions than young adults after verbal encoding, there was no age difference following enactment. Possible mechanisms of enactment and relatedness effects are discussed in the context of current theories of event-based PM.

Relevance:

80.00%

Abstract:

It is thought that speciation in phytophagous insects is often due to colonization of novel host plants, because radiations of plant and insect lineages are typically asynchronous. Recent phylogenetic comparisons have supported this model of diversification for both insect herbivores and specialized pollinators. An exceptional case where contemporaneous plant-insect diversification might be expected is the obligate mutualism between fig trees (Ficus species, Moraceae) and their pollinating wasps (Agaonidae, Hymenoptera). The ubiquity and ecological significance of this mutualism in tropical and subtropical ecosystems has long intrigued biologists, but the systematic challenge posed by >750 interacting species pairs has hindered progress toward understanding its evolutionary history. In particular, taxon sampling and analytical tools have been insufficient for large-scale co-phylogenetic analyses. Here, we sampled nearly 200 interacting pairs of fig and wasp species from across the globe. Two supermatrices were assembled: on average, wasps had sequences from 77% of six genes (5.6 kb), figs had sequences from 60% of five genes (5.5 kb), and overall 850 new DNA sequences were generated for this study. We also developed a new analytical tool, Jane 2, for event-based phylogenetic reconciliation analysis of very large data sets. Separate Bayesian phylogenetic analyses for figs and fig wasps under relaxed molecular clock assumptions indicate Cretaceous diversification of crown groups and contemporaneous divergence for nearly half of all fig and pollinator lineages. Event-based co-phylogenetic analyses further support the co-diversification hypothesis. Biogeographic analyses indicate that the present-day distribution of fig and pollinator lineages is consistent with a Eurasian origin and subsequent dispersal, rather than with Gondwanan vicariance.
Overall, our findings indicate that the fig-pollinator mutualism represents an extreme case among plant-insect interactions of coordinated dispersal and long-term co-diversification.

Relevance:

80.00%

Abstract:

Geomagnetic activity has long been known to exhibit approximately 27 day periodicity, resulting from solar wind structures repeating each solar rotation. Thus a very simple near-Earth solar wind forecast is 27 day persistence, wherein the near-Earth solar wind conditions today are assumed to be identical to those 27 days previously. Effective use of such a persistence model as a forecast tool, however, requires the performance and uncertainty to be fully characterized. The first half of this study determines which solar wind parameters can be reliably forecast by persistence and how the forecast skill varies with the solar cycle. The second half of the study shows how persistence can provide a useful benchmark for more sophisticated forecast schemes, namely physics-based numerical models. Point-by-point assessment methods, such as correlation and mean-square error, find persistence skill comparable to numerical models during solar minimum, despite the 27 day lead time of persistence forecasts, versus 2–5 days for numerical schemes. At solar maximum, however, the dynamic nature of the corona means 27 day persistence is no longer a good approximation, and skill scores suggest persistence is out-performed by numerical models for almost all solar wind parameters. But point-by-point assessment techniques are not always a reliable indicator of usefulness as a forecast tool. An event-based assessment method, which focuses on key solar wind structures, finds persistence to be the most valuable forecast throughout the solar cycle. This reiterates the fact that the means of assessing the “best” forecast model must be specifically tailored to its intended use.
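The 27 day persistence scheme described above amounts to a one-line forecast model. A minimal sketch, using synthetic daily solar wind speeds (the abstract cites no specific dataset, so the recurring series below is invented for illustration), together with the point-by-point skill measures the study mentions:

```python
import numpy as np

def persistence_forecast(series: np.ndarray, lag_days: int = 27) -> np.ndarray:
    """27 day persistence: today's forecast is the value observed lag_days ago."""
    return series[:-lag_days]

# Synthetic daily solar wind speed (km/s): a pattern recurring every 27 days
# (one solar rotation) plus observational noise.
rng = np.random.default_rng(0)
days = np.arange(27 * 10)
observed = 450 + 80 * np.sin(2 * np.pi * days / 27) + rng.normal(0, 10, days.size)

forecast = persistence_forecast(observed)   # valid from day 27 onward
target = observed[27:]

# Point-by-point assessment methods named in the abstract.
corr = np.corrcoef(forecast, target)[0, 1]
mse = np.mean((forecast - target) ** 2)
```

For a strongly recurrent series like this one the correlation is high; at solar maximum, when the corona evolves within a single rotation, the same metrics would degrade, which is the regime where the abstract finds numerical models pulling ahead.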

Relevance:

80.00%

Abstract:

Architectural description languages (ADLs) are used to specify a high-level, compositional view of a software application, specifying how a system is to be composed from coarse-grain components. ADLs usually come equipped with a formal dynamic semantics, facilitating specification and analysis of distributed and event-based systems. In this paper, we describe TrustME, an ADL framework that provides both a process and a structural view of web service-based systems. We use Petri-net descriptions to give a dynamic view of business workflow for web service collaboration. We adapt the approach of Schmidt to define a form of Meyer's design-by-contract for configuring workflow architectures. This serves as a configuration-level means of constructing safer, more robust systems.
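The Petri-net dynamic view mentioned above rests on the standard token-game semantics: a transition is enabled when every input place holds a token, and firing it consumes one token per input place and produces one per output place. A minimal sketch (the place names are invented for illustration, not taken from TrustME):

```python
# Minimal Petri net token game: a transition fires only when every input
# place holds at least one token.
def enabled(marking, inputs):
    return all(marking.get(p, 0) > 0 for p in inputs)

def fire(marking, inputs, outputs):
    assert enabled(marking, inputs), "transition not enabled"
    m = dict(marking)
    for p in inputs:
        m[p] -= 1                  # consume one token per input place
    for p in outputs:
        m[p] = m.get(p, 0) + 1     # produce one token per output place
    return m

# Hypothetical two-transition web service workflow:
# request_received -> service_invoked -> response_sent
marking = {"request_received": 1}
marking = fire(marking, ["request_received"], ["service_invoked"])
marking = fire(marking, ["service_invoked"], ["response_sent"])
```

A workflow property such as "every request eventually yields a response" then becomes a reachability question over markings, which is the kind of analysis a formal dynamic semantics enables.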

Relevance:

80.00%

Abstract:

Architectural description languages (ADLs) are used to specify a high-level, compositional view of a software application. ADLs usually come equipped with a rigorous state-transition style semantics, facilitating specification and analysis of distributed and event-based systems. However, enterprise system architectures built upon newer middleware (implementations of Java’s EJB specification, or Microsoft’s COM+/.NET) require additional expressive power from an ADL. The TrustME ADL is designed to meet this need. In this paper, we describe several aspects of TrustME that facilitate specification and analysis of middleware-based architectures for the enterprise.

Relevance:

80.00%

Abstract:

Architecture description languages (ADLs) are used to specify high-level, compositional views of a software application. ADL research focuses on software composed of prefabricated parts, so-called software components. ADLs usually come equipped with rigorous state-transition style semantics, facilitating verification and analysis of specifications. Consequently, ADLs are well suited to configuring distributed and event-based systems. However, additional expressive power is required for the description of enterprise software architectures – in particular, those built upon newer middleware, such as implementations of Java’s EJB specification, or Microsoft’s COM+/.NET. The enterprise requires distributed software solutions that are scalable, business-oriented and mission-critical. We can make progress toward attaining these qualities at various stages of the software development process. In particular, progress at the architectural level can be leveraged through use of an ADL that incorporates trust and dependability analysis. Also, current industry approaches to enterprise development do not address several important architectural design issues. The TrustME ADL is designed to meet these requirements, through combining approaches to software architecture specification with rigorous design-by-contract ideas. In this paper, we focus on several aspects of TrustME that facilitate specification and analysis of middleware-based architectures for trusted enterprise computing systems.

Relevance:

80.00%

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems, considering specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case a network connection between them is unavailable.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resource addition and removal at runtime.
- The implemented CAD Framework is entirely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained, event-based collaboration, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experiences among team members, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions (design representation primitives and tool blocks) are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model. This possibility is explored in the context of an online educational and training platform.
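The inversion of control between view and semantics described in the abstract (the view forwards an event, the semantic model validates it, and only then are all views updated) can be sketched as follows; the class and method names are illustrative, not taken from Cave2:

```python
# Sketch of view/semantics inversion of control: the view never mutates state
# itself; it forwards an event to the semantic model, which validates it and,
# if accepted, notifies every registered view (multi-view consistency).
class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def handle(self, event):
        key, value = event
        if value is None:           # validation: reject a meaningless update
            return False
        self.state[key] = value     # state change accepted...
        for v in self.views:        # ...and propagated to all views
            v.refresh(key, value)
        return True

class View:
    def __init__(self, model):
        self.shown = {}
        self.model = model
        model.attach(self)

    def user_input(self, key, value):
        return self.model.handle((key, value))   # propagate, don't mutate

    def refresh(self, key, value):
        self.shown[key] = value

model = SemanticModel()
a, b = View(model), View(model)
a.user_input("width", 42)      # accepted: both views now show width=42
a.user_input("height", None)   # rejected by the semantic model; no view changes
```

The event pools mentioned in the abstract would sit between `user_input` and `handle`, queuing event objects so that view and semantics can resynchronize later when no network connection is available.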

Relevance:

80.00%

Abstract:

This article analyzes the main factors that influence the local population's perception of international mega-events, seeking to identify possible actions the government can take to increase support for, and minimize resistance to, hosting this type of event. Based on a review of the theoretical and empirical literature on the subject, it evaluates the Brazilian experience in preparing for the 2014 World Cup and the 2016 Olympic Games, and seeks to draw some lessons for future mega-events to be held in the country, including the 2020 Universal Exposition.

Relevance:

80.00%

Abstract:

This study aimed to understand how organizations located in areas affected by natural disasters act to mitigate risk, and what role these organizations play during such events. Based on a documentary analysis, the study identified the main events that occurred in Brazil between 2003 and 2013, the most affected supply chains, the players involved during the disasters, and the greatest impacts on economic activities. The results provide no evidence that natural disasters are considered in companies' risk management, even though companies are continually affected by them. The public authorities, however, have grown increasingly concerned with these phenomena.

Relevance:

80.00%

Abstract:

A challenge that remains in the robotics field is how to make a robot react in real time to visual stimuli. Traditional computer vision algorithms used to address this problem are still computationally expensive, taking too long on common computer processors; even simple operations such as image filtering or mathematical morphology may take too long. Researchers have implemented image processing algorithms in highly parallel hardware devices in order to cut down processing time, with good results. By combining hardware-implemented image processing techniques with a platform-oriented system built around the Nios II processor, we propose an approach that uses hardware processing and event-based programming to simplify vision-based systems while accelerating parts of the algorithms involved.


Relevance:

80.00%

Abstract:

Proxy data are essential for the investigation of climate variability on time scales longer than the historical meteorological observation period. The potential value of a proxy depends on our ability to understand and quantify the physical processes that relate the corresponding climate parameter and the signal in the proxy archive. These processes can be explored under present-day conditions. In this thesis, both statistical and physical models are applied for their analysis, focusing on two specific types of proxies: lake sediment data and stable water isotopes.

In the first part of this work, the basis is established for statistically calibrating new proxies from lake sediments in western Germany. A comprehensive meteorological and hydrological data set is compiled and statistically analyzed. In this way, meteorological time series are identified that can be applied for the calibration of various climate proxies. A particular focus is laid on the investigation of extreme weather events, which have rarely been the objective of paleoclimate reconstructions so far. Subsequently, a concrete example of a proxy calibration is presented. Maxima in the quartz grain concentration from a lake sediment core are compared to recent windstorms. The latter are identified from the meteorological data with the help of a newly developed windstorm index, combining local measurements and reanalysis data. The statistical significance of the correlation between extreme windstorms and signals in the sediment is verified with a Monte Carlo method. This correlation is fundamental for employing lake sediment data as a new proxy to reconstruct windstorm records of the geological past.

The second part of this thesis deals with the analysis and simulation of stable water isotopes in atmospheric vapor on daily time scales.
In this way, a better understanding of the physical processes determining these isotope ratios can be obtained, which is an important prerequisite for the interpretation of isotope data from ice cores and the reconstruction of past temperature. In particular, the focus here is on the deuterium excess and its relation to the environmental conditions during evaporation of water from the ocean. As a basis for the diagnostic analysis and for evaluating the simulations, isotope measurements from Rehovot (Israel) are used, provided by the Weizmann Institute of Science. First, a Lagrangian moisture source diagnostic is employed in order to establish quantitative linkages between the measurements and the evaporation conditions of the vapor (and thus to calibrate the isotope signal). A strong negative correlation between relative humidity in the source regions and measured deuterium excess is found. On the contrary, sea surface temperature in the evaporation regions does not correlate well with deuterium excess. Although requiring confirmation by isotope data from different regions and longer time scales, this weak correlation might be of major importance for the reconstruction of moisture source temperatures from ice core data. Second, the Lagrangian source diagnostic is combined with a Craig-Gordon fractionation parameterization for the identified evaporation events in order to simulate the isotope ratios at Rehovot. In this way, the Craig-Gordon model can be directly evaluated with atmospheric isotope data, and better constraints for uncertain model parameters can be obtained. A comparison of the simulated deuterium excess with the measurements reveals that a much better agreement can be achieved using a wind speed independent formulation of the non-equilibrium fractionation factor instead of the classical parameterization introduced by Merlivat and Jouzel, which is widely applied in isotope GCMs. 
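The deuterium excess analyzed above has a standard definition, d = δD − 8·δ¹⁸O (in per mil). A minimal sketch of computing it from isotope ratios (the sample values are invented for illustration, not measurements from Rehovot):

```python
def deuterium_excess(delta_d: float, delta_o18: float) -> float:
    """d-excess (per mil) from deltaD and delta18O, both given in per mil."""
    return delta_d - 8.0 * delta_o18

# Invented example values in per mil, not Rehovot data.
d = deuterium_excess(delta_d=-80.0, delta_o18=-11.5)   # -80 - 8*(-11.5) = 12.0
```

Because d-excess deviates from the global meteoric water line slope of 8 mainly through non-equilibrium (kinetic) fractionation during evaporation, it carries the moisture-source information (relative humidity, and potentially sea surface temperature) that the thesis seeks to calibrate.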
Finally, the first steps of the implementation of water isotope physics in the limited-area COSMO model are described, and an approach is outlined that allows simulated isotope ratios to be compared to measurements in an event-based manner using a water tagging technique. The good agreement between model results from several case studies and measurements at Rehovot demonstrates the applicability of the approach. Because the model can be run with high, potentially cloud-resolving spatial resolution, and because it contains sophisticated parameterizations of many atmospheric processes, a complete implementation of isotope physics will allow detailed, process-oriented studies of the complex variability of stable isotopes in atmospheric waters in future research.