986 results for Temporal constraints analysis


Relevance:

100.00%

Publisher:

Abstract:

Hardware/Software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in various products and services creates a need for methods to develop them efficiently. Efficient design of such systems is, however, limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modelling (TLM) is regarded as a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on temporal constraint analysis. We propose to combine two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy allows us to combine, in a single environment, high-performance simulation methods and formal analytical methods. We propose a new timing verification algorithm based on a procedure for linearizing min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We complete the mathematical description of all the constraint types presented in the literature. We develop methods for exploring and refining the communication system, which allow the timing verification algorithms to be used at different TLM levels. Since several definitions of TLM exist, within the scope of this research we define a specification and simulation methodology for Hardware/Software systems based on the TLM paradigm, in which several modelling concepts can be considered separately. Built on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming and others provided by the .Net environment, the proposed methodology presents an approach that makes intermediate models reusable, addressing the time-to-market constraint. It provides a general approach to system modelling that separates design concerns such as the models of computation used to describe the system at multiple levels of abstraction. As a result, the system model clearly identifies the system's functionality without the details related to development platforms, which improves the "portability" of the application model.
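
To make the temporal-constraint idea concrete, the sketch below checks the consistency of a set of purely linear (difference) constraints between transaction events using Bellman-Ford propagation; the event names and bounds are invented for illustration, and the thesis's actual algorithm additionally handles min/max constraints through linearization.

```python
# Sketch: consistency check for linear (difference) temporal constraints.
# A constraint "t[v] - t[u] <= w" becomes a weighted edge u -> v with weight w.
# The system is consistent iff the constraint graph has no negative cycle
# (checked here with Bellman-Ford). Min/max constraints, as in the thesis,
# would first have to be linearized into sets of such difference constraints.

def consistent(events, constraints):
    """events: list of event names; constraints: list of (u, v, w) meaning t[v] - t[u] <= w."""
    dist = {e: 0.0 for e in events}          # virtual source at distance 0 to every event
    for _ in range(len(events) - 1):         # relax all edges |V|-1 times
        for u, v, w in constraints:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # one more pass: any further improvement implies a negative cycle -> inconsistent
    return all(dist[u] + w >= dist[v] for u, v, w in constraints)

# Example: request precedes grant by 2..5 time units, grant precedes done by at most 3.
events = ["req", "grant", "done"]
constraints = [("req", "grant", 5), ("grant", "req", -2), ("grant", "done", 3)]
print(consistent(events, constraints))       # True: the bounds are mutually satisfiable
```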

Relevance:

100.00%

Publisher:

Abstract:

The capacity to produce data for performance analysis in sports has been enhanced in the last decade by substantial technological advances. However, current performance analysis methods have been criticised for lacking a viable theoretical framework to assist in the development of fundamental principles that regulate performance achievement. Our aim in this paper is to discuss ecological dynamics as an explanatory framework for improving the analysis and understanding of competitive performance behaviours. We argue that integration of ideas from ecological dynamics into previous approaches to performance analysis advances current understanding of how sport performance emerges from continuous interactions between individual players and teams. Exemplar data from previous studies in association football are presented to illustrate this novel perspective on performance analysis. Limitations of current ecological dynamics research and challenges for future research are discussed in order to improve the meaningfulness of information presented to coaches and managers.

Relevance:

100.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution can be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
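
As an illustration of the workflow-with-provenance idea, the following minimal sketch wraps each analysis step so that a digest of its inputs and outputs is recorded; the function and field names are assumptions for illustration, not the dissertation's actual platform API.

```python
# Sketch: recording provenance for each analysis step so that derived results
# can be independently re-verified. Names and record fields are illustrative.
import hashlib, json, time

def run_step(name, func, inputs, provenance):
    """Run one analysis step, append a provenance record, and return its output."""
    output = func(inputs)
    provenance.append({
        "step": name,
        "timestamp": time.time(),
        "input_digest": hashlib.sha256(json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "output_digest": hashlib.sha256(json.dumps(output, sort_keys=True).encode()).hexdigest(),
    })
    return output

provenance = []
raw = [3.1, 2.9, 3.4, 8.7]                                    # stand-in for a raw data stream
cleaned = run_step("clean", lambda xs: [x for x in xs if x < 5], raw, provenance)
summary = run_step("summarise", lambda xs: {"mean": sum(xs) / len(xs)}, cleaned, provenance)
print(summary, len(provenance))                               # {'mean': 3.13...} 2
```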

Relevance:

100.00%

Publisher:

Abstract:

In this report an innovative satellite-based monitoring approach was designed and applied to the Iraqi Marshlands to survey the extent and distribution of marshland re-flooding and assess the development of wetland vegetation cover. The study, conducted in collaboration with MEEO Srl, makes use of images collected by the (A)ATSR sensor on board the ESA ENVISAT satellite to gather data at multiple temporal scales, and an analysis was adopted to observe the evolution of marshland re-flooding. The methodology uses a multi-temporal pixel-based approach based on classification maps produced by the classification tool SOIL MAPPER®. The catalogue of classification maps is available as a web service through the Service Support Environment Portal (SSE, supported by ESA). The inundation of the Iraqi marshlands, which has been continuous since April 2003, is characterized by a high degree of variability, ad-hoc interventions and uncertainty. Given the security constraints and vastness of the Iraqi marshlands, as well as cost-effectiveness considerations, satellite remote sensing was the only viable tool to observe the changes taking place on a continuous basis. The proposed system (ALCS, AATSR Land Classification System) avoids the direct use of the (A)ATSR images and foresees the application of LULCC evolution models directly to the 'stock' of classified maps. This approach is made possible by the availability of a 13-year classified image database, conceived and implemented in the CARD project (http://earth.esa.int/rtd/Projects/#CARD). The approach presented here evolves toward an innovative, efficient and fast method to exploit the potential of multi-temporal LULCC analysis of (A)ATSR images. The two main objectives of this work are both forms of assessment: the first is to assess the ability to model with the web application ALCS using AATSR images classified with SOIL MAPPER®, and the second is to evaluate the magnitude, character and extent of wetland rehabilitation.
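
A minimal sketch of the multi-temporal pixel-based idea is given below: it scans a stack of classification maps and reports the fraction of pixels that changed to the water class between the first and last dates. The class codes and the definition of "re-flooded" are illustrative assumptions, not the actual SOIL MAPPER legend used by ALCS.

```python
# Sketch: pixel-based multi-temporal analysis over a stack of classification maps.
import numpy as np

WATER, VEGETATION, BARE = 1, 2, 3                    # hypothetical class codes

def reflooded_fraction(maps):
    """maps: array of shape (time, rows, cols) of per-pixel class labels.
    Returns the fraction of pixels that were not water at t0 but are water at the end."""
    first, last = maps[0], maps[-1]
    newly_flooded = (first != WATER) & (last == WATER)
    return newly_flooded.mean()

# Toy 3-date, 2x3-pixel example: two pixels turn from bare soil to water.
stack = np.array([
    [[BARE, BARE, VEGETATION], [BARE, WATER, WATER]],
    [[BARE, WATER, VEGETATION], [BARE, WATER, WATER]],
    [[WATER, WATER, VEGETATION], [BARE, WATER, WATER]],
])
print(reflooded_fraction(stack))                     # 2/6 ≈ 0.333
```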

Relevance:

100.00%

Publisher:

Abstract:

The importance of solving the problem of spatial-temporal dynamics analysis in the economic security system of different economic management entities is substantiated. Various methods and approaches for analysing spatial-temporal dynamics in the economic security system are considered. A basis for a generalized analysis of spatial-temporal dynamics in economic systems is proposed.

Relevance:

100.00%

Publisher:

Abstract:

Methane hydrate is an ice-like substance that is stable at high pressure and low temperature in continental margin sediments. Since the discovery of a large number of gas flares at the landward termination of the gas hydrate stability zone off Svalbard, there has been concern that warming bottom waters have started to dissociate large amounts of gas hydrate and that the resulting methane release may possibly accelerate global warming. Here, we corroborate that hydrates play a role in the observed seepage of gas, but we present evidence that seepage off Svalbard has been ongoing for at least three thousand years and that seasonal fluctuations of 1-2°C in the bottom-water temperature cause periodic gas hydrate formation and dissociation, which focus seepage at the observed sites.

Relevance:

90.00%

Publisher:

Abstract:

Decision-making for conservation is conducted within the margins of limited funding. Furthermore, to allocate these scarce resources we make assumptions about the relationship between management impact and expenditure. The structure of these relationships, however, is rarely known with certainty. We present a summary of work investigating the impact of model uncertainty on robust decision-making in conservation and how this is affected by available conservation funding. We show that achieving robustness in conservation decisions can require a triage approach, and emphasize the need for managers to consider triage not as surrendering but as rational decision-making to ensure species persistence in light of the urgency of the conservation problems, uncertainty, and the poor state of conservation funding. We illustrate this theory with a specific application to the allocation of funding to reduce poaching impact on the Sumatran tiger Panthera tigris sumatrae in Kerinci Seblat National Park, Indonesia. To conserve our environment, conservation managers must make decisions in the face of substantial uncertainty. Further, they must deal with the fact that limitations in budgets and temporal constraints have led to a lack of knowledge on the systems we are trying to preserve and on the benefits of the actions we have available (Balmford & Cowling 2006). Given this paucity of decision-informing data there is a considerable need to assess the impact of uncertainty on the benefit of management options (Regan et al. 2005). Although models of management impact can improve decision-making (e.g. Tenhumberg et al. 2004), they typically rely on assumptions around which there is substantial uncertainty. Ignoring this 'model uncertainty' can lead to inferior decision-making (Regan et al. 2005) and, potentially, the loss of the species we are trying to protect. Current methods used in ecology allow model uncertainty to be incorporated into the model selection process (Burnham & Anderson 2002; Link & Barker 2006), but do not enable decision-makers to assess how this uncertainty would change a decision. This is the basis of information-gap decision theory (info-gap): finding strategies most robust to model uncertainty (Ben-Haim 2006). Info-gap has permitted conservation biology to make the leap from recognizing uncertainty to explicitly incorporating severe uncertainty into decision-making. In this paper we present a summary of McDonald-Madden et al. (2008a), who use an info-gap framework to address the impact of uncertainty in the functional representations of biological systems on conservation decision-making. Furthermore, we highlight the importance of two key elements limiting conservation decision-making, funding and knowledge, and how they interact to influence the best management strategy for a threatened species. Copyright © ASCE 2011.
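
The info-gap robustness calculation can be illustrated with a small numerical sketch: for a given funding level, it finds the largest horizon of model uncertainty under which a required conservation benefit is still guaranteed. The benefit function, nominal parameter and threshold below are invented for illustration and are not taken from the cited study.

```python
# Sketch of an info-gap robustness calculation (in the spirit of Ben-Haim 2006).
# benefit(x, u): conservation benefit of spending x when the uncertain
# effectiveness parameter is u; all functional forms and numbers are illustrative.
import math

def benefit(x, u):
    return 1.0 - math.exp(-u * x)          # saturating return on expenditure

def robustness(x, u_nominal, required, step=0.001):
    """Largest horizon of uncertainty h such that benefit(x, u) >= required
    for every u in [u_nominal - h, u_nominal + h] (worst case is the lower end)."""
    h = 0.0
    while u_nominal - (h + step) > 0 and benefit(x, u_nominal - (h + step)) >= required:
        h += step
    return h

# Compare two funding levels: the larger budget tolerates more model error.
for x in (5.0, 10.0):
    print(x, round(robustness(x, u_nominal=0.3, required=0.6), 3))
```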

Relevance:

90.00%

Publisher:

Abstract:

Predatory insects and spiders are key elements of integrated pest management (IPM) programmes in agricultural crops such as cotton. Management decisions in IPM programmes should be based on a reliable and efficient method for counting both predators and pests. Knowledge of the temporal constraints that influence sampling is required because arthropod abundance estimates are likely to vary over a growing season and within a day. Few studies have adequately quantified this effect using the beat sheet, a potentially important sampling method. We compared the commonly used methods of suction and visual sampling to the beat sheet, with reference to an absolute cage clamp method for determining the abundance of various arthropod taxa over 5 weeks. There were significantly more entomophagous arthropods recorded using the beat sheet and cage clamp methods than by using suction or visual sampling, and these differences were more pronounced as the plants grew. In a second trial, relative estimates of entomophagous and phytophagous arthropod abundance were made using beat sheet samples collected over a day. Beat sheet estimates of the abundance of only eight of the 43 taxa examined were found to vary significantly over a day. Beat sheet sampling is recommended in further studies of arthropod abundance in cotton, but researchers and pest management advisors should bear in mind time-of-season and time-of-day effects.

Relevance:

90.00%

Publisher:

Abstract:

Understanding the functioning of a neural system in terms of its underlying circuitry is an important problem in neuroscience. Recent developments in electrophysiology and imaging allow one to simultaneously record activities of hundreds of neurons. Inferring the underlying neuronal connectivity patterns from such multi-neuronal spike train data streams is a challenging statistical and computational problem. This task involves finding significant temporal patterns from vast amounts of symbolic time series data. In this paper we show that the frequent episode mining methods from the field of temporal data mining can be very useful in this context. In the frequent episode discovery framework, the data is viewed as a sequence of events, each of which is characterized by an event type and its time of occurrence, and episodes are certain types of temporal patterns in such data. Here we show that, using the set of discovered frequent episodes from multi-neuronal data, one can infer different types of connectivity patterns in the neural system that generated it. For this purpose, we introduce the notion of mining for frequent episodes under certain temporal constraints; the structure of these temporal constraints is motivated by the application. We present algorithms for discovering serial and parallel episodes under these temporal constraints. Through extensive simulation studies we demonstrate that these methods are useful for unearthing patterns of neuronal network connectivity.
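
A simplified sketch of the core counting step is shown below: it counts non-overlapped occurrences of a two-node serial episode A → B subject to an inter-event time constraint. The gap threshold and spike data are illustrative; the paper's algorithms handle general serial and parallel episodes under richer constraint sets.

```python
# Sketch: counting non-overlapped occurrences of a serial episode A -> B in an
# event stream, subject to an inter-event time constraint.
def count_serial_episode(events, first, second, max_gap):
    """events: list of (event_type, time) sorted by time.
    Counts non-overlapped occurrences of `first` followed by `second`
    with 0 < gap <= max_gap (a stand-in for the paper's constraint sets)."""
    count, pending = 0, None                     # pending holds the time of an unmatched `first`
    for etype, t in events:
        if etype == first and pending is None:
            pending = t
        elif etype == second and pending is not None:
            if 0 < t - pending <= max_gap:
                count += 1
                pending = None                   # non-overlapped: consume the pending event
            elif t - pending > max_gap:
                pending = None                   # too late; discard the stale start
    return count

# Toy spike data: neuron 'A' firing shortly before neuron 'B' suggests an A -> B link.
spikes = [("A", 1.0), ("B", 1.4), ("A", 2.0), ("C", 2.1), ("B", 2.3), ("A", 5.0), ("B", 9.0)]
print(count_serial_episode(spikes, "A", "B", max_gap=0.5))   # 2
```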

Relevance:

90.00%

Publisher:

Abstract:

Personal communication devices are increasingly equipped with sensors that are able to collect and locally store information from their environs. The mobility of users carrying such devices, and hence the mobility of sensor readings in space and time, opens new horizons for interesting applications. In particular, we envision a system in which the collective sensing, storage and communication resources, and mobility of these devices could be leveraged to query the state of (possibly remote) neighborhoods. Such queries would have spatio-temporal constraints which must be met for the query answers to be useful. Using a simplified mobility model, we analytically quantify the benefits from cooperation (in terms of the system's ability to satisfy spatio-temporal constraints), which we show to go beyond simple space-time tradeoffs. In managing the limited storage resources of such cooperative systems, the goal should be to minimize the number of unsatisfiable spatio-temporal constraints. We show that Data Centric Storage (DCS), or "directed placement", is a viable approach for achieving this goal, but only when the underlying network is well connected. Alternatively, we propose "amorphous placement", in which sensory samples are cached locally, and shuffling of cached samples is used to diffuse the sensory data throughout the whole network. We evaluate conditions under which directed versus amorphous placement strategies would be more efficient. These results lead us to propose a hybrid placement strategy, in which the spatio-temporal constraints associated with a sensory data type determine the most appropriate placement strategy for that data type. We perform an extensive simulation study to evaluate the performance of directed, amorphous, and hybrid placement protocols when applied to queries that are subject to timing constraints. Our results show that directed placement is better for queries with moderately tight deadlines, whereas amorphous placement is better for queries with looser deadlines, and that under most operational conditions, the hybrid technique gives the best compromise.
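
The hybrid policy can be summarised by a very small decision rule, sketched below: tight spatio-temporal deadlines favour directed (DCS-style) placement, loose deadlines favour amorphous placement. The threshold and data-type names are illustrative assumptions, not parameters from the paper.

```python
# Sketch: choosing a placement strategy per data type from the tightness of its deadline.
DEADLINE_THRESHOLD_S = 30.0                       # hypothetical cut-off between "tight" and "loose"

def placement_strategy(deadline_s):
    """Return 'directed' (DCS-style) for tight deadlines, 'amorphous' for loose ones."""
    return "directed" if deadline_s <= DEADLINE_THRESHOLD_S else "amorphous"

queries = {"traffic-incident": 10.0, "air-quality-trend": 600.0}
for dtype, deadline in queries.items():
    print(dtype, "->", placement_strategy(deadline))
# traffic-incident -> directed
# air-quality-trend -> amorphous
```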

Relevance:

90.00%

Publisher:

Abstract:

In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge which is in a form normally used by human operators. An architecture for process control is proposed which centres on an historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system is expressed by means of three types of rule: database updating rules, process control rules, and data deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state transition model for the application. The dividing instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
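
The qualitative temporal relationships stored in the historical database can be illustrated with a small sketch that classifies how two process occurrences relate in time, using Allen-style relation names; the process records are invented and only part of the relation set is shown.

```python
# Sketch: deriving a qualitative temporal relation between two process/event
# occurrences, in the spirit of the relative temporal knowledge the system stores.
def relation(a_start, a_end, b_start, b_end):
    """Return the qualitative relation of interval A to interval B."""
    if a_end < b_start:
        return "before"
    if a_end == b_start:
        return "meets"
    if a_start < b_start < a_end < b_end:
        return "overlaps"
    if a_start == b_start and a_end == b_end:
        return "equals"
    if b_start < a_start and a_end < b_end:
        return "during"
    return "other"                       # remaining qualitative relations, omitted for brevity

# A valve-open process and a heating process from a hypothetical historical database.
print(relation(0, 5, 5, 9))              # meets
print(relation(2, 6, 4, 9))              # overlaps
```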

Relevance:

90.00%

Publisher:

Abstract:

In this paper we propose a statistical model for detection and tracking of the human silhouette and the corresponding 3D skeletal structure in gait sequences. We follow a point distribution model (PDM) approach using Principal Component Analysis (PCA). The problem of non-linear PCA is partially resolved by applying a different PDM depending on the pose estimate (frontal, lateral or diagonal), estimated by Fisher's linear discriminant. Additionally, the fitting is carried out by selecting the closest allowable shape from the training set by means of a nearest neighbour classifier. To improve the performance of the model we develop a human gait analysis that takes temporal dynamics into account to track the human body. The incorporation of temporal constraints in the model increases reliability and robustness.
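
A standard way to enforce a PDM shape constraint is to project a candidate shape onto the principal components and clamp each coefficient, sketched below with synthetic data; this clamping is related to, but not the same as, the nearest-neighbour selection from the training set described in the paper.

```python
# Sketch of a PDM/PCA shape constraint: project onto the principal modes and
# clamp each coefficient to +/- 3 standard deviations. Training shapes are synthetic;
# a real model would be trained on silhouette landmarks.
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(size=(50, 8))                 # 50 training shapes, 4 landmarks (x, y)
mean = train.mean(axis=0)
cov = np.cov(train - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1][:3]            # keep the 3 largest modes of variation
P, lam = eigvecs[:, order], eigvals[order]

def constrain(shape):
    """Return the closest shape allowed by the PDM (clamped PCA coefficients)."""
    b = P.T @ (shape - mean)
    b = np.clip(b, -3 * np.sqrt(lam), 3 * np.sqrt(lam))
    return mean + P @ b

candidate = mean + rng.normal(scale=5.0, size=8) # an implausible, noisy fit
print(np.linalg.norm(candidate - mean), np.linalg.norm(constrain(candidate) - mean))
```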

Relevance:

90.00%

Publisher:

Abstract:

Negotiation is a vital component of electronic trading. It is the key decision-making approach used to reach consensus between trading partners. Generally, the trading partners implement various negotiation strategies in an attempt to maximize their utilities. As negotiation strategies have an impact on the outcomes of negotiation, it is imperative to have efficient negotiation strategies that truly maximize clients' utilities. In this paper, we propose a multi-attribute mobile agent-based negotiation strategy that maximizes the client's utility. The strategy focuses on one-to-many bilateral negotiation. It considers different factors that have a significant effect on the scheduling of the various negotiation phases: offer collection, evaluation, negotiation, and bid settlement. These factors include offer expiry times, market search space, communication delays, processing queues, and transportation times. We reasoned about the correctness of the proposed negotiation strategy with respect to existing negotiation strategies. The analysis showed that the proposed strategy boosts the client's utility, shortens negotiation time, and ensures an adequate market search.
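
One ingredient of such a strategy, multi-attribute offer evaluation with expiry filtering, can be sketched as follows; the attributes, weights and scoring scheme are illustrative assumptions rather than the paper's actual utility model.

```python
# Sketch: weighted additive multi-attribute utility scoring with offer-expiry filtering.
import time

WEIGHTS = {"price": 0.5, "delivery_days": 0.3, "warranty_months": 0.2}

def utility(offer):
    """Higher is better; price and delivery are costs, warranty is a benefit."""
    return (WEIGHTS["price"] * (1.0 - offer["price"] / 1000.0)
            + WEIGHTS["delivery_days"] * (1.0 - offer["delivery_days"] / 30.0)
            + WEIGHTS["warranty_months"] * (offer["warranty_months"] / 24.0))

def best_offer(offers, now=None):
    now = time.time() if now is None else now
    valid = [o for o in offers if o["expires_at"] > now]    # drop expired offers
    return max(valid, key=utility) if valid else None

offers = [
    {"seller": "s1", "price": 800, "delivery_days": 10, "warranty_months": 12, "expires_at": 2e9},
    {"seller": "s2", "price": 700, "delivery_days": 25, "warranty_months": 24, "expires_at": 0},
]
print(best_offer(offers)["seller"])                         # s2 has expired, so s1 wins
```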

Relevance:

90.00%

Publisher:

Abstract:

The paper assesses the impact of international relative prices and domestic expenditure variables on Brazil's foreign trade performance in the first half of the 1990s. It has been argued that the appreciation of the Real since 1994 has had a detrimental impact on the country's trade balance. However, using temporal precedence analysis, our results do not indicate that the trade balance is strongly affected by international relative prices, such as the exchange rate. Instead, domestic expenditure variables appear to be a more powerful determinant of the country's trade performance in recent years. Granger and error-correction causality techniques are used to determine temporal precedence between the trade balance and the exchange rate in the period under examination. Our findings shed light on the sustainability of recent exchange rate-anchored macroeconomic stabilisation programmes, a topic that has generated considerable debate among academics and practitioners.
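
The temporal-precedence test described above can be reproduced in outline with statsmodels' Granger causality test; the series below are synthetic random walks rather than the Brazilian data analysed in the paper.

```python
# Sketch: testing whether the exchange rate Granger-causes the trade balance.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(42)
n = 120                                           # ~10 years of monthly observations
exchange_rate = np.cumsum(rng.normal(size=n))     # synthetic exchange-rate series
trade_balance = np.cumsum(rng.normal(size=n))     # synthetic trade-balance series

# Column order matters: the test asks whether the 2nd column helps predict the 1st.
# Both series are differenced so the test runs on (approximately) stationary data.
data = np.column_stack([np.diff(trade_balance), np.diff(exchange_rate)])
results = grangercausalitytests(data, maxlag=4)
for lag, res in results.items():
    f_stat, p_value = res[0]["ssr_ftest"][:2]
    print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.3f}")
```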