845 results for Event-based timing
Abstract:
The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver for their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
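As a rough illustration of the runtime validation idea, the sketch below uses Python with rdflib to evaluate a SPARQL ASK constraint over a toy pedigree graph; the vocabulary and the completeness rule are placeholders assumed for the example, not the paper's actual EEM/OntoPedigree terms or SPIN rules.

```python
from rdflib import Graph

# Illustrative pedigree fragment; the terms below are placeholders,
# not the actual EEM/OntoPedigree vocabulary.
PEDIGREE_TTL = """
@prefix ex: <http://example.org/pedigree#> .
ex:event1 a ex:ObjectEvent ;
    ex:eventTime "2014-03-01T10:15:00Z" ;
    ex:epc ex:lot42 .
ex:event2 a ex:ObjectEvent ;
    ex:epc ex:lot43 .          # deliberately missing ex:eventTime
"""

# Constraint: every event in the incoming dataset must carry an event time.
CONSTRAINT = """
PREFIX ex: <http://example.org/pedigree#>
ASK { ?e a ex:ObjectEvent . FILTER NOT EXISTS { ?e ex:eventTime ?t } }
"""

g = Graph()
g.parse(data=PEDIGREE_TTL, format="turtle")
violated = g.query(CONSTRAINT).askAnswer   # True means at least one event is incomplete
print("constraint violated:", violated)
```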
Abstract:
The EPCIS specification provides an event-oriented mechanism to record product movement information across stakeholders in supply chain business processes. Besides enabling the sharing of event-based traceability datasets, track-and-trace implementations must also be equipped with the capabilities to validate integrity constraints and detect runtime exceptions without compromising the time-to-deliver schedule of the shipping and receiving parties. In this paper we present a methodology for detecting exceptions arising during the processing of EPCIS event datasets. We propose an extension to the EEM ontology for modelling EPCIS exceptions and show how runtime exceptions can be detected and reported. We exemplify and evaluate our approach on an abstraction of pharmaceutical supply chains.
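To illustrate the flavour of runtime exception detection and reporting, the following sketch finds shipping/receiving quantity mismatches with a SPARQL query and records each as an exception individual; the vocabulary and the exception class are placeholders assumed for the example, not the proposed EEM extension.

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/epcis-exception#")   # placeholder vocabulary

DATA = """
@prefix ex: <http://example.org/epcis-exception#> .
ex:ship1    a ex:ShippingEvent ;  ex:epc ex:lot7 ; ex:quantity 100 .
ex:receive1 a ex:ReceivingEvent ; ex:epc ex:lot7 ; ex:quantity 90 .
"""

QUERY = """
PREFIX ex: <http://example.org/epcis-exception#>
SELECT ?ship ?recv WHERE {
  ?ship a ex:ShippingEvent  ; ex:epc ?epc ; ex:quantity ?sent .
  ?recv a ex:ReceivingEvent ; ex:epc ?epc ; ex:quantity ?got .
  FILTER (?got != ?sent)
}
"""

g = Graph()
g.parse(data=DATA, format="turtle")
for i, row in enumerate(g.query(QUERY)):
    exc = EX[f"exception{i}"]                       # mint an exception individual
    g.add((exc, RDF.type, EX.QuantityMismatchException))
    g.add((exc, EX.raisedOn, row.recv))
    g.add((exc, EX.relatedTo, row.ship))
print(g.serialize(format="turtle"))
```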
Abstract:
Supply chains comprise complex processes spanning multiple trading partners. The various operations involved generate a large number of events that need to be integrated in order to enable internal and external traceability. Furthermore, provenance of the artifacts and agents involved in supply chain operations is now a key traceability requirement. In this paper we propose a Semantic Web/Linked Data powered framework for the event-based representation and analysis of supply chain activities governed by the EPCIS specification. We specifically show how a new EPCIS event type called "Transformation Event" can be semantically annotated using EEM (the EPCIS Event Model) to generate linked data that can be exploited for internal event-based traceability in supply chains involving transformation of products. For integrating provenance with traceability, we propose a mapping from EEM to PROV-O. We exemplify our approach on an abstraction of the production processes that are part of the wine supply chain.
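A minimal sketch of the annotation-plus-provenance idea, assuming placeholder terms rather than the actual EEM vocabulary: a transformation event is typed as a prov:Activity, its inputs become prov:used resources, and its outputs are linked back with prov:wasGeneratedBy.

```python
from rdflib import Graph, Namespace, RDF
from rdflib.namespace import PROV

EEMX = Namespace("http://example.org/eem#")   # placeholder for the EEM vocabulary

g = Graph()
g.parse(data="""
@prefix eemx: <http://example.org/eem#> .
eemx:crush1 a eemx:TransformationEvent ;
    eemx:inputEPC  eemx:grapesBatch12 ;
    eemx:outputEPC eemx:mustBatch3 .
""", format="turtle")

# Sketch of an EEM -> PROV-O style mapping: a transformation event becomes a
# prov:Activity that used its inputs and generated its outputs.
for event, _, _ in g.triples((None, RDF.type, EEMX.TransformationEvent)):
    g.add((event, RDF.type, PROV.Activity))
    for _, _, inp in g.triples((event, EEMX.inputEPC, None)):
        g.add((event, PROV.used, inp))
    for _, _, out in g.triples((event, EEMX.outputEPC, None)):
        g.add((out, PROV.wasGeneratedBy, event))

print(g.serialize(format="turtle"))
```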
Abstract:
The sharing of product and process information plays a central role in coordinating supply chain operations and is a key driver for their success. "Linked pedigrees", linked datasets that encapsulate event-based traceability information about artifacts as they move along the supply chain, provide a scalable mechanism to record and facilitate the sharing of track-and-trace knowledge among supply chain partners. In this paper we present "OntoPedigree", a content ontology design pattern for the representation of linked pedigrees, which can be specialised and extended to define domain-specific traceability ontologies. Events captured within the pedigrees are specified using EPCIS, a GS1 standard for the specification of traceability information within and across enterprises, while certification information is described using PROV, a vocabulary for modelling the provenance of resources. We exemplify the utility of OntoPedigree through linked pedigrees generated for supply chains within the perishable goods and pharmaceuticals sectors.
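The pattern can be pictured with a toy instance such as the one below, in which a pedigree references the event set it packages, the certification provenance, and the pedigree received from the upstream partner; the terms are placeholders, not the actual OntoPedigree vocabulary.

```python
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix op:   <http://example.org/ontopedigree#> .
@prefix prov: <http://www.w3.org/ns/prov#> .

op:pedigree42 a op:Pedigree ;
    op:hasStatus           op:Shipped ;
    op:hasEventSet         <http://manufacturer.example/events/batch42> ;
    op:hasReceivedPedigree op:pedigree41 ;
    op:hasCertification    [ a prov:Entity ;
                             prov:wasAttributedTo <http://regulator.example/agency> ] .
""", format="turtle")
print(len(g), "triples in the pedigree fragment")
```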
Abstract:
Central clearing and the role of central counterparties (CCPs) have gained importance in the financial sector, since CCPs are responsible for managing the counterparty risk of trading. Regulation has recently turned towards them, defining several processes for how CCPs should measure and manage their risk. A stress situation is an important term in the regulation; however, it is not clearly specified how stress should be identified. This paper provides a possible definition of a stress event based on existing risk management methodology, namely breaches ("oversteps") of a risk measure, and investigates the potential stress periods of recent years on the Hungarian stock market. According to the results, the definition needs further calibration based on the magnitude of the cross-sectional data. The paper further examines whether stress can be predicted from market liquidity. The connection between liquidity and market turmoil proved to be contrary to expectations: in the tested period, liquidity shortage was a consequence rather than a leading indicator.
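As an illustration of the kind of indicator the abstract describes, the sketch below counts cross-sectional "oversteps" of a rolling historical Value-at-Risk estimate; the window, confidence level and breach quorum are illustrative assumptions, not the paper's calibration.

```python
import pandas as pd

def stress_days(returns, window=250, alpha=0.01, breach_quorum=0.5):
    """Flag days on which an unusually large fraction of stocks breach
    ("overstep") their rolling historical VaR estimate.

    returns : DataFrame of daily returns, rows = days, columns = stocks.
    The window, confidence level and quorum are illustrative choices.
    """
    var = returns.rolling(window).quantile(alpha).shift(1)   # yesterday's VaR forecast
    breach = returns < var                                   # today's loss worse than forecast
    breach_fraction = breach.mean(axis=1)                    # fraction of stocks breaching
    return breach_fraction[breach_fraction > breach_quorum].index
```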
Abstract:
A Partial Wave Analysis (PWA) of γp → Δ++X → pπ+π−(η) data taken with the CLAS detector at Jefferson Lab is presented in this work. This reaction is of interest because the Δ++ restricts the isospin of the possible X states, leaving the PWA with a smaller combination of partial waves and making it ideal for searching for exotic mesons. It was proposed by Isgur and Paton that photoproduction is a plausible source of the J^PC = 1^-+ state through flux-tube excitation. The π1(1400) is such a state; it has been produced in hadron production but has yet to be seen in photoproduction. A mass-independent amplitude analysis of this channel was performed, followed by a mass-dependent fit to extract the resonance parameters. The procedure used an event-based maximum likelihood method to maintain all correlations in the kinematics. The intensity and phase motion are mapped out for the contributing signals without requiring assumptions about the underlying processes. The strength of the PWA lies in the analysis of the phase motion, which for resonance behavior is well defined. In the data presented, the ηπ− invariant mass spectrum shows contributions from the a0(980) and a2(1320) partial waves. No π1 was observed under a clear a2 signal after the angular distributions of the decay products were analyzed using an amplitude analysis. In addition, this dissertation discusses trends in the data, along with the implemented techniques.
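The event-based maximum likelihood mentioned above can be sketched schematically: for production strengths V_k and per-event partial-wave amplitudes A_k(x_i), one minimises an extended negative log-likelihood built from the per-event intensity and a Monte-Carlo normalisation. The code below is a generic numpy illustration under these assumptions, not the CLAS analysis code.

```python
import numpy as np

def negative_log_likelihood(V, amps, norm):
    """Extended event-based negative log-likelihood for a set of partial waves.

    V     : complex production strengths, shape (n_waves,)
    amps  : complex amplitudes A_k evaluated at each event's kinematics,
            shape (n_waves, n_events)
    norm  : Monte-Carlo normalisation integrals Psi_kl ~ <A_k A_l*>,
            shape (n_waves, n_waves)
    """
    intensity = np.abs(np.einsum("k,ki->i", V, amps)) ** 2          # I(x_i) per event
    expected = np.real(np.einsum("k,kl,l->", V, norm, np.conj(V)))  # predicted yield
    return -np.sum(np.log(intensity)) + expected

# toy usage: two waves, ten events with random amplitudes
rng = np.random.default_rng(0)
amps = rng.normal(size=(2, 10)) + 1j * rng.normal(size=(2, 10))
norm = amps @ amps.conj().T / amps.shape[1]
print(negative_log_likelihood(np.array([1.0 + 0j, 0.5j]), amps, norm))
```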
Abstract:
We argue that considering transitions at the same level as states, as first-class citizens, is advantageous in many cases. Namely, the use of atomic propositions on transitions, as well as on states, allows temporal formulas and strategies to be more powerful, general, and meaningful. We define egalitarian structures and logics, and show how they generalize well-known state-based, event-based, and mixed ones. We present translations from egalitarian to non-egalitarian settings that, in particular, allow the model checking of LTLR formulas using Maude’s LTL model checker. We have implemented these translations as a prototype in Maude itself.
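One common way to reduce transition propositions to state propositions, in the spirit of the translations mentioned above, is to split every labelled transition through a fresh intermediate state carrying its propositions, after which a purely state-based model checker can be applied. The Python sketch below illustrates this idea on a toy representation; the paper's actual translations target Maude and LTLR and are more involved.

```python
def split_transitions(states, trans):
    """states: {state: set(props)}; trans: [(src, set(props), dst)].

    Returns a state-based structure in which each original transition is
    replaced by a fresh state labelled with that transition's propositions.
    """
    new_states = dict(states)
    new_trans = []
    for i, (src, props, dst) in enumerate(trans):
        mid = f"t{i}"                  # fresh state standing for the transition
        new_states[mid] = set(props)   # transition propositions now hold on a state
        new_trans.append((src, mid))
        new_trans.append((mid, dst))
    return new_states, new_trans

# toy usage: one transition labelled with proposition "send"
print(split_transitions({"s0": {"idle"}, "s1": {"busy"}},
                        [("s0", {"send"}, "s1")]))
```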
Abstract:
Oxygen and carbon isotope analyses were performed on monospecific or mixed-species samples of benthic foraminifers, as well as on the planktonic species Globigerinoides ruber, from a 24-m hydraulic piston core raised on the western flank of the Rio Grande Rise, at DSDP Site 517 (30°56.81'S and 38°02.47'W, water depth 2963 m) in the southwestern Atlantic. This site is presently located in the core of North Atlantic Deep Water (NADW). This is the first long isotopic record of Quaternary benthic foraminifers; it displays at least 30 isotopic stages, 25 of them readily correlated with the standard sequence of Pacific Core V28-239. The depths of both the Brunhes/Matuyama boundary and the Jaramillo Event based on oxygen isotope stratigraphy agree well with paleomagnetic results. Quaternary faunal data from this part of the Atlantic are dated through isotopic stratigraphy and partially contradict data previously published by Williams and Ledbetter (1979). There was a substantial increase in the size of the Earth's major ice sheets culminating at Stage 22, corresponding to a 1 per mil progressive increase in maximal δ18O values. Further, ice volume-induced isotopic changes were not identical for different glacial cycles. Oxygen and carbon isotope analyses of benthic foraminifers show that during Pleistocene glacial episodes NADW was cooler than today and that Mediterranean outflow might still have contributed to the NADW sources. The comparison of coiling ratio changes of Globorotalia truncatulinoides with planktonic and benthic oxygen isotope records shows that there might have been southward excursions of the Brazil Current during the Pleistocene, perhaps related to Antarctic surface water surges. The question of the location of NADW sources during glacial maxima remains open.
Abstract:
Internet users consume online targeted advertising based on information collected about them and voluntarily share personal information in social networks. Sensor information and data from smartphones are collected and used by applications, sometimes in unclear ways. As happens today with smartphones, in the near future sensors will be shipped in all types of connected devices, enabling ubiquitous information gathering from the physical environment and realising the vision of Ambient Intelligence. The value of gathered data, if not obvious, can be harnessed through data mining techniques and put to use by enabling personalized and tailored services as well as business intelligence practices, fueling the digital economy. However, the ever-expanding gathering and use of information undermines the privacy conceptions of the past. Natural social practices of managing privacy in daily relations are overridden by socially awkward communication tools, service providers struggle with security issues resulting in harmful data leaks, governments use mass surveillance techniques, the incentives of the digital economy threaten consumer privacy, and the advancement of consumer-grade data-gathering technology enables new inter-personal abuses. A wide range of fields attempts to address technology-related privacy problems; however, they vary immensely in terms of assumptions, scope and approach. Privacy in future use cases is typically handled vertically, instead of building upon previous work that can be re-contextualized, while current privacy problems are typically addressed per type in a more focused way. Because significant effort was required to make sense of the relations and structure of privacy-related work, this thesis attempts to convey a structured view of it. It is multi-disciplinary, ranging from cryptography to economics and including distributed systems and information theory, and addresses privacy issues of different natures. As existing work is framed and discussed, the contributions to the state of the art made in the scope of this thesis are presented. The contributions add to five distinct areas: 1) identity in distributed systems; 2) future context-aware services; 3) event-based context management; 4) low-latency information flow control; 5) high-dimensional dataset anonymity. Finally, having laid out such a landscape of privacy-preserving work, current and future privacy challenges are discussed, considering not only technical but also socio-economic perspectives.
Abstract:
Gap junction coupling is ubiquitous in the brain, particularly between the dendritic trees of inhibitory interneurons. Such direct non-synaptic interaction allows for direct electrical communication between cells. Unlike spike-time driven synaptic neural network models, which are event-based, any model with gap junctions must necessarily involve a single-neuron model that can represent the shape of an action potential. Indeed, not only do neurons communicating via gap junctions feel super-threshold spikes, but they also experience, and respond to, sub-threshold voltage signals. In this chapter we show that the so-called absolute integrate-and-fire model is ideally suited to such studies. At the single-neuron level, voltage traces for the model may be obtained in closed form and are shown to mimic those of fast-spiking inhibitory neurons. Interestingly, in the presence of a slow spike adaptation current the model is shown to support periodic bursting oscillations. For both tonic and bursting modes the phase response curve can be calculated in closed form. At the network level we focus on global gap junction coupling and show how to analyze the asynchronous firing state in large networks. Importantly, we are able to determine the emergence of non-trivial network rhythms due to strong coupling instabilities. To illustrate the use of our theoretical techniques (particularly the phase-density formalism used to determine stability) we focus on a spike adaptation induced transition from asynchronous tonic activity to synchronous bursting in a gap-junction coupled network.
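A minimal sketch of the setting described above, assuming the absolute integrate-and-fire form dv/dt = |v| + I with threshold and reset, plus linear all-to-all gap-junction coupling; the parameter values are illustrative, not those of the chapter.

```python
import numpy as np

def simulate_aif_gap(T=200.0, dt=0.01, n=2, I=0.1, g_gap=0.05,
                     v_th=1.0, v_reset=-1.0):
    """Euler simulation of absolute integrate-and-fire neurons,
        dv_i/dt = |v_i| + I + g_gap * sum_j (v_j - v_i),
    with threshold v_th and reset v_reset (illustrative parameters)."""
    steps = int(T / dt)
    v = np.full(n, v_reset, dtype=float)
    spikes = [[] for _ in range(n)]
    trace = np.empty((steps, n))
    for k in range(steps):
        coupling = g_gap * (v.sum() - n * v)      # all-to-all gap-junction current
        v += dt * (np.abs(v) + I + coupling)
        fired = v >= v_th
        for i in np.where(fired)[0]:
            spikes[i].append(k * dt)
        v[fired] = v_reset
        trace[k] = v
    return trace, spikes

trace, spikes = simulate_aif_gap()
print("spike times of neuron 0:", spikes[0][:5])
```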
Abstract:
Master's dissertation, Cognitive Neurosciences and Neuropsychology, Faculdade de Ciências Humanas e Sociais, Universidade do Algarve, 2016
Abstract:
The first goal of this study is to analyse a real-world multiproduct onshore pipeline system in order to verify its hydraulic configuration and operational feasibility by constructing a simulation model, step by step, from its elementary building blocks, so that the operation of the real system is reproduced as precisely as possible. The second goal is to develop this simulation model into a user-friendly tool that can be used to find an "optimal" or "best" product batch schedule for a one-year time period. Such a batch schedule could change dynamically as perturbations occur during operation that influence the behaviour of the entire system. The result of the simulation, the "best" batch schedule, is the one that minimizes the operational costs in the system. The costs involved in the simulation are inventory costs, interface costs, pumping costs, and penalty costs assigned to any unforeseen situations. The key factor determining the performance of the simulation model is the way time is represented. In our model an event-based discrete time representation is selected as most appropriate for our purposes: the time horizon is divided into intervals of unequal length based on events that change the state of the system. These events are the arrivals/departures of the tanker ships, the openings and closures of loading/unloading valves of storage tanks at both terminals, and the arrivals/departures of trains/trucks at the Delivery Terminal. In the feasibility study we analyse the system's operational performance with different Head Terminal storage capacity configurations. For these alternative configurations we evaluate the effect of tanker ship delays of different magnitudes on the number of critical events and product interfaces generated, on the duration of pipeline stoppages, on the satisfaction of product demand, and on the operating costs. Based on the results and the bottlenecks identified, we propose modifications to the original setup.
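The event-based time representation described above can be sketched as a next-event scheduler: the clock advances from one state-changing event (ship arrival, valve opening, train departure, and so on) to the next rather than in fixed steps. The Python fragment below is an illustrative skeleton under those assumptions, not the study's simulation tool.

```python
import heapq
import itertools

_tick = itertools.count()   # tie-breaker so simultaneous events compare cleanly

def schedule(queue, time, kind, payload=None):
    heapq.heappush(queue, (time, next(_tick), kind, payload))

def run(queue, handlers):
    """Advance the clock from event to event. handlers maps an event kind
    (e.g. 'ship_arrival', 'valve_open', 'train_departure') to a function
    returning newly scheduled (time, kind, payload) tuples."""
    clock = 0.0
    while queue:
        clock, _, kind, payload = heapq.heappop(queue)
        for t, k, p in handlers[kind](clock, payload):
            schedule(queue, t, k, p)
    return clock

# toy usage: a ship arrival schedules an unloading-complete event 6 hours later
queue = []
schedule(queue, 2.0, "ship_arrival", {"cargo": 1000})
handlers = {
    "ship_arrival": lambda t, p: [(t + 6.0, "unloading_done", p)],
    "unloading_done": lambda t, p: [],
}
print("simulation ended at t =", run(queue, handlers))
```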
Abstract:
With organisations facing significant challenges to remain competitive, Business Process Improvement (BPI) initiatives are often conducted to improve the efficiency and effectiveness of their business processes, focussing on time, cost, and quality improvements. Event logs, which contain a detailed record of business operations over a certain time period as recorded by an organisation's information systems, are the first step towards initiating evidence-based BPI activities. Given an (original) event log as a starting point, an approach was developed to explore better ways to execute a business process, resulting in an improved (perturbed) event log. Identifying the differences between the original event log and the perturbed event log can provide valuable insights, helping organisations to improve their processes. However, there is a lack of automated techniques to detect the differences between two event logs. Therefore, this research aims to develop visualisation techniques that provide targeted analysis of resource reallocation and activity rescheduling. The differences between two event logs are first identified, and the changes between them are then conceptualised and realised with a number of visualisations. With the proposed visualisations, analysts will be able to identify the changes related to resources and time, resulting in a more efficient business process. Ultimately, analysts can make use of this comparative information to initiate evidence-based BPI activities.
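A rough sketch of the kind of comparison the abstract describes, assuming both logs are tables with case, activity, resource and start-time columns (hypothetical column names); it surfaces resource reallocations and activity reschedulings between an original and a perturbed log.

```python
import pandas as pd

def diff_event_logs(original, perturbed):
    """Compare two event logs on (case, activity) pairs and report resource
    reallocations and start-time shifts. Assumed columns for the sketch:
    case_id, activity, resource, start (datetime)."""
    key = ["case_id", "activity"]
    merged = original.merge(perturbed, on=key, suffixes=("_orig", "_pert"))
    reallocated = merged.loc[merged["resource_orig"] != merged["resource_pert"],
                             key + ["resource_orig", "resource_pert"]]
    merged["shift"] = merged["start_pert"] - merged["start_orig"]
    rescheduled = merged.loc[merged["shift"] != pd.Timedelta(0),
                             key + ["start_orig", "start_pert", "shift"]]
    return reallocated, rescheduled
```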
Abstract:
The Tokai to Kamioka (T2K) long-baseline neutrino experiment consists of a muon neutrino beam produced at the J-PARC accelerator, a near detector complex, and a large far detector located 295 km away. The present work utilizes the T2K event timing measurements at the near and far detectors to study neutrino time of flight as a function of derived neutrino energy. Under the assumption of a relativistic relation between energy and time of flight, constraints on the neutrino rest mass can be derived. The sub-GeV neutrino beam, in conjunction with timing precision of order tens of ns, provides sensitivity to neutrino mass in the few MeV/c^2 range. We study the distribution of relative arrival times of muon and electron neutrino candidate events at the T2K far detector as a function of neutrino energy. The 90% C.L. upper limit on the mixture of neutrino mass eigenstates represented in the data sample is found to be m^2 < 5.6 MeV^2/c^4.
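The relativistic relation referred to above can be made explicit: a neutrino of mass m and energy E arrives later than a massless particle over the baseline L by approximately the amount below (standard kinematics, not a result specific to this work).

```latex
\[
  \Delta t \;=\; \frac{L}{c}\left(\frac{1}{\beta}-1\right)
          \;=\; \frac{L}{c}\left(\frac{E}{\sqrt{E^{2}-m^{2}c^{4}}}-1\right)
          \;\approx\; \frac{L}{2c}\,\frac{m^{2}c^{4}}{E^{2}} .
\]
```

For L = 295 km and E of a few hundred MeV, a mass-squared of a few MeV^2/c^4 corresponds to delays of roughly ten nanoseconds, which is consistent with the quoted timing precision.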