949 results for event based


Relevance: 60.00%

Abstract:

The sharing of product and process information plays a central role in coordinating supply chain operations and is a key driver of their success. "Linked pedigrees", linked datasets that encapsulate event-based traceability information about artifacts as they move along the supply chain, provide a scalable mechanism to record and share track-and-trace knowledge among supply chain partners. In this paper we present "OntoPedigree", a content ontology design pattern for the representation of linked pedigrees that can be specialised and extended to define domain-specific traceability ontologies. Events captured within the pedigrees are specified using EPCIS, a GS1 standard for the specification of traceability information within and across enterprises, while certification information is described using PROV, a W3C vocabulary for modelling the provenance of resources. We exemplify the utility of OntoPedigree with linked pedigrees generated for supply chains in the perishable goods and pharmaceutical sectors.
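As a rough illustration of the linked-pedigree idea, the sketch below chains plain Python dicts: each pedigree record carries an EPCIS-style event and a PROV-style link to the pedigree it was derived from. All field names and identifiers are invented for illustration and are not the actual OntoPedigree vocabulary.

```python
# Hedged sketch (invented field names, not the real OntoPedigree terms):
# one pedigree record per custody step, each linking its EPCIS-style event
# and the previous pedigree, forming the "linked" chain.
def make_pedigree(ped_id, epcis_event, derived_from=None):
    return {
        "id": ped_id,
        "event": epcis_event,                 # EPCIS-style event payload
        "prov:wasDerivedFrom": derived_from,  # PROV-style link to upstream pedigree
    }

e1 = {"type": "ObjectEvent", "action": "ADD",
      "epc": "urn:epc:id:sgtin:0614141.107346.2017"}
e2 = {"type": "ObjectEvent", "action": "OBSERVE",
      "epc": "urn:epc:id:sgtin:0614141.107346.2017"}

p1 = make_pedigree("ped-1", e1)
p2 = make_pedigree("ped-2", e2, derived_from="ped-1")

# Walk the provenance chain back from the latest pedigree
chain = [p2["id"], p2["prov:wasDerivedFrom"]]
print(chain)
```

In a real deployment these records would be RDF resources published as linked data, so that partners can dereference and traverse the chain across organisational boundaries.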

Relevance: 60.00%

Abstract:

Central clearing and the role of central counterparties (CCPs) have gained importance in the financial sector, since CCPs are tasked with managing the counterparty risk of trading. Regulation has recently turned towards them, defining several processes for how CCPs should measure and manage their risk. Stress situations are an important term in the regulation; however, it is not clearly specified how stress should be identified. This paper provides a possible definition of a stress event based on existing risk management methodology, namely the occurrence of risk measure oversteps, and investigates the potential stress periods of recent years on the Hungarian stock market. According to the results, the definition needs further calibration based on the magnitude of the cross-sectional data. The paper furthermore examines whether stress can be predicted from market liquidity. The connection between liquidity and market turmoil proved to be contrary to expectations: in the tested period, liquidity shortage was a consequence rather than a predictor of stress.
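The "risk measure overstep" notion can be sketched numerically: flag days whose realised loss exceeds a Value-at-Risk threshold, and treat days with unusually many joint breaches across the cross-section as stress candidates. The data, the VaR method, and the breach threshold below are all illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Toy illustration of counting risk-measure oversteps (simulated returns,
# historical-simulation VaR, invented threshold of 3 joint breaches).
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.02, size=(500, 10))   # 500 days, 10 stocks

alpha = 0.99
var = np.quantile(-returns, alpha, axis=0)        # per-stock loss threshold

breaches = (-returns) > var                       # day x stock breach matrix
breach_count = breaches.sum(axis=1)               # cross-sectional breach count

stress_days = np.where(breach_count >= 3)[0]      # toy stress criterion
print(len(stress_days))
```

A production definition would use rolling estimation windows and, as the paper notes, calibrate the cross-sectional threshold to the market in question.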

Relevance: 60.00%

Abstract:

A partial wave analysis (PWA) of γp → Δ++X → pπ+π−(η) data taken with the CLAS detector at Jefferson Lab is presented in this work. This reaction is of interest because the Δ++ restricts the isospin of the possible X states, leaving the PWA with a smaller combination of partial waves and making it ideal for searching for exotic mesons. Isgur and Paton proposed that photoproduction is a plausible source of the J^PC = 1^-+ state through flux-tube excitation. The π1(1400) is such a state; it has been produced in hadron production but has yet to be seen in photoproduction. A mass-independent amplitude analysis of this channel was performed, followed by a mass-dependent fit to extract the resonance parameters. The procedure used an event-based maximum likelihood method to maintain all correlations in the kinematics. The intensity and phase motion are mapped out for the contributing signals without requiring assumptions about the underlying processes. The strength of the PWA lies in the analysis of the phase motion, which is well defined for resonance behaviour. In the data presented, the ηπ− invariant mass spectrum shows contributions from the a0(980) and a2(1320) partial waves. No π1 was observed under a clear a2 signal after the angular distributions of the decay products were analysed using an amplitude analysis. In addition, this dissertation discusses trends in the data, along with the implemented techniques.
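The key feature of an event-based (unbinned) maximum likelihood fit is that every event enters the likelihood individually, rather than through histogram bins, so kinematic correlations survive. The toy below fits a single Gaussian "line shape" to simulated masses; the real analysis fits interfering partial-wave amplitudes, not this shape, and all numbers here are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy unbinned maximum likelihood fit: minimise the negative sum of
# per-event log densities (simulated data, illustrative parameters).
rng = np.random.default_rng(1)
events = rng.normal(1.32, 0.05, size=2000)   # toy invariant masses (GeV)

def nll(params):
    mean, width = params
    if width <= 0:
        return np.inf
    return -np.sum(norm.logpdf(events, mean, width))

fit = minimize(nll, x0=[1.2, 0.1], method="Nelder-Mead")
mean_hat, width_hat = fit.x
print(mean_hat, width_hat)
```

Because no binning is involved, the same machinery extends to multi-dimensional kinematics where histogramming would be hopeless.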

Relevance: 60.00%

Abstract:

We argue that considering transitions at the same level as states, as first-class citizens, is advantageous in many cases. Namely, the use of atomic propositions on transitions, as well as on states, allows temporal formulas and strategies to be more powerful, general, and meaningful. We define egalitarian structures and logics, and show how they generalize well-known state-based, event-based, and mixed ones. We present translations from egalitarian to non-egalitarian settings that, in particular, allow the model checking of LTLR formulas using Maude’s LTL model checker. We have implemented these translations as a prototype in Maude itself.

Relevance: 60.00%

Abstract:

Oxygen and carbon isotope analyses were performed on monospecific or mixed-species samples of benthic foraminifers, as well as on the planktonic species Globigerinoides ruber, from a 24-m hydraulic piston core raised on the western flank of the Rio Grande Rise, at DSDP Site 517 (30°56.81'S and 38°02.47'W, water depth 2963 m) in the southwestern Atlantic. This site is presently located in the core of North Atlantic Deep Water (NADW). This is the first long isotopic record of Quaternary benthic foraminifers; it displays at least 30 isotopic stages, 25 of them readily correlated with the standard sequence of Pacific Core V28-239. The depths of both the Brunhes/Matuyama boundary and the Jaramillo Event based on oxygen isotope stratigraphy agree well with paleomagnetic results. Quaternary faunal data from this part of the Atlantic are dated through isotopic stratigraphy and partially contradict data previously published by Williams and Ledbetter (1979). There was a substantial increase in the size of the earth's major ice sheets culminating at Stage 22 and corresponding to a 1 per mil progressive increase of δ18O maximum values. Further, ice volume-induced isotopic changes were not identical for different glacial cycles. Oxygen and carbon isotope analyses of benthic foraminifers show that during Pleistocene glacial episodes, NADW was cooler than today and that Mediterranean outflow might still have contributed to the NADW sources. The comparison of coiling ratio changes of Globorotalia truncatulinoides with planktonic and benthic oxygen isotope records shows that there might have been southward excursions of the Brazil Current during the Pleistocene, perhaps related to Antarctic surface water surges. The question of the location of NADW sources during glacial maxima remains open.

Relevance: 60.00%

Abstract:

Internet users consume online targeted advertising based on information collected about them and voluntarily share personal information in social networks. Sensor information and data from smartphones are collected and used by applications, sometimes in unclear ways. As happens today with smartphones, in the near future sensors will be shipped in all types of connected devices, enabling ubiquitous information gathering from the physical environment and realising the vision of Ambient Intelligence. The value of the gathered data, if not obvious, can be harnessed through data mining techniques and put to use by enabling personalised and tailored services as well as business intelligence practices, fuelling the digital economy. However, the ever-expanding gathering and use of information undermines the privacy conceptions of the past. Natural social practices of managing privacy in daily relations are overridden by socially awkward communication tools, service providers struggle with security issues resulting in harmful data leaks, governments use mass surveillance techniques, the incentives of the digital economy threaten consumer privacy, and the advancement of consumer-grade data-gathering technology enables new inter-personal abuses. A wide range of fields attempts to address technology-related privacy problems; however, they vary immensely in terms of assumptions, scope and approach. Privacy in future use cases is typically handled vertically, instead of building upon previous work that can be re-contextualised, while current privacy problems are typically addressed per type in a more focused way. Because significant effort was required to make sense of the relations and structure of privacy-related work, this thesis attempts to transmit a structured view of it. It is multi-disciplinary, from cryptography to economics, including distributed systems and information theory, and addresses privacy issues of different natures.
As existing work is framed and discussed, the contributions to the state-of-the-art made in the scope of this thesis are presented. The contributions add to five distinct areas: 1) identity in distributed systems; 2) future context-aware services; 3) event-based context management; 4) low-latency information flow control; 5) high-dimensional dataset anonymity. Finally, having laid out such a landscape of privacy-preserving work, current and future privacy challenges are discussed, considering not only technical but also socio-economic perspectives.

Relevance: 60.00%

Abstract:

Gap junction coupling is ubiquitous in the brain, particularly between the dendritic trees of inhibitory interneurons. Such direct non-synaptic interaction allows for direct electrical communication between cells. Unlike spike-time driven synaptic neural network models, which are event based, any model with gap junctions must necessarily involve a single neuron model that can represent the shape of an action potential. Indeed, not only do neurons communicating via gaps feel super-threshold spikes, but they also experience, and respond to, sub-threshold voltage signals. In this chapter we show that the so-called absolute integrate-and-fire model is ideally suited to such studies. At the single neuron level voltage traces for the model may be obtained in closed form, and are shown to mimic those of fast-spiking inhibitory neurons. Interestingly in the presence of a slow spike adaptation current the model is shown to support periodic bursting oscillations. For both tonic and bursting modes the phase response curve can be calculated in closed form. At the network level we focus on global gap junction coupling and show how to analyze the asynchronous firing state in large networks. Importantly, we are able to determine the emergence of non-trivial network rhythms due to strong coupling instabilities. To illustrate the use of our theoretical techniques (particularly the phase-density formalism used to determine stability) we focus on a spike adaptation induced transition from asynchronous tonic activity to synchronous bursting in a gap-junction coupled network.

Relevance: 60.00%

Abstract:

Master's Dissertation, Cognitive Neuroscience and Neuropsychology, Faculdade de Ciências Humanas e Sociais, Universidade do Algarve, 2016

Relevance: 60.00%

Abstract:

The first goal of this study is to analyse a real-world multiproduct onshore pipeline system in order to verify its hydraulic configuration and operational feasibility by constructing a simulation model, step by step, from its elementary building blocks, so that the operation of the real system is reproduced as precisely as possible. The second goal is to develop this simulation model into a user-friendly tool that one could use to find an "optimal" or "best" product batch schedule for a one-year time period. Such a batch schedule could change dynamically as perturbations occur during operation that influence the behaviour of the entire system. The result of the simulation, the "best" batch schedule, is the one that minimises the operational costs of the system. The costs involved in the simulation are inventory costs, interface costs, pumping costs, and penalty costs assigned to any unforeseen situations. The key factor determining the performance of the simulation model is the way time is represented. In our model an event-based discrete time representation is selected as most appropriate for our purposes. This means that the time horizon is divided into intervals of unequal length based on events that change the state of the system. These events are the arrivals/departures of the tanker ships, the openings and closures of the loading/unloading valves of storage tanks at both terminals, and the arrivals/departures of trains/trucks at the Delivery Terminal. In the feasibility study we analyse the system's operational performance with different Head Terminal storage capacity configurations. For these alternative configurations we evaluate the effect of tanker ship delays of different magnitudes on the number of critical events and product interfaces generated, on the duration of pipeline stoppages, on the satisfaction of product demand, and on the operating costs. Based on the results and the bottlenecks identified, we propose modifications to the original setup.
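The event-based time representation described above is the classic discrete-event pattern: keep a priority queue of timestamped events and let the clock jump from one event to the next instead of ticking at a fixed step. The sketch below shows only that skeleton; the event names and times are invented, not taken from the study.

```python
import heapq

# Minimal discrete-event loop: the horizon is a priority queue of
# (time, event) pairs, processed in chronological order.
events = []  # (time_in_hours, description)
heapq.heappush(events, (5.0, "tanker arrival at Head Terminal"))
heapq.heappush(events, (2.0, "open loading valve, tank T1"))
heapq.heappush(events, (9.5, "train departure at Delivery Terminal"))

clock = 0.0
log = []
while events:
    clock, what = heapq.heappop(events)   # advance the clock to the next event
    log.append((clock, what))

print(log[0])
```

In a full simulator, handling one event would update system state (tank levels, pipeline batches) and usually push new future events onto the queue, which is how the unequal-length intervals arise naturally.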

Relevance: 60.00%

Abstract:

This research focuses on the modes of production and reception of theatricality in contemporary performative practices with aesthetic aims. In particular, it investigates practices that, within performátic ecosystems, design action by means of strategies and devices that theatricalise the event through immersive, co-participatory models, intervening in the semio-cognitive mechanisms of the spectator's interpretation. The concept of performátic ecosystems makes it possible to single out the different semiotic formations that emerge from the performative continuum of the semiosphere, capturing the ecological and evolutionary relations established diachronically among theatrical forms. It is above all the transformations that are grasped, restoring to semiotic analysis an image of the performátic arts that is dynamic and rooted in culture and society, and of the ways in which the basic mechanisms of theatricality take shape. Using a cognitive-ecological ethnographic approach, the study addresses corporeality and regimes of presence, introducing into the relational analysis the concept of emplacement to complement the notion of embodiment. An autopoietic model of enunciation as an act of showing is also elaborated, based on the metaphor of "conversation". In the ecology of the performátic environment, an "interactive field" is created between actor and spectator, in which the theatrical enunciation takes place. Through case studies, the research shows how immersive co-participatory experiences unhinge and reconfigure the set of norms and usages naturalised in the Western theatrical tradition of drama. Finally, the relation between frontality and immersivity is conceived not as an opposition between contraries but as a relation of continuity, a constant of performátic discourse subject to manifold gradations.
The relation between actor and spectator is an interaction, a dialogue, played out not on the frontality/immersivity axis but on that of interactivity/non-interactivity, from whose articulation emerge the different and shifting theatrical forms that populate, and will populate, performátic ecosystems.

Relevance: 60.00%

Abstract:

Spiking Neural Networks (SNNs) are bio-inspired Artificial Neural Networks (ANNs) utilizing discrete spiking signals, akin to neuron communication in the brain, making them ideal for real-time and energy-efficient Cyber-Physical Systems (CPSs). This thesis explores their potential in Structural Health Monitoring (SHM), leveraging low-cost MEMS accelerometers for early damage detection in motorway bridges. The study focuses on Long Short-Term SNNs (LSNNs), although their complex learning processes pose challenges. A comparison of LSNNs with other ANN models and training algorithms for SHM indicates that LSNNs are effective for damage identification, comparable to ANNs trained using traditional methods. Additionally, an optimized embedded LSNN implementation demonstrates a 54% reduction in execution time, but with longer pre-processing due to spike-based encoding. Furthermore, SNNs are applied to UAV obstacle avoidance, trained directly using a Reinforcement Learning (RL) algorithm with event-based input from a Dynamic Vision Sensor (DVS). Performance evaluation against Convolutional Neural Networks (CNNs) highlights SNNs' superior energy efficiency, showing a 6x decrease in energy consumption. The study also investigates the latency and throughput of embedded SNN implementations in real-world deployments, emphasizing their potential for energy-efficient monitoring systems. This research contributes to advancing SHM and UAV obstacle avoidance through SNNs' efficient information processing and decision-making capabilities within CPS domains.
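The spike-based encoding mentioned as a pre-processing cost can be illustrated with delta modulation, one common way to turn a sampled sensor trace into ON/OFF events: emit a spike whenever the signal has moved by more than a threshold since the last spike. The thesis does not specify this exact scheme, and the threshold and signal below are illustrative.

```python
import numpy as np

# Delta-modulation encoding: turn a sampled trace into signed spike events
# (sample_index, +1/-1), one per threshold crossing of the running reference.
def delta_encode(signal, threshold=0.1):
    spikes = []
    ref = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        while x - ref >= threshold:     # signal rose by a full threshold step
            ref += threshold
            spikes.append((i, +1))
        while ref - x >= threshold:     # signal fell by a full threshold step
            ref -= threshold
            spikes.append((i, -1))
    return spikes

t = np.linspace(0, 1, 200)
trace = np.sin(2 * np.pi * 3 * t)       # toy accelerometer-like trace
events = delta_encode(trace)
print(len(events))
```

This is also essentially how a DVS produces events from brightness changes, which is why SNNs pair naturally with such sensors.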

Relevance: 40.00%

Abstract:

Reporter genes are routinely used in every molecular and cellular biology laboratory for studying heterologous gene expression and general cell-biological mechanisms, such as transfection processes. Although well characterized and broadly implemented, reporter genes present serious limitations, either by involving time-consuming procedures or by having possible side effects on the expression of the heterologous gene or even on general cellular metabolism. Fourier transform mid-infrared (FT-MIR) spectroscopy was evaluated to analyze simultaneously, in a rapid (minutes) and high-throughput mode (using 96-well microplates), the transfection efficiency and the effect of the transfection process on the host cell's biochemical composition and metabolism. Semi-adherent HEK and adherent AGS cell lines, transfected with the plasmid pVAX-GFP using Lipofectamine, were used as model systems. Good partial least squares (PLS) models were built to estimate the transfection efficiency, either considering each cell line independently (R² ≥ 0.92; RMSECV ≤ 2%) or considering both cell lines simultaneously (R² = 0.90; RMSECV = 2%). Additionally, the effect of the transfection process on the biochemical and metabolic features of HEK cells could be evaluated directly from the FT-MIR spectra. Due to the high sensitivity of the technique, it was also possible to discriminate the effect of the transfection process from that of the transfection reagent on HEK cells, e.g., by the analysis of spectral biomarkers and biochemical and metabolic features. The present results go far beyond what any reporter gene assay or other specific probe can offer for these purposes.

Relevance: 40.00%

Abstract:

Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obfuscating distinctions in individual performance and brain mechanisms that are better characterised by inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. So far this has typically been based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at the average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Meaningful sounds of common objects were used for an auditory target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: The results show that the occurrence of each map is structured in time and consistent across trials, both at the single-subject and at the group level. Conducting separate analyses of ERPs at the single-subject and group levels, we could quantify the consistency of the identified topographies and their time course of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level.
Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it gives the possibility to statistically evaluate single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding the interdependencies of behaviour and brain activity at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
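The two-step analysis can be sketched in miniature: fit a mixture of Gaussians to single-trial scalp topographies (one 64-channel vector per time point), then read the per-sample responsibilities as conditional probabilities that each template map is active. The data below are simulated, and the reference's actual model [3] may differ in covariance structure and fitting details.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Step 1: fit template topographies; step 2: per-sample map probabilities.
rng = np.random.default_rng(3)
n_channels, n_samples = 64, 500
maps = rng.normal(size=(3, n_channels))                  # 3 latent template maps
labels = rng.integers(0, 3, size=n_samples)
X = maps[labels] + 0.3 * rng.normal(size=(n_samples, n_channels))

gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
resp = gmm.predict_proba(X)          # conditional probability of each map
dominant = resp.argmax(axis=1)       # most probable map per time point
print(resp.shape)
```

Statistics on `resp` across trials, time points, and conditions then give exactly the kind of inference on topography structure the abstract describes.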

Relevance: 40.00%

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis. These challenges include survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data. The Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Both measurement errors in spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of baseline hazard most. The design-based estimates based on data from respondents to all waves of interest and weighted by the last wave weights displayed the largest bias. Using all the available data, including the spells by attriters until the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazard model estimators. The study discusses implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
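The IPCW idea applied to the Kaplan-Meier estimator can be illustrated with a hand-rolled weighted version: each spell's contribution is inflated by the inverse of its probability of remaining uncensored, and with unit weights the estimator reduces to the ordinary Kaplan-Meier. The spells and weights below are made up; in practice the weights come from a fitted censoring model.

```python
import numpy as np

# Weighted Kaplan-Meier: survival drops at each event time by the weighted
# deaths over the weighted number at risk.
def weighted_km(time, event, weight):
    order = np.argsort(time)
    time, event, weight = time[order], event[order], weight[order]
    surv, s = [], 1.0
    for t in np.unique(time):
        at_risk = weight[time >= t].sum()
        died = weight[(time == t) & (event == 1)].sum()
        if at_risk > 0:
            s *= 1.0 - died / at_risk
        surv.append((t, s))
    return surv

time = np.array([2.0, 3.0, 3.0, 5.0, 8.0])   # toy spell durations
event = np.array([1, 1, 0, 1, 0])            # 0 = censored spell
ipcw = np.array([1.0, 1.2, 1.0, 1.5, 1.1])   # toy inverse-censoring weights

curve = weighted_km(time, event, ipcw)
print(curve[-1])
```

This is the correction the simulation study evaluates: under dependent censoring, the unweighted estimator is biased, and reweighting by the inverse censoring probabilities removes that bias when the censoring model is correct.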