882 results for Direction of time


Relevance: 100.00%

Abstract:

M. Neal. An artificial immune system for continuous analysis of time-varying data. In J. Timmis and P. J. Bentley, editors, Proceedings of the 1st International Conference on Artificial Immune Systems (ICARIS), volume 1, pages 76-85, 2002.

Relevance: 100.00%

Abstract:

I. Miguel and Q. Shen. Exhibiting the behaviour of time-delayed systems via an extension to qualitative simulation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 35(2):298-305, 2005.

Relevance: 100.00%

Abstract:

The aim of the present article is to analyse the Apology with respect to its treatment of time. When defending himself against the charges, Socrates appeals to the past, the present and the future. Furthermore, the philosopher stresses the significance of duration: he seems to suggest that all really important activities demand a long time before they bring benefit, since they are almost invariably connected with greater effort. While the dialogue thereby proves to be an ethical one, its various time expressions also gain an ethical dimension.

Relevance: 100.00%

Abstract:

In this paper, a Lyapunov function candidate is introduced for multivariable systems with inner delays, without assuming a priori stability for the nondelayed subsystem. By using this Lyapunov function, a controller is deduced. Such a controller utilizes an input-output description of the original system, a circumstance that facilitates practical applications of the proposed approach.
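The setting this abstract describes, a delayed system whose delay-free part is unstable, can be illustrated numerically. The sketch below is purely illustrative and is not the paper's construction: it simulates a scalar system x'(t) = a*x(t) + b*x(t - tau) whose nondelayed subsystem x' = a*x is unstable (a > 0), and checks that a simple quadratic candidate V(x) = x^2 nevertheless decreases along the trajectory. All parameter values are assumptions chosen for the demonstration.

```python
# Illustrative sketch (not the paper's Lyapunov construction):
# simulate x'(t) = a*x(t) + b*x(t - tau) by forward Euler, where the
# delay-free subsystem x' = a*x is unstable (a > 0) but the delayed
# feedback term stabilises the overall system.
a, b, tau = 0.2, -1.0, 0.2     # assumed parameters; a > 0 is the key point
dt, t_end = 0.001, 20.0
n_delay = int(tau / dt)

# constant initial history x(t) = 1 for t in [-tau, 0]
hist = [1.0] * (n_delay + 1)
for _ in range(int(t_end / dt)):
    x_now, x_del = hist[-1], hist[-n_delay - 1]   # x(t) and x(t - tau)
    hist.append(x_now + dt * (a * x_now + b * x_del))

# A simple quadratic candidate V(x) = x^2 sampled along the trajectory:
V_start, V_end = hist[n_delay] ** 2, hist[-1] ** 2
print(f"V(0) = {V_start:.3f}, V({t_end:.0f}) = {V_end:.2e}")
```

Despite the unstable nondelayed part, the trajectory decays, which is the kind of behaviour the paper's Lyapunov function is built to certify.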

Relevance: 100.00%

Abstract:

After the 1980s it is difficult, following stylistic criteria, to draw a map of contemporary academic music. All styles are compossible, and all are practised. In this context, the geographical entity "South of Italy" does not stand out for a musical identity with special technical-stylistic features. Rather, at a socio-cultural level, the South remains today, in music no less than in all areas where there is a gap between top development and stagnation, a land of emigrants: six of the seven composers treated (Ivan Fedele, Giuseppe Colardo, Rosario Mirigliano, Giuseppe Soccio, Nicola Cisternino, Biagio Putignano, Paolo Aralla) live in the North of Italy. The positive aspect of this is the affinity of the South with the transnational and superstructural community of contemporary music, which from European and Western has now become almost global. The composers under consideration belong to the generation of the 1950s, rooted in the serial and post-serial movements (of which Franco Donatoni, Luciano Berio, Luigi Nono, Salvatore Sciarrino and Giacinto Scelsi are the principal models, to mention only the Italians), immersed in the general phenomenon of timbrism (particularly spectralism), and acquainted with electronics. From these sources they draw various instruments of compositional technique and aspects of their poetics. In particular these composers, active from the 1980s, develop new ways of constructing the temporal form of music. They share the goal of establishing a new continuity, different from the tonal one but at the same time transcending serial and post-serial disintegration and fragmentation. The primary means to this end is a renewed emphasis on the category of figure, as a clear, distinct and recognizable aggregate of pitches, intervals, register, durations, timbre, articulation, dynamics and texture. Each composer elaborates the atonal figural material in different ways, emphasizing one aspect or another.
For example, Fedele (1953) is a master in the management of form per se, Colardo (1953) in the activation of disturbed harmonic effects, Mirigliano (1950) in the creation of a slight tension from the smallest vibrations of sound, Soccio (1950) in setting up movement by means of accumulations and discharges of energy, Cisternino (1957) in a Cagean-Scelsian emphasis on sound as such, Putignano (1960) in the suspension of time through the succession and transformation of images, and Aralla (1960) in the founding of form from below, from the concreteness of sound.

Relevance: 100.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
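The production-interpretation-consumption workflow with a maintained provenance record can be sketched in a few lines. The names below (Record, apply_step) are illustrative inventions, not the dissertation's API; the point is only that every derived value carries an auditable trail, so two techniques applied to the same raw data stay independently verifiable.

```python
# Hypothetical sketch of a workflow where every derived value keeps a
# provenance trail, so a third party can retrace how it was produced.
from dataclasses import dataclass, field

@dataclass
class Record:
    value: object
    provenance: list = field(default_factory=list)  # ordered audit trail

def apply_step(record, name, func):
    """Apply one analysis step, appending its name to the provenance."""
    return Record(func(record.value), record.provenance + [name])

# Production: raw data enters the system untouched.
raw = Record([3, 1, 4, 1, 5, 9, 2, 6], ["acquired:raw-stream"])

# Interpretation: two independent techniques over the *same* raw data.
smoothed = apply_step(raw, "moving-average(k=2)",
                      lambda xs: [(u + v) / 2 for u, v in zip(xs, xs[1:])])
peak = apply_step(raw, "max", max)

# Consumption: each result carries its own derivation history.
print(peak.value, peak.provenance)
print(smoothed.provenance)
```

Because each step copies rather than mutates the trail, neither interpretation can silently contaminate the other, which is the "hidden bias" concern raised above.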

Relevance: 100.00%

Abstract:

A natural approach to representing and reasoning about temporal propositions (i.e., statements with time-dependent truth-values) is to associate them with time elements. In the literature, there are three choices for the primitive of the ontology of time: (1) instantaneous points, (2) durative intervals and (3) both points and intervals. Problems may arise when one conflates different views of temporal structure and asks whether certain types of temporal propositions can be validly and meaningfully associated with different time elements. In this paper, we summarize an ontological glossary with respect to time elements and introduce a wider range of meta-predicates for ascribing temporal propositions to time elements. On this basis, we also devise a versatile categorization of temporal propositions which subsumes the representative categories proposed in the literature, including those of Vendler, McDermott, Allen, Shoham, Galton, and Terenziani and Torasso. It is demonstrated that the new categorization of propositions, together with the proposed range of meta-predicates, provides the expressive power for modeling typical temporal terms and phenomena such as starting-instant, stopping-instant, dividing-instant, instigation, termination and intermingling.
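The distinction between ascribing a proposition to a point and ascribing it to an interval can be made concrete with two classic meta-predicates: a universal ("holds on", requiring truth throughout the interval) and an existential ("holds in", requiring truth somewhere inside it). The sketch below uses integer time points and invented predicate names for illustration; it is not the paper's formal system.

```python
# Illustrative sketch: point- vs interval-ascription of a temporal
# proposition via two meta-predicates.  Time is modelled as integers;
# `facts` records the points at which the proposition p is true.
facts = {2, 3, 4, 7}

def holds_at(t):
    """Ascription to an instantaneous point."""
    return t in facts

def holds_on(s, e):
    """Universal (homogeneous) ascription to the interval [s, e]."""
    return all(holds_at(t) for t in range(s, e + 1))

def holds_in(s, e):
    """Existential ascription: p is true somewhere within [s, e]."""
    return any(holds_at(t) for t in range(s, e + 1))

print(holds_on(2, 4), holds_in(2, 7), holds_on(2, 7))
```

The third call shows why the distinction matters: p holds somewhere in [2, 7] but not throughout it, so the two interval ascriptions disagree.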

Relevance: 100.00%

Abstract:

Time-series and sequences are important patterns in data mining. Based on an ontology of time-elements, this paper presents a formal characterization of time-series and state-sequences, where a state denotes a collection of data whose validity is dependent on time. While a time-series is formalized as a vector of time-elements temporally ordered one after another, a state-sequence is denoted as a list of states correspondingly ordered by a time-series. In general, a time-series and a state-sequence can be incomplete in various ways. This leads to the distinction between complete and incomplete time-series, and between complete and incomplete state-sequences, which allows the expression of both absolute and relative temporal knowledge in data mining.
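The core of this characterization, a time-series as a vector of time-elements ordered one after another, with a state-sequence indexed by it, can be sketched directly. The types and the ordering check below are illustrative assumptions, not the paper's definitions.

```python
# Illustrative sketch: a time-series as an ordered vector of durative
# time-elements, and a state-sequence as a list of states indexed by it.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A durative time-element [start, end]."""
    start: float
    end: float

def is_time_series(elems):
    """Each time-element must end no later than its successor starts."""
    return all(a.end <= b.start for a, b in zip(elems, elems[1:]))

series = [Interval(0, 1), Interval(1, 2.5), Interval(3, 4)]
states = ["rising", "peak", "falling"]   # state-sequence over `series`

print(is_time_series(series), list(zip(states, series)))
```

Note the gap between the second and third intervals: the series is still well ordered, illustrating how an incomplete series preserves relative (one-after-another) knowledge even where absolute coverage is missing.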

Relevance: 100.00%

Abstract:

The Digital Art Weeks program (DAW06) is concerned with the application of digital technology in the arts. Consisting again this year of a symposium, workshops and performances, the program offers insight into current research and innovations in art and technology, and illustrates the resulting synergies in a series of performances, making artists aware of new impulses in technology and scientists aware of possible applications of technology in the arts.

Relevance: 100.00%

Abstract:

We study information rates of time-varying flat-fading channels (FFC) modeled as finite-state Markov channels (FSMC). FSMCs have two main applications for FFCs: modeling channel error bursts and decoding at the receiver. Our main finding in the first application is that receiver observation noise can more adversely affect higher-order FSMCs than lower-order FSMCs, resulting in lower capacities. This is despite the fact that the underlying higher-order FFC and its corresponding FSMC are more predictable. Numerical analysis shows that at low to medium SNR conditions (SNR ≲ 12 dB) and at medium to fast normalized fading rates (0.01 ≲ fDT ≲ 0.10), FSMC information rates are non-increasing functions of memory order. We conclude that BERs obtained by low-order FSMC modeling can provide optimistic results. To explain the capacity behavior, we present a methodology that enables analytical comparison of FSMC capacities with different memory orders. We establish sufficient conditions that predict higher/lower capacity of a reduced-order FSMC, compared to its original high-order FSMC counterpart. Finally, we investigate the achievable information rates in FSMC-based receivers for FFCs. We observe that high-order FSMC modeling at the receiver side results in a negligible information rate increase for normalized fading rates fDT ≲ 0.01.
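A minimal sketch of the kind of model involved may help: the two-state chain (the Gilbert-Elliott model) is the simplest FSMC for a bursty fading channel. The transition probabilities and per-state error rates below are illustrative assumptions, not values from the paper; higher-order FSMCs, as compared in the paper, would expand the state space with channel memory.

```python
# Illustrative sketch: a two-state (Gilbert-Elliott) FSMC for a bursty
# fading channel.  Numbers are assumed for the example, not from the paper.
p_gb, p_bg = 0.02, 0.20          # good->bad and bad->good transitions
ber = {"good": 1e-4, "bad": 1e-1}  # per-state bit error rates

# Stationary distribution of the two-state Markov chain.
pi_good = p_bg / (p_gb + p_bg)
pi_bad = p_gb / (p_gb + p_bg)

# Long-run average BER of the modeled channel.
avg_ber = pi_good * ber["good"] + pi_bad * ber["bad"]
print(f"pi = ({pi_good:.3f}, {pi_bad:.3f}), average BER = {avg_ber:.4g}")
```

Although the channel is in the good state most of the time, the bad-state bursts dominate the average BER, which hints at why coarse (low-order) state modeling can misrepresent error behaviour.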

Relevance: 100.00%

Abstract:

In 2000 a Review of Current Marine Observations in relation to present and future needs was undertaken by the Inter-Agency Committee for Marine Science and Technology (IACMST). The Marine Environmental Change Network (MECN) was initiated in 2002 as a direct response to the recommendations of the report. A key part of the current phase of the MECN is to ensure that information from the network is provided to policy makers and other end-users to enable them to produce more accurate assessments of ecosystem state and gain a clearer understanding of factors influencing change in marine ecosystems. The MECN holds workshops on an annual basis, bringing together partners maintaining time-series and long-term datasets as well as end-users interested in outputs from the network. It was decided that the first workshop of the MECN continuation phase should consist of an evaluation of the time series and datasets maintained by partners in the MECN with regard to their fitness for purpose for answering key science questions and informing policy development. This report is based on the outcomes of the workshop. Section one of the report contains a brief introduction to monitoring, time series and long-term datasets. The various terms are defined, and the need for MECN-type data to complement compliance monitoring programmes is discussed. Outlines are also given of initiatives such as the United Kingdom Marine Monitoring and Assessment Strategy (UKMMAS) and Oceans 2025. Section two contains detailed information for each of the MECN time series and long-term datasets, including information on scientific outputs and current objectives. This information is mainly based on the presentations given at the workshop and therefore follows a format whereby the following headings are addressed: origin of the time series, including original objectives; current objectives; policy relevance; products (advice, publications, science and society).
Section three consists of comments made by the review panel concerning all the time series and the network. Needs and issues highlighted by the panel with regard to the future of long-term datasets and time-series in the UK are shown, along with advice and potential solutions where offered. The recommendations are divided into four categories: 'The MECN and end-user requirements'; 'Procedures & protocols'; 'Securing data series'; and 'Future developments'. Ever since marine environmental protection issues came to the fore in the 1960s, it has been recognised that a suitable evidence base on environmental change is required to support policy and management for UK waters. Section four gives a brief summary of the development of marine policy in the UK, along with comments on the availability and necessity of long-term marine observations for the implementation of this policy. Policy relating to three main areas is discussed: marine conservation (protecting biodiversity and marine ecosystems); marine pollution; and fisheries. The conclusion of this section is that there has always been a specific requirement for information on long-term change in marine ecosystems around the UK in order to address concerns over pollution, fishing and general conservation. It is now imperative that this need is addressed if the UK is to fulfil its policy commitments and manage marine ecosystems in the light of climate change and other factors.

Relevance: 100.00%

Abstract:

The contract work has demonstrated that older data can be assessed and entered into the MR format. Older data has associated problems but is retrievable. The contract successfully imported all datasets as required. MNCR survey sheets fit well into the MR format. The data validation and verification process can be improved: a number of computerised shortcuts can be suggested, and the process can be made more intuitive. Such a move is vital if MR is to be adopted as a standard by the recording community, both at a voluntary level and potentially by consultancies.