14 results for "Temporal information"

in CentAUR: Central Archive University of Reading - UK


Relevance: 70.00%

Abstract:

Drought characterisation is an intrinsically spatio-temporal problem. A limitation of previous approaches to characterisation is that they discard much of the spatio-temporal information by reducing events to a lower-order subspace. To address this, an explicit 3-dimensional (longitude, latitude, time) structure-based method is described in which drought events are defined by a spatially and temporally coherent set of points displaying standardised precipitation below a given threshold. Geometric methods can then be used to measure similarity between individual drought structures. Groupings of these similarities provide an alternative to traditional methods for extracting recurrent space-time signals from geophysical data. The explicit consideration of structure encourages the construction of summary statistics which relate to the event geometry. Example measures considered are the event volume, centroid, and aspect ratio. The utility of a 3-dimensional approach is demonstrated by application to the analysis of European droughts (15°W to 35°E, 35°N to 70°N) for the period 1901–2006. Large-scale structure is found to be abundant, with 75 events identified lasting for more than 3 months and spanning at least 0.5 × 10⁶ km². Near-complete dissimilarity is seen between the individual drought structures, and little or no regularity is found in the time evolution of even the most spatially similar drought events. The spatial distribution of the event centroids and the time evolution of the geographic cross-sectional areas strongly suggest that large-area, sustained droughts result from the combination of multiple small-area (∼10⁶ km²), short-duration (∼3 months) events. The small events are not found to occur independently in space. This leads to the hypothesis that local water feedbacks play an important role in the aggregation process.
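The event-extraction step can be sketched with standard array tools. A minimal example, assuming a synthetic SPI cube and using SciPy's connected-component labelling in place of the paper's own implementation:

```python
import numpy as np
from scipy import ndimage

# Synthetic standardised precipitation index (SPI) field:
# axes are (time, latitude, longitude); values below the
# threshold indicate drought conditions.
rng = np.random.default_rng(0)
spi = rng.normal(size=(24, 10, 10))

THRESHOLD = -1.0
drought_mask = spi < THRESHOLD

# Label spatio-temporally coherent sets of drought points:
# 3-D connectivity links neighbouring cells in space and time.
labels, n_events = ndimage.label(drought_mask)

# Per-event summary statistics relating to the event geometry.
idx = range(1, n_events + 1)
volumes = ndimage.sum_labels(drought_mask, labels, index=idx)
centroids = ndimage.center_of_mass(drought_mask, labels, index=idx)

largest = int(np.argmax(volumes))
print(f"{n_events} events; largest spans {int(volumes.max())} grid cells")
print(f"centroid (t, lat, lon) of largest event: {centroids[largest]}")
```

With a real SPI dataset the threshold and minimum event size would be chosen as in the paper; here they are illustrative.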

Relevance: 60.00%

Abstract:

Establishing the mechanisms by which microbes interact with their environment, including eukaryotic hosts, is a major challenge that is essential for the economic utilisation of microbes and their products. Techniques for determining global gene expression profiles of microbes, such as microarray analyses, are often hampered by methodological restraints, particularly the recovery of bacterial transcripts (RNA) from complex mixtures and rapid degradation of RNA. A pioneering technology that avoids this problem is In Vivo Expression Technology (IVET). IVET is a 'promoter-trapping' methodology that can be used to capture nearly all bacterial promoters (genes) upregulated during a microbe-environment interaction. IVET is especially useful because there is virtually no limit to the type of environment used (examples to date include soil, oomycete, a host plant or animal) to select for active microbial promoters. Furthermore, IVET provides a powerful method to identify genes that are often overlooked during genomic annotation, and has proven to be a flexible technology that can provide even more information than identification of gene expression profiles. A derivative of IVET, termed resolvase-IVET (RIVET), can be used to provide spatio-temporal information about environment-specific gene expression. More recently, niche-specific genes captured during an IVET screen have been exploited to identify the regulatory mechanisms controlling their expression. Overall, IVET and its various spin-offs have proven to be a valuable and robust set of tools for analysing microbial gene expression in complex environments and providing new targets for biotechnological development.

Relevance: 60.00%

Abstract:

Terahertz (THz) frequency radiation, 0.1 THz to 20 THz, is being investigated for biomedical imaging applications following the introduction of pulsed THz sources that produce picosecond pulses and function at room temperature. Owing to the broadband nature of the radiation, spectral and temporal information is available from radiation that has interacted with a sample; this information is exploited in the development of biomedical imaging tools and sensors. In this work, models to aid interpretation of broadband THz spectra were developed and evaluated. THz radiation lies on the boundary between regions best considered using a deterministic electromagnetic approach and those better analysed using a stochastic approach incorporating quantum mechanical effects, so two computational models to simulate the propagation of THz radiation in an absorbing medium were compared. The first was a thin film analysis and the second a stochastic Monte Carlo model. The Cole–Cole model was used to predict the variation with frequency of the physical properties of the sample and scattering was neglected. The two models were compared with measurements from a highly absorbing water-based phantom. The Monte Carlo model gave a prediction closer to experiment over 0.1 to 3 THz. Knowledge of the frequency-dependent physical properties, including the scattering characteristics, of the absorbing media is necessary. The thin film model is computationally simple to implement but is restricted by the geometry of the sample it can describe. The Monte Carlo framework, despite being initially more complex, provides greater flexibility to investigate more complicated sample geometries.
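The Cole–Cole model used to predict the frequency-dependent sample properties has a simple closed form. A minimal sketch, with illustrative water-like parameters (`eps_inf`, `eps_s`, `tau`, `alpha` are assumptions, not the study's fitted values):

```python
import numpy as np

def cole_cole(freq_hz, eps_inf, eps_s, tau, alpha):
    """Cole-Cole complex permittivity:
    eps(w) = eps_inf + (eps_s - eps_inf) / (1 + (j*w*tau)**(1 - alpha))."""
    omega = 2 * np.pi * freq_hz
    return eps_inf + (eps_s - eps_inf) / (1 + (1j * omega * tau) ** (1 - alpha))

# Illustrative water-like parameters (assumptions, not the study's fit).
freqs = np.linspace(0.1e12, 3e12, 50)  # the 0.1-3 THz comparison band
eps = cole_cole(freqs, eps_inf=3.5, eps_s=78.4, tau=8.3e-12, alpha=0.02)

# Refractive index and extinction coefficient follow from eps,
# giving the frequency-dependent optical properties both models need.
n = np.sqrt((np.abs(eps) + eps.real) / 2)
kappa = np.sqrt((np.abs(eps) - eps.real) / 2)
print(n[0].round(2), kappa[0].round(2))
```

Either the thin-film or the Monte Carlo model would consume `n` and `kappa` as inputs; alpha = 0 recovers the simpler Debye relaxation.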

Relevance: 30.00%

Abstract:

Models of perceptual decision making often assume that sensory evidence is accumulated over time in favor of the various possible decisions, until the evidence in favor of one of them outweighs the evidence for the others. Saccadic eye movements are among the most frequent perceptual decisions that the human brain performs. We used stochastic visual stimuli to identify the temporal impulse response underlying saccadic eye movement decisions. Observers performed a contrast search task, with temporal variability in the visual signals. In experiment 1, we derived the temporal filter observers used to integrate the visual information. The integration window was restricted to the first ∼100 ms after display onset. In experiment 2, we showed that observers cannot perform the task if there is no useful information to distinguish the target from the distractor within this time epoch. We conclude that (1) observers did not integrate sensory evidence up to a criterion level, (2) observers did not integrate visual information up to the start of the saccadic dead time, and (3) variability in saccade latency does not correspond to variability in the visual integration period. Instead, our results support a temporal filter model of saccadic decision making. The temporal impulse response identified by our methods corresponds well with estimates of integration times of V1 output neurons.
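The filter-estimation logic can be illustrated with a toy reverse-correlation simulation; the stimulus statistics, filter shape, and decision rule below are assumptions for illustration, not the paper's paradigm:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_frames = 5000, 20          # 20 stimulus frames of ~10 ms each
noise = rng.normal(size=(n_trials, n_frames))

# Assume a "true" observer filter confined to the first ~100 ms
# (frames 0-9); each decision is a thresholded weighted sum.
true_filter = np.r_[np.hanning(10), np.zeros(10)]
choice = (noise @ true_filter + rng.normal(scale=0.5, size=n_trials)) > 0

# Reverse correlation: the mean noise on one choice minus the mean
# noise on the other recovers the temporal impulse response.
estimate = noise[choice].mean(axis=0) - noise[~choice].mean(axis=0)
estimate /= np.abs(estimate).max()

print(np.round(estimate, 2))
```

The recovered weights are large only in the early frames, mirroring the paper's finding that the integration window closes ∼100 ms after display onset.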

Relevance: 30.00%

Abstract:

Interference with time estimation from concurrent nontemporal processing has been shown to depend on the short-term memory requirements of the concurrent task (Fortin & Breton, 1995; Fortin, Rousseau, Bourque, & Kirouac, 1993). In particular, it has been claimed that active processing of information in short-term memory produces interference, whereas simply maintaining information does not. Here, four experiments are reported in which subjects were trained to produce a 2,500-msec interval and then perform concurrent memory tasks. Interference with timing was demonstrated for concurrent memory tasks involving only maintenance. In one experiment, increasing set size in a pitch memory task systematically lengthened temporal production. Two further experiments suggested that this was due to a specific interaction between the short-term memory requirements of the pitch task and those of temporal production. In the final experiment, subjects performed temporal production while concurrently remembering the durations of a set of tones. Interference with interval production was comparable to that produced by the pitch memory task. Results are discussed in terms of a pacemaker-counter model of temporal processing, in which the counter component is supported by short-term memory.

Relevance: 30.00%

Abstract:

The externally recorded electroencephalogram (EEG) is contaminated with signals that do not originate from the brain, collectively known as artefacts. Thus, EEG signals must be cleaned prior to any further analysis. In particular, if the EEG is to be used in online applications such as Brain-Computer Interfaces (BCIs), the removal of artefacts must be performed automatically. This paper investigates the robustness of Mutual Information based features to inter-subject variability for use in an automatic artefact removal system. The system is based on the separation of EEG recordings into independent components using a temporal ICA method, RADICAL, and the use of a Support Vector Machine to classify the components into EEG and artefact signals. High accuracy and robustness to inter-subject variability are achieved.
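The shape of such a pipeline can be sketched with several stand-ins: scikit-learn's FastICA replaces RADICAL, kurtosis-based features replace the paper's Mutual Information features, and the data are synthetic:

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def spiky(n):   # blink-like artefact: sparse, large deflections
    return (rng.random(n) < 0.02) * rng.normal(scale=8.0, size=n)

def rhythm(n):  # oscillatory "EEG" source: 10 Hz sine plus noise
    t = np.linspace(0, n / 200, n)
    return np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=n)

def features(x):
    # Spiky artefacts have high kurtosis and rough sample-to-sample change.
    return [kurtosis(x), np.std(np.diff(x)) / np.std(x)]

# Train the classifier on features of simulated pure segments.
X = [features(rhythm(2000)) for _ in range(50)] + \
    [features(spiky(2000)) for _ in range(50)]
y = [0] * 50 + [1] * 50                 # 0 = EEG, 1 = artefact
clf = SVC().fit(X, y)

# Unmix a two-channel "recording" and classify each component.
eeg, blink = rhythm(2000), spiky(2000)
mixed = np.c_[eeg + 0.5 * blink, 0.7 * eeg + blink]
comps = FastICA(n_components=2, random_state=0).fit_transform(mixed).T
pred = clf.predict([features(c) for c in comps])
print(pred)  # component labels: 0 = keep, 1 = remove as artefact
```

In the actual system the flagged components would be zeroed out before re-mixing the channels to obtain cleaned EEG.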

Relevance: 30.00%

Abstract:

Information modelling has been researched extensively, but many questions around it remain unsolved. An information model is essential in the design of a database, which is the core of an information system. Currently, most databases deal only with information that represents facts, i.e. asserted information. The ability to capture semantic aspects has to be improved, and other types of information, such as temporal and intentional information, should also be considered. Semantic Analysis, a method of information modelling, offers a way to handle these various aspects of information. It employs domain knowledge and communication acts as sources for information modelling, and it lends itself to a uniform structure whereby semantic, temporal and intentional information can be captured, laying a sound foundation for building a semantic temporal database.

Relevance: 30.00%

Abstract:

Salmonella is the second most commonly reported human foodborne pathogen in England and Wales, and antimicrobial-resistant strains of Salmonella are an increasing problem in both human and veterinary medicine. In this work we used a generalized linear spatial model to estimate the spatial and temporal patterns of antimicrobial resistance in Salmonella Typhimurium in England and Wales. Across the antimicrobials considered, we found a common peak in the probability that an S. Typhimurium incident will show resistance to a given antimicrobial in late spring and in mid to late autumn; however, for one of the antimicrobials (streptomycin) there was a sharp drop in the probability of resistance over the last 18 months of the period of investigation. We also found a higher probability of resistance in North Wales, which is consistent across the antimicrobials considered. This information contributes to our understanding of the epidemiology of antimicrobial resistance in Salmonella.

Relevance: 30.00%

Abstract:

In this paper, we will address the endeavors of three disciplines, Psychology, Neuroscience, and Artificial Neural Network (ANN) modeling, in explaining how the mind perceives and attends to information. More precisely, we will shed some light on the efforts to understand the allocation of attentional resources to the processing of emotional stimuli. This review aims to inform the three disciplines about converging points of their research and to provide a starting point for discussion.

Relevance: 30.00%

Abstract:

Twitter has been described as the most widely used microblogging application today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium of sentiment and opinion expression. It is also a notable medium for information dissemination, including breaking news on diverse issues, since it was launched in 2006. Many organisations, individuals and even government bodies follow activities on the network in order to learn how their audience reacts to tweets that affect them. Postings on Twitter (known as tweets) can be used to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is to include a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics, and consequently the dynamics of tweets. We call our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and 'dead' rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
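The slot-to-slot rule comparison underlying TRCM can be sketched as follows; the confidence threshold, the restriction to pairwise rules, and the example hashtags are illustrative assumptions:

```python
from itertools import combinations

def rules(tweets, min_conf=0.5):
    """Confidence of pairwise hashtag rules A -> B within one time slot."""
    count, pair = {}, {}
    for tags in tweets:
        for t in set(tags):
            count[t] = count.get(t, 0) + 1
        for a, b in combinations(sorted(set(tags)), 2):
            pair[(a, b)] = pair.get((a, b), 0) + 1
    out = {}
    for (a, b), n in pair.items():
        if n / count[a] >= min_conf:   # rule a -> b
            out[(a, b)] = n / count[a]
        if n / count[b] >= min_conf:   # rule b -> a
            out[(b, a)] = n / count[b]
    return out

# Two consecutive time slots of hashtag "transactions".
slot1 = [["#ukr", "#news"], ["#ukr", "#news"], ["#oil", "#opec"]]
slot2 = [["#ukr", "#peace"], ["#ukr", "#peace"], ["#ukr", "#news"]]

r1, r2 = rules(slot1), rules(slot2)
new_rules = set(r2) - set(r1)    # rules appearing only in the later slot
dead_rules = set(r1) - set(r2)   # rules that vanish in the later slot
print("new:", new_rules)
print("dead:", dead_rules)
```

Emerging and unexpected rules would be detected similarly, by comparing the confidence and consequents of rules present in both slots rather than only their presence.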

Relevance: 30.00%

Abstract:

In the last two decades substantial advances have been made in the understanding of the scientific basis of urban climates. These are reviewed here with attention to sustainability of cities, applications that use climate information, and scientific understanding in relation to measurements and modelling. Consideration is given from street (micro) scale to neighbourhood (local) to city and region (meso) scale. Those areas where improvements are needed in the next decade to ensure more sustainable cities are identified. High-priority recommendations are made in the following six strategic areas: observations, data, understanding, modelling, tools and education. These include the need for more operational urban measurement stations and networks; for an international data archive to aid translation of research findings into design tools, along with guidelines for different climate zones and land uses; to develop methods to analyse atmospheric data measured above complex urban surfaces; to improve short-range, high-resolution numerical prediction of weather, air quality and chemical dispersion through improved modelling of the biogeophysical features of the urban land surface; to improve education about urban meteorology; and to encourage communication across scientific disciplines at a range of spatial and temporal scales.

Relevance: 30.00%

Abstract:

Though anthropogenic impacts on boundary layer climates are expected to be large in dense urban areas, to date very few studies of energy flux observations are available. We report on 3.5 years of measurements gathered in central London, UK. Radiometer and eddy covariance observations at two adjacent sites, at different heights, were analysed at various temporal scales and with respect to meteorological conditions, such as cloud cover. Although the evaporative flux is generally small due to low moisture availability and a predominantly impervious surface, the enhancement following rainfall usually lasts for 12–18 h. As both the latent and sensible heat fluxes are larger in the afternoon, they maintain a relatively consistent Bowen ratio throughout the middle of the day. Strong storage and anthropogenic heat fluxes sustain high and persistently positive sensible heat fluxes. At the monthly time scale, the urban surface often loses more energy by this turbulent heat flux than is gained from net all-wave radiation. Auxiliary anthropogenic heat flux information suggests human activities in the study area are sufficient to provide this energy.
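The Bowen-ratio and energy-balance bookkeeping behind that last claim can be made concrete with invented numbers (not the measured London fluxes):

```python
# Surface energy balance for an urban site (illustrative monthly
# mean values in W m^-2, not the measured London fluxes):
#   Q* + Q_F = Q_H + Q_E + dQ_S
q_star = 100.0   # net all-wave radiation
q_h = 110.0      # turbulent sensible heat flux
q_e = 25.0       # turbulent latent heat flux

bowen_ratio = q_h / q_e
print(f"Bowen ratio: {bowen_ratio:.1f}")  # prints 4.4

# When the turbulent fluxes exceed Q*, the balance can only close
# through anthropogenic heat release and/or storage drawdown:
residual = q_h + q_e - q_star             # = Q_F - dQ_S
print(f"Q_F - dQ_S must supply {residual:.0f} W m^-2")
```

A positive residual is exactly the situation reported above: the turbulent fluxes exceed net radiation, so anthropogenic heat and storage release must make up the difference.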

Relevance: 30.00%

Abstract:

Ground-based observations of dayside auroral forms and magnetic perturbations in the arctic sectors of Svalbard and Greenland, in combination with high-resolution measurements of ionospheric ion drift and temperature by the EISCAT radar, are used to study temporal/spatial structures of cusp-type auroral forms in relation to convection. Large-scale patterns of equivalent convection in the dayside polar ionosphere are derived from the magnetic observations in Greenland and Svalbard. This information is used to estimate the ionospheric convection pattern in the vicinity of the cusp/cleft aurora. The reported observations, covering the period 0700–1130 UT on January 11, 1993, are separated into four intervals according to the observed characteristics of the aurora and ionospheric convection. The morphology and intensity of the aurora are very different in quiet and disturbed intervals. A latitudinally narrow zone of intense and dynamic 630.0 nm emission, equatorward of 75° MLAT, was observed during periods of enhanced antisunward convection in the cusp region. This emission (the type 1 cusp aurora) is considered to be the signature of plasma entry via magnetopause reconnection at low magnetopause latitudes, i.e. the low-latitude boundary layer (LLBL). Another zone of weak 630.0 nm emission (type 2 cusp aurora) was observed to extend up to high latitudes (∼79° MLAT) during relatively quiet magnetic conditions, when indications of reverse (sunward) convection were observed in the dayside polar cap. This is postulated to be a signature of merging between a northward-directed IMF (Bz > 0) and the geomagnetic field poleward of the cusp. The coexistence of type 1 and type 2 auroras was observed under intermediate circumstances. The optical observations from Svalbard and Greenland were also used to determine the temporal and spatial evolution of type 1 auroral forms, i.e. poleward-moving auroral events occurring in the vicinity of a rotational convection reversal in the early post-noon sector. Each event appeared as a local brightening at the equatorward boundary of the pre-existing type 1 cusp aurora, followed by poleward and eastward expansions of luminosity. The auroral events were associated with poleward-moving surges of enhanced ionospheric convection and F-layer ion temperature as observed by the EISCAT radar in Tromsø. The EISCAT ion flow data in combination with the auroral observations show strong evidence for plasma flow across the open/closed field line boundary.