33 results for merging

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static and hence poor approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF the forecast error statistics in a subspace defined by an ensemble of states forecast by the dynamic model are found. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, to produce accurate analyses and ultimately improved forecasts of extreme events.
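
The sketch below illustrates the general idea of combining ensemble-derived error statistics with a static B: ensemble statistics are used inside the subspace spanned by the ensemble perturbations and the static matrix is retained in the complement. This is a minimal illustration under assumed simplifications, not the paper's formal EnRRKF derivation.

```python
import numpy as np

def merged_covariance(B, ensemble):
    """Blend ensemble-derived statistics with a static background covariance B.
    Ensemble statistics apply in the subspace spanned by the ensemble
    perturbations; B applies in the remainder of the space.
    (Illustrative sketch only; the EnRRKF formulation is more formal.)"""
    n, m = ensemble.shape                    # state dimension, ensemble size
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    Pf_ens = X @ X.T / (m - 1)               # sample forecast error covariance

    # Orthonormal basis of the ensemble perturbation subspace.
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    U = U[:, s > 1e-10 * s.max()]
    P_sub = U @ U.T                          # projector onto the ensemble subspace
    P_perp = np.eye(n) - P_sub               # projector onto the remainder

    return P_sub @ Pf_ens @ P_sub + P_perp @ B @ P_perp

# Example: 10-dimensional state, 4-member ensemble, diagonal static B.
rng = np.random.default_rng(0)
P_merged = merged_covariance(np.eye(10), rng.standard_normal((10, 4)))
```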

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses how numerical gradient estimation methods may be used in order to reduce the computational demands on a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density based cluster identification algorithms could benefit from a reduction of computational demand if approximate a priori estimates of the cluster centres present in a given data set could be supplied as starting conditions for these algorithms. In this particular presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and their related disciplines.
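
As a rough sketch of the idea (not the paper's specific procedure), one can estimate the point density numerically, follow its finite-difference gradient uphill, and use the end points as a priori cluster-centre seeds for an algorithm such as Mean-Tracking. The kernel, step size, and stopping rule below are illustrative assumptions.

```python
import numpy as np

def density(points, x, h=1.0):
    """Unnormalised Gaussian kernel density estimate at x."""
    d2 = np.sum((points - x) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2 * h ** 2)))

def numerical_gradient(points, x, h=1.0, eps=1e-3):
    """Central-difference estimate of the density gradient at x."""
    grad = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        grad[i] = (density(points, x + d, h) - density(points, x - d, h)) / (2 * eps)
    return grad

def rough_centre(points, start, h=1.0, step=0.05, n_iter=50):
    """Gradient ascent on the estimated density; the end point is a rough
    a priori estimate of a nearby cluster centre."""
    x = start.astype(float).copy()
    for _ in range(n_iter):
        g = numerical_gradient(points, x, h)
        norm = np.linalg.norm(g)
        if norm < 1e-12:
            break
        x += step * g / norm
    return x

# Example: two well-separated blobs; seeds found this way could initialise
# a point-density clustering algorithm.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(5, 0.3, (100, 2))])
seed = rough_centre(data, data[0])
```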

Relevance:

20.00%

Publisher:

Abstract:

Mergers of Higher Education Institutions (HEIs) are organisational processes requiring a tremendous amount of resources in terms of time, work, and money. A number of mergers have taken place in previous years and more are to come. Several studies on mergers have been conducted, revealing some crucial factors that affect the success of mergers. Based on a literature review of these studies, these factors are: the initiator of the merger, the reason for the merger, the geographical distance between the merging institutions, organisational culture, the extent of overlap in course portfolios, and Quality Assurance Systems (QASs). Usually these kinds of factors are not considered in mergers; the focus is on financial matters instead. In this paper, a framework (HMEF) for evaluating the merging of HEIs is introduced. HMEF is based on Enterprise Architecture (EA), focusing on factors found to affect the success of mergers. By using HMEF, HEIs can focus on matters that are crucial for merging.

Relevance:

20.00%

Publisher:

Abstract:

The Polar spacecraft passed through a region near the dayside magnetopause on May 29, 1996, at a geocentric distance of ~8 R_E and high, northern magnetic latitudes. The interplanetary magnetic field (IMF) was northward during the pass. Data from the Thermal Ion Dynamics Experiment revealed the existence of low-speed (~50 km s⁻¹) ion D-shaped distributions mixed with cold ions (~2 eV) over a period of 2.5 hours. These ions were traveling parallel to the magnetic field toward the Northern Hemisphere ionosphere and were convecting primarily eastward. The D-shaped distributions are distinct from a convecting Maxwellian and, along with the magnetic field direction, are taken as evidence that the spacecraft was inside the magnetosphere and not in the magnetosheath. Furthermore, the absence of ions in the antiparallel direction is taken as evidence that low-shear merging was occurring at a location southward of the spacecraft and equatorward of the Southern Hemisphere cusp. The cold ions were of ionospheric origin, with initially slow field-aligned speeds, which were accelerated upon reflection from the magnetopause. These observations provide significant new evidence consistent with component magnetic merging sites equatorward of the cusp for northward IMF.

Relevance:

20.00%

Publisher:

Abstract:

Recent years have seen enormous advances in sequencing and array-based technologies, producing supplementary or alternative views of the genome stored in various formats and databases. Their sheer volume and differing scope make it challenging to jointly visualize and integrate the diverse data types. We present AmalgamScope, a new interactive software tool that assists scientists with the annotation of the human genome and, in particular, with the integration of annotation files from multiple data types, using gene identifiers and genomic coordinates. Supported platforms include next-generation sequencing and microarray technologies. The features of AmalgamScope range from the annotation of diverse data types across the human genome to integration of the data based on the annotation information, and visualization of the merged files within chromosomal regions or across the whole genome. Additionally, users can define custom transcriptome library files for any species and use the tool's remote file-exchange server options.
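
The core integration step described above, joining annotations from different platforms on a shared gene identifier while keeping genomic coordinates, can be sketched as below. The tables, gene coordinates, and column names are invented for illustration; they are not AmalgamScope's actual file formats.

```python
import pandas as pd

# Hypothetical annotation tables from two platforms (values are illustrative).
ngs = pd.DataFrame({
    "gene_id": ["BRCA1", "TP53", "EGFR"],
    "chrom":   ["17", "17", "7"],
    "start":   [43044295, 7668402, 55019017],
    "coverage": [120, 95, 210],
})
array = pd.DataFrame({
    "gene_id": ["TP53", "EGFR", "MYC"],
    "probe_intensity": [1.8, 2.3, 0.9],
})

# Integrate data types by joining on the shared gene identifier; keeping the
# genomic coordinates lets the merged table be inspected per chromosomal region.
merged = ngs.merge(array, on="gene_id", how="outer")
chr17_view = merged[merged["chrom"] == "17"]
print(merged)
```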

Relevance:

20.00%

Publisher:

Abstract:

A method is proposed for merging different nadir-sounding climate data records using measurements from high-resolution limb sounders to provide a transfer function between the different nadir measurements. The two nadir-sounding records need not be overlapping so long as the limb-sounding record bridges between them. The method is applied to global-mean stratospheric temperatures from the NOAA Climate Data Records based on the Stratospheric Sounding Unit (SSU) and the Advanced Microwave Sounding Unit-A (AMSU), extending the SSU record forward in time to yield a continuous data set from 1979 to present, and providing a simple framework for extending the SSU record into the future using AMSU. SSU and AMSU are bridged using temperature measurements from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), which is of high enough vertical resolution to accurately represent the weighting functions of both SSU and AMSU. For this application, a purely statistical approach is not viable since the different nadir channels are not sufficiently linearly independent, statistically speaking. The near-global-mean linear temperature trends for extended SSU for 1980–2012 are −0.63 ± 0.13, −0.71 ± 0.15 and −0.80 ± 0.17 K decade⁻¹ (95 % confidence) for channels 1, 2 and 3, respectively. The extended SSU temperature changes are in good agreement with those from the Microwave Limb Sounder (MLS) on the Aura satellite, with both exhibiting a cooling trend of ~0.6 ± 0.3 K decade⁻¹ in the upper stratosphere from 2004 to 2012. The extended SSU record is found to be in agreement with high-top coupled atmosphere–ocean models over the 1980–2012 period, including the continued cooling over the first decade of the 21st century.
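
A highly simplified sketch of the bridging idea follows: apply each nadir channel's weighting function to high-resolution limb (MIPAS-like) temperature profiles to obtain synthetic SSU and AMSU records, then fit a transfer function between them. The weighting functions, profiles, and the simple linear fit below are placeholder assumptions, not the paper's actual kernels or merging procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n_levels, n_months = 60, 120
profiles = 230 + rng.standard_normal((n_months, n_levels))  # MIPAS-like T(z), synthetic

levels = np.arange(n_levels)
w_ssu  = np.exp(-0.5 * ((levels - 40) / 6.0) ** 2)           # assumed SSU-like kernel
w_amsu = np.exp(-0.5 * ((levels - 35) / 8.0) ** 2)           # assumed AMSU-like kernel
w_ssu, w_amsu = w_ssu / w_ssu.sum(), w_amsu / w_amsu.sum()

t_ssu  = profiles @ w_ssu    # synthetic SSU channel temperatures from limb profiles
t_amsu = profiles @ w_amsu   # synthetic AMSU channel temperatures from limb profiles

# Transfer function fitted over the period covered by the limb sounder;
# it can then map later AMSU measurements onto the SSU record.
a, b = np.polyfit(t_amsu, t_ssu, 1)
ssu_extended_from_amsu = a * t_amsu + b
```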

Relevance:

10.00%

Publisher:

Abstract:

The paper describes a field study focused on the dispersion of a traffic-related pollutant within an area close to a busy intersection between two street canyons in Central London. Simultaneous measurements of airflow, traffic flow and carbon monoxide concentrations ([CO]) are used to explore the causes of spatial variability in [CO] over a full range of background wind directions. Depending on the roof-top wind direction, evidence of both flow channelling and recirculation regimes was identified from data collected within the main canyon and the intersection. However, at the intersection, the merging of channelled flows from the canyons increased the flow complexity and turbulence intensity. These features, coupled with the close proximity of nearby queuing traffic in several directions, led to the highest overall time-averaged measured [CO] occurring at the intersection. Within the main street canyon, the data supported the presence of a helical flow regime for oblique roof-top flows, leading to increased [CO] on the canyon leeward side. Predominant wind directions led to some locations having significantly higher diurnal average [CO] due to being mostly on the canyon leeward side during the study period. For all locations, small changes in the background wind direction could cause large changes in the in-street mean wind angle and local turbulence intensity, implying that dispersion mechanisms would be highly sensitive to small changes in above-roof flows. During peak traffic flow periods, concentrations within parallel side streets were approximately four times lower than within the main canyon and intersection, which has implications for controlling personal exposure. Overall, the results illustrate that pollutant concentrations can be highly spatially variable over even short distances within complex urban geometries, and that synoptic wind patterns, traffic queue location and building topologies all play a role in determining where pollutant hot spots occur.

Relevance:

10.00%

Publisher:

Abstract:

This article begins by identifying a close relationship between the image of children generated by several sociologists working within the new sociology of childhood perspective and the claims and ambitions of the proponents of children's autonomy rights. The image of the child as a competent, self-controlled human agent is then subjected to observation from the perspective of Niklas Luhmann's social systems theory. The new sociology of childhood's constructivist approach is compared and contrasted with Niklas Luhmann's theory of 'operational constructivism'. The article applies tenets of Luhmann's theory to the emergence of the new childhood sociologists' image of the child as a competent, self-controlled social agent, to the epistemological status of this image and, in particular, to claims that it derives from scientific endeavour. The article proceeds to identify two theoretical developments within sociology - sociology of identity and social agency - which have brought about fundamental changes in what may be considered 'sociological' and so 'scientific' and paved the way for sociological communications about what children 'really are'. In conclusion, it argues that the merging of sociology with polemics, ideology, opinion and personal beliefs and, at the level of social systems, between science and politics represents, in Luhmann's terms, 'dedifferentiation' - a tendency he claims may have serious adverse consequences for modern society. This warning is applied to the scientific status of sociology - its claim to be able to produce 'facts' for society, upon which social systems, such as politics and law, may rely. Like the mass media, sociology may now be capable of producing only information, and not facts, about children.

Relevance:

10.00%

Publisher:

Abstract:

The eukaryotic genome is a mosaic of eubacterial and archaeal genes in addition to those unique to itself. The mosaic may have arisen as the result of two prokaryotes merging their genomes, or from genes acquired from an endosymbiont of eubacterial origin. A third possibility is that the eukaryotic genome arose from successive events of lateral gene transfer over long periods of time. This theory does not exclude the endosymbiont, but questions whether it is necessary to explain the peculiar set of eukaryotic genes. We use phylogenetic studies and reconstructions of ancestral first appearances of genes on the prokaryotic phylogeny to assess evidence for the lateral gene transfer scenario. We find that phylogenies advanced to support fusion can also arise from a succession of lateral gene transfer events. Our reconstructions of ancestral first appearances of genes reveal that the various genes that make up the eukaryotic mosaic arose at different times and in diverse lineages on the prokaryotic tree, and were not available in a single lineage. Successive events of lateral gene transfer can explain the unusual mosaic structure of the eukaryotic genome, with its content linked to the immediate adaptive value of the genes it acquired. Progress in understanding eukaryotes may come from identifying ancestral features such as the eukaryotic spliceosome that could explain why this lineage invaded, or created, the eukaryotic niche.

Relevance:

10.00%

Publisher:

Abstract:

The human electroencephalogram (EEG) is globally characterized by a 1/f power spectrum superimposed with certain peaks, whereby the "alpha peak" in a frequency range of 8-14 Hz is the most prominent one for relaxed states of wakefulness. We present simulations of a minimal dynamical network model of leaky integrator neurons attached to the nodes of an evolving directed and weighted random graph (an Erdős–Rényi graph). We derive a model of the dendritic field potential (DFP) for the neurons leading to a simulated EEG that describes the global activity of the network. Depending on the network size, we find an oscillatory transition of the simulated EEG when the network reaches a critical connectivity. This transition, indicated by a suitably defined order parameter, is reflected by a sudden change of the network's topology when super-cycles are formed from merging isolated loops. After the oscillatory transition, the power spectra of simulated EEG time series exhibit a 1/f continuum superimposed with certain peaks.
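
A stripped-down sketch of this kind of simulation is shown below: leaky integrator units coupled through a directed, weighted Erdős–Rényi graph, with the summed activity standing in for the simulated EEG whose spectrum can then be inspected. The unit equations, parameters, and the crude "global signal" are assumptions for illustration, not the paper's DFP model.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, dt, steps = 200, 0.05, 1e-3, 5000
leak, gain = 50.0, 60.0

# Directed, weighted Erdos-Renyi adjacency matrix (no self-connections).
W = (rng.random((n, n)) < p) * rng.normal(0.0, 1.0, (n, n))
np.fill_diagonal(W, 0.0)

x = 0.1 * rng.standard_normal(n)
eeg = np.empty(steps)
for t in range(steps):
    drive = np.tanh(W @ x) + 0.5 * rng.standard_normal(n)   # coupled input plus noise
    x = x + dt * (-leak * x + gain * drive)                  # leaky integration step
    eeg[t] = x.sum()                                         # crude global signal

# Power spectrum of the simulated EEG: a 1/f-like continuum, possibly with peaks.
freqs = np.fft.rfftfreq(steps, dt)
power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
```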

Relevance:

10.00%

Publisher:

Abstract:

Real-world text classification tasks often suffer from poor class structure with many overlapping classes and blurred boundaries. Training data pooled from multiple sources tend to be inconsistent and contain erroneous labelling, leading to poor performance of standard text classifiers. The classification of health service products to specialized procurement classes is used to examine and quantify the extent of these problems. A novel method is presented to analyze the labelled data by selectively merging classes where there is not enough information for the classifier to distinguish them. Initial results show the method can identify the most problematic classes, which can be used either as a focus to improve the training data or to merge classes to increase confidence in the predicted results of the classifier.
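
A minimal sketch of the general idea, not the paper's specific method, is given below: fit a baseline text classifier, compute a confusion matrix on held-out data, and flag class pairs whose mutual confusion is high enough that merging them should be considered. The classifier choice, threshold, and helper name are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

def find_merge_candidates(texts, labels, threshold=0.2):
    """Flag class pairs that a baseline classifier frequently confuses;
    such pairs are candidates for selective merging (illustrative sketch)."""
    X_tr, X_va, y_tr, y_va = train_test_split(texts, labels,
                                              test_size=0.3, random_state=0)
    vec = TfidfVectorizer().fit(X_tr)
    clf = LogisticRegression(max_iter=1000).fit(vec.transform(X_tr), y_tr)
    pred = clf.predict(vec.transform(X_va))

    classes = sorted(set(labels))
    cm = confusion_matrix(y_va, pred, labels=classes).astype(float)
    rates = cm / (cm.sum(axis=1, keepdims=True) + 1e-9)   # row-normalised confusion

    candidates = []
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            # Symmetric confusion: each class is often predicted as the other.
            if rates[i, j] + rates[j, i] > threshold:
                candidates.append((classes[i], classes[j]))
    return candidates
```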

Relevance:

10.00%

Publisher:

Abstract:

The purpose of the paper is to identify and describe differences in cognitive structures between consumer segments with differing levels of acceptance of genetically modified (GM) food. Among a sample of 60 mothers, three segments are distinguished with respect to purchase intentions for GM yogurt: non-buyers, maybe-buyers and likely-buyers. A homogeneity test for the elicited laddering data suggests merging maybe- and likely-buyers, yielding two segments termed accepters and rejecters. Still, overlap between the segments' cognitive structures is considerable, in particular with respect to a health focus in the evaluation of perceived consequences and ambivalence in technology assessment. Distinct differences are found in the assessment of benefits offered by GM food and the importance of values driving product evaluation and thus purchase decisions.

Relevance:

10.00%

Publisher:

Abstract:

Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse real-time network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation period. It is found that both methods yield merged fields of better quality than the original radar field or fields obtained by OK of gauge data. The newly suggested KED formulation is shown to be beneficial, in particular in mountainous regions where the quality of the Swiss radar composite is comparatively low. An analysis of the Kriging variances shows that none of the methods tested here provides a satisfactory uncertainty estimate. A suitable variable transformation is expected to improve this.
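
A simplified 1-D sketch of one ingredient of this approach is given below: a nonparametric correlogram estimated from a spatially complete field (standing in for the radar composite) is scaled by a sample variance to form the covariance used in ordinary kriging of sparse point observations. The toy data, the use of the gauge variance as the sill, and the small nugget term are assumptions for illustration, not the paper's full radar-raingauge merging scheme.

```python
import numpy as np

def empirical_correlogram(field, max_lag):
    """Nonparametric correlogram of a spatially complete 1-D field,
    one value per integer lag (lag 0 has correlation 1)."""
    f = field - field.mean()
    var = f.var()
    vals = [1.0]
    for h in range(1, max_lag + 1):
        vals.append(np.mean(f[:-h] * f[h:]) / var)
    return np.array(vals)

def ordinary_kriging(obs_x, obs_z, grid_x, corr, sill):
    """Ordinary kriging with covariance C(h) = sill * corr(round(|h|)),
    i.e. a nonparametric correlogram scaled by a sample variance."""
    def cov(h):
        idx = np.minimum(np.round(np.abs(h)).astype(int), len(corr) - 1)
        return sill * corr[idx]

    n = len(obs_x)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(obs_x[:, None] - obs_x[None, :]) + 1e-6 * np.eye(n)  # small nugget
    A[n, n] = 0.0

    out = np.empty(len(grid_x))
    for k, x0 in enumerate(grid_x):
        b = np.append(cov(obs_x - x0), 1.0)
        w = np.linalg.solve(A, b)
        out[k] = w[:n] @ obs_z
    return out

# Toy example: the complete "radar" field supplies the correlogram,
# sparse "gauges" supply the values to be interpolated.
rng = np.random.default_rng(4)
radar = np.convolve(rng.standard_normal(500), np.ones(20) / 20, mode="same")
corr = empirical_correlogram(radar, max_lag=50)
gauge_x = np.array([20.0, 120.0, 260.0, 400.0])
gauge_z = radar[gauge_x.astype(int)] + 0.05 * rng.standard_normal(4)
grid = ordinary_kriging(gauge_x, gauge_z, np.arange(0.0, 500.0, 10.0), corr, gauge_z.var())
```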

Relevance:

10.00%

Publisher:

Abstract:

The fundamental principles of the teaching methodology followed for dyslexic learners evolve around the need for a multisensory approach, which would advocate repetition of learning tasks in an enjoyable way. The introduction of multimedia technologies in the field of education has supported the merging of new tools (digital camera, scanner) and techniques (sounds, graphics, animation) into a meaningful whole. Dyslexic learners are now given the opportunity to express their ideas using these alternative media and to participate actively in the educational process. This paper discusses the preliminary findings of a single case study of two English monolingual dyslexic children working together to create an open-ended multimedia project on a laptop computer. The project aimed to examine whether the multimedia environment could enhance the dyslexic learners' skills in composition. Analysis of the data has indicated that the technological facilities gave the children the opportunity to enhance the style and content of their work for a variety of audiences and to develop responsibilities connected to authorship.