882 results for Branch and bound algorithms


Relevance:

100.00%

Publisher:

Abstract:

Proteomic analysis using electrospray liquid chromatography-mass spectrometry (ESI-LC-MS) has been used to compare the sites of glycation (Amadori adduct formation) and carboxymethylation of RNase and to assess the role of the Amadori adduct in the formation of the advanced glycation end-product (AGE), N-epsilon-(carboxymethyl)lysine (CML). RNase (13.7 mg/mL, 1 mM) was incubated with glucose (0.4 M) at 37 °C for 14 days in phosphate buffer (0.2 M, pH 7.4) under air. On the basis of ESI-LC-MS of tryptic peptides, the major sites of glycation of RNase were, in order, K41, K7, K1, and K37. Three of these, in order, K41, K7, and K37, were also the major sites of CML formation. In other experiments, RNase was incubated under anaerobic conditions (1 mM DTPA, N2-purged) to form Amadori-modified protein, which was then incubated under aerobic conditions to allow AGE formation. Again, the major sites of glycation were, in order, K41, K7, K1, and K37, and the major sites of carboxymethylation were K41, K7, and K37. RNase was also incubated with 1-5 mM glyoxal, substantially more than is formed by autoxidation of glucose under the experimental conditions, but there was only trace modification of lysine residues, primarily at K41. We conclude the following: (1) that the primary route to formation of CML is by autoxidation of Amadori adducts on protein, rather than by glyoxal generated on autoxidation of glucose; and (2) that carboxymethylation, like glycation, is a site-specific modification of protein, affected by neighboring amino acids and bound ligands such as phosphate or phosphorylated compounds. Even when the overall extent of protein modification is low, localization of a high proportion of the modifications at a few reactive sites may have important implications for understanding losses in protein functionality in aging and diabetes, and also for the design of AGE inhibitors.

Discussions on banking reforms to reduce financial exclusion have referred little to possible attitudinal constraints, on the part of staff at both branch and institutional levels, inhibiting the provision of financial services to the poor. The research project, funded by the ESCOR (now Social Science Research) Small Grants Committee, has focused on this aspect of financial exclusion. The research commenced in May 2001 and was completed in April 2002. Profiles of the rural bank branch managers, including personal background, professional background and workplace, are presented. Attitudes of managers toward aspects of their work environment and the rural poor are examined, using results from both quantitative and qualitative analysis. Finally, the emerging policy implications are discussed. These include bank reforms to address human resource management, the work environment, intermediate bank management and organization, and the client interface.

With the latest advances in the area of advanced computer architectures, large-scale machines at the petascale level are already in use, and exascale computing is under discussion. These systems require efficient, scalable algorithms in order to bridge the performance gap. In this paper, examples of various approaches to designing scalable algorithms for such advanced architectures are given, and the corresponding properties of these algorithms are outlined and discussed. The examples cover scalable algorithms applied to large-scale problems in areas such as Computational Biology and Environmental Modelling. The key properties of such advanced and scalable algorithms are outlined.

Two so-called "integrated" polarimetric rate estimation techniques, ZPHI (Testud et al., 2000) and ZZDR (Illingworth and Thompson, 2005), are evaluated using 12 episodes of the year 2005 observed by the French C-band operational Trappes radar, located near Paris. The term "integrated" means that the concentration parameter of the drop size distribution is assumed to be constant over some area and the algorithms retrieve it using the polarimetric variables in that area. The evaluation is carried out in ideal conditions (no partial beam blocking, no ground-clutter contamination, no bright-band contamination, a posteriori calibration of the radar variables ZH and ZDR) using hourly rain gauges located at distances of less than 60 km from the radar. Also included in the comparison, for the sake of benchmarking, is a conventional Z = 282R^1.66 estimator, with and without attenuation correction and with and without adjustment by rain gauges, as currently done operationally at Météo France. Under those ideal conditions, the two polarimetric algorithms, which rely solely on radar data, appear to perform as well as, if not better than, the conventional algorithms, depending on the measurement conditions (attenuation, rain rates, …), even when the latter take rain gauges into account through the adjustment scheme. ZZDR with attenuation correction is the best estimator for hourly rain gauge accumulations lower than 5 mm h−1, and ZPHI is the best one above that threshold. A perturbation analysis has been conducted to assess the sensitivity of the various estimators to biases on ZH and ZDR, taking into account the typical accuracy and stability that can reasonably be achieved with modern operational radars (1 dB on ZH and 0.2 dB on ZDR). A +1 dB positive bias on ZH (radar too hot) results in a +14% overestimation of the rain rate with the conventional estimator used in this study (Z = 282R^1.66), a −19% underestimation with ZPHI, and a +23% overestimation with ZZDR. Additionally, a +0.2 dB positive bias on ZDR results in a typical rain-rate underestimation of 15% by ZZDR.
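The quoted sensitivity to a reflectivity bias follows directly from the power law: since R ∝ Z^(1/1.66), a +1 dB bias multiplies the retrieved rain rate by 10^(0.1/1.66) ≈ 1.149. A minimal sketch of that arithmetic (not from the paper; the 10 mm/h test value is arbitrary):

```python
# Sensitivity of a conventional Z-R estimator (Z = 282 * R**1.66) to a
# reflectivity bias.  Z is linear reflectivity (mm^6 m^-3), R is rain rate
# (mm/h); a +1 dB bias multiplies the linear reflectivity Z by 10**(1/10).

def rain_rate(z_linear):
    """Invert Z = 282 * R**1.66 for the rain rate R."""
    return (z_linear / 282.0) ** (1.0 / 1.66)

z = 282.0 * 10.0 ** 1.66             # reflectivity of a true 10 mm/h rain rate
r_true = rain_rate(z)                # 10 mm/h by construction
r_biased = rain_rate(z * 10 ** 0.1)  # the same measurement, +1 dB too hot

print(f"relative error: {100 * (r_biased / r_true - 1):+.1f}%")  # about +14.9%
```

The +14.9% factor is consistent with the +14% overestimation quoted in the abstract.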

A new database of weather and circulation type catalogs is presented, comprising 17 automated classification methods and five subjective classifications. It was compiled within COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions" in order to evaluate different methods for weather and circulation type classification. This paper gives a technical description of the included methods, using a new conceptual categorization for classification methods that reflects the strategy used to define the types. Methods using predefined types include manual and threshold-based classifications, while methods producing types derived from the input data include those based on eigenvector techniques, leader algorithms and optimization algorithms. In order to allow direct comparisons between the methods, the circulation input data and the methods' configuration were harmonized to produce a subset of standard catalogs for the automated methods. The harmonization covers the data source, the climatic parameters used, the classification period, as well as the spatial domain and the number of types. Frequency-based characteristics of the resulting catalogs are presented, including variation of class sizes, persistence, seasonal and inter-annual variability, as well as trends of the annual frequency time series. The methodological concept of the classifications is partly reflected by these properties of the resulting catalogs. It is shown that the types of subjective classifications, compared to automated methods, show higher persistence, inter-annual variation and long-term trends. Among the automated classifications, optimization methods show a tendency toward longer persistence and higher seasonal variation. However, it is also concluded that the distance metric used and the data preprocessing play at least as important a role for the properties of the resulting classification as the algorithm used for type definition and assignment.

Bran is hygroscopic and competes actively for water with other key components in baked cereal products, such as starch and gluten. Thermogravimetric analysis (TGA) of flour–water mixtures enriched with bran at different incorporation levels was performed to characterise the release of compartmentalised water. TGA investigations showed that the presence of bran increased compartmentalised water, with total water loss increasing from 58.30 ± 1.93% for flour-only systems to 71.80 ± 0.37% in formulations comprising 25% w/w bran. Deconvolution of TGA profiles showed an alteration of the distribution of free and bound water, and of its interaction with starch and gluten, within the formulations. TGA profiles showed that water release from bran-enriched flour is a prolonged event with respect to release from non-enriched flour, which suggests that bran may interrupt the normal characteristic processes of texture formation that occur in non-enriched products.

With President Truman’s ‘Campaign of Truth’ in the Fifties, Voice of America (VOA) established itself as one of the most important information programmes of the US government. The 20 million dollar budget allocated to VOA in those years enabled it to employ about 1,900 people and to broadcast in 45 different languages. Italy, with its strong and threatening Communist Party, was one of VOA’s main targets. Audience research however (performed by the United States Information Agency’s Italian branch and by the Italian opinion poll company Doxa) shows that the Italians always preferred their own national network RAI. The US government therefore started to target the RAI, with the aim of placing VOA-produced programmes directly on the Italian network in order to reach a mass audience. This article looks into what went on both ‘on’ and ‘off the air’, analyzing how various Italian ‘target groups’ were addressed by VOA. Drawing on documents from the National Archives and Records Administration in both Washington DC and New York City, and from the Doxa archives in Milan, the study examines how the American government prepared itself to conquer the Italian network RAI.

This article looks at the controversial music genre Oi! in relation to youth cultural identity in late 1970s and early 1980s Britain. As a form of British punk associated with skinheads, Oi! has often been dismissed as racist and bound up in the politics of the far right. It is argued here, however, that such a reading is too simplistic and ignores the more complex politics contained both within Oi! and within the various youth cultural currents that revolved around the term 'punk' at this time. Taking as its starting point the Centre for Contemporary Cultural Studies' conception of youth culture as a site of potential 'resistance', the article explores the substance and motifs of Oi!'s protest to locate its actual and perceived meaning within a far wider political and socio-economic context. More broadly, it seeks to demonstrate the value of historians examining youth culture as a formative and contested socio-cultural space within which young people discover, comprehend, and express their desires, opinions, and disaffections.

In the concluding paper of this tetralogy, we use the different geomagnetic activity indices to reconstruct the near-Earth interplanetary magnetic field (IMF) and solar wind flow speed, as well as the open solar flux (OSF), from 1845 to the present day. The differences in how the various indices vary with near-Earth interplanetary parameters, which are here exploited to separate the effects of the IMF and solar wind speed, are shown to be statistically significant at the 93% level or above. Reconstructions are made using four combinations of different indices, compiled using different data and different algorithms, and the results are almost identical for all parameters. The correction required to the aa index is discussed by comparison with the Ap index from a more extensive network of mid-latitude stations. Data from the Helsinki magnetometer station are used to extend the aa index back to 1845, and the results are confirmed by comparison with the nearby St Petersburg observatory. The optimum variations, using all available long-term geomagnetic indices, of the near-Earth IMF and solar wind speed, and of the open solar flux, are presented, all with ±2σ uncertainties computed using the Monte Carlo technique outlined in the earlier papers. The open solar flux variation derived is shown to be very similar indeed to that obtained using the method of Lockwood et al. (1999).

Satellite data are increasingly used to provide observation-based estimates of the effects of aerosols on climate. The Aerosol-cci project, part of the European Space Agency's Climate Change Initiative (CCI), was designed to provide essential climate variables for aerosols from satellite data. Eight algorithms, developed for the retrieval of aerosol properties using data from AATSR (4), MERIS (3) and POLDER, were evaluated to determine their suitability for climate studies. The primary result from each of these algorithms is the aerosol optical depth (AOD) at several wavelengths, together with the Ångström exponent (AE), which describes the spectral variation of the AOD for a given wavelength pair. Other aerosol parameters that may also be retrievable from satellite observations are not considered in this paper. The AOD and AE (AE only for Level 2) were evaluated against independent collocated observations from the ground-based AERONET sun photometer network and against "reference" satellite data provided by MODIS and MISR. Tools used for the evaluation were developed for daily products as produced by the retrieval with a spatial resolution of 10 × 10 km2 (Level 2) and daily or monthly aggregates (Level 3). These tools include statistics for L2 and L3 products compared with AERONET, as well as scoring based on spatial and temporal correlations. In this paper we describe their use in a round robin (RR) evaluation of four months of data, one month for each season in 2008. The amount of data was restricted to only four months because of the large effort made to improve the algorithms, and to evaluate the improvement and current status before larger data sets are processed. Evaluation criteria are discussed. Results presented show the current status of the European aerosol algorithms in comparison to both AERONET and MODIS and MISR data.
The comparison leads to a preliminary conclusion that the scores are similar, including those for the references, but the coverage of AATSR needs to be enhanced and further improvements are possible for most algorithms. None of the algorithms, including the references, outperforms all others everywhere. AATSR data can be used for the retrieval of AOD and AE over land and ocean. PARASOL and one of the MERIS algorithms have been evaluated over ocean only and both algorithms provide good results.
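For reference, the Ångström exponent mentioned above is defined for a wavelength pair by AE = −ln(τ₁/τ₂)/ln(λ₁/λ₂), where τ is the AOD at wavelength λ. A minimal sketch with hypothetical AOD values (the 440/870 nm pair and the numbers are illustrative, not taken from the paper):

```python
import math

def angstrom_exponent(aod_1, aod_2, wavelength_1, wavelength_2):
    """Ångström exponent AE for a wavelength pair, assuming AOD(λ) ∝ λ**(-AE)."""
    return -math.log(aod_1 / aod_2) / math.log(wavelength_1 / wavelength_2)

# Hypothetical AOD values at 440 nm and 870 nm (a common sun-photometer pair).
ae = angstrom_exponent(0.30, 0.12, 440.0, 870.0)
print(f"AE = {ae:.2f}")  # AE = 1.34, a fine-mode-dominated spectrum
```

Larger AE values indicate stronger spectral decrease of AOD, i.e. smaller particles.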

Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus.
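The paper's algorithms are not reproduced in the abstract; as a toy illustration of why gossip-style dissemination needs only a logarithmic number of cycles, here is a minimal push-gossip simulation (the function, its parameters, and the rumor model are my own assumptions, not the proposed failure-detection algorithms):

```python
import math
import random

def gossip_cycles(n_processes, fanout=1, seed=0):
    """Count push-gossip cycles until every process has heard a rumor
    started by process 0 (a toy model of gossip dissemination)."""
    rng = random.Random(seed)
    informed = {0}
    cycles = 0
    while len(informed) < n_processes:
        cycles += 1
        # Each currently informed process pushes the rumor to `fanout` peers.
        pushes = len(informed) * fanout
        for _ in range(pushes):
            informed.add(rng.randrange(n_processes))
    return cycles

# The cycle count grows roughly like log2(n) as the system size n increases.
for n in (64, 256, 1024, 4096):
    print(n, gossip_cycles(n), math.ceil(math.log2(n)))
```

With fanout 1, the informed set can at most double per cycle, so at least log2(n) cycles are needed; the random-target tail adds a further logarithmic term, mirroring the logarithmic scaling reported in the paper.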

The states of an electron confined in a two-dimensional (2D) plane and bound to an off-plane donor impurity center, in the presence of a magnetic field, are investigated. The energy levels of the ground state and the first three excited states are calculated variationally. The binding energy and the mean orbital radius of these states are obtained as a function of the donor center position and the magnetic field strength. The limiting cases are discussed for an in-plane donor impurity (i.e. a 2D hydrogen atom) as well as for the donor center far away from the 2D plane in strong magnetic fields, which corresponds to a 2D harmonic oscillator.
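As a sketch of the variational approach in the in-plane limit (the 2D hydrogen atom, zero field), a trial wavefunction ψ(r) ∝ e^(−ar) gives the energy expectation ⟨H⟩(a) = a²/2 − 2a in atomic units, minimized at a = 2 with E₀ = −2 hartree, the known 2D ground-state energy; the paper's full off-plane, finite-field calculation is more involved and is not reproduced here:

```python
# Variational sketch for the in-plane limit (a 2D hydrogen atom, atomic units).
# For the trial wavefunction psi(r) ∝ exp(-a*r), the energy expectation is
#   E(a) = a**2 / 2 - 2*a,
# minimized at a = 2, which recovers the exact 2D ground-state energy of -2.

def energy(a):
    """Variational energy E(a) = <T> + <V> for the exp(-a*r) trial state."""
    return a * a / 2.0 - 2.0 * a

# A crude grid scan in place of the analytic minimization.
best_a = min((k / 1000.0 for k in range(1, 5001)), key=energy)
print(best_a, energy(best_a))  # 2.0 -2.0
```

The −2 hartree result (four times the 3D binding energy) is the standard benchmark against which the off-plane limiting case can be checked.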

In recent years, the number of industrial applications for Augmented Reality (AR) and Virtual Reality (VR) environments has significantly increased. Optical tracking systems are an important component of AR/VR environments. In this work, a low-cost optical tracking system with attributes adequate for professional use is proposed. The system works in the infrared spectral region to reduce optical noise. A high-speed camera, equipped with a daylight-blocking filter and infrared flash strobes, transfers uncompressed grayscale images to a regular PC, where image pre-processing software and the PTrack tracking algorithm recognize a set of retro-reflective markers and extract their 3D position and orientation. Also included in this work is a comprehensive study of image pre-processing and tracking algorithms. A testbed was built to perform accuracy and precision tests. Results show that the system reaches accuracy and precision levels slightly worse than, but still comparable to, professional systems. Due to its modularity, the system can be expanded by using several one-camera tracking modules linked by a sensor fusion algorithm, in order to obtain a larger working range. A setup with two modules was built and tested, resulting in performance similar to that of the stand-alone configuration.

The growth of maize (Zea mays L.) kernels depends on the availability of carbon (C) and nitrogen (N) assimilates supplied by the mother plant and the capacity of the kernel to use them. Our objectives were to study the effects of N and sucrose supply levels on growth and metabolism of maize kernels. Kernel explants of Pioneer 34RO6 were cultured in vitro with varying combinations of N (5 to 30 mM) and sucrose (117 to 467 mM). Maximum kernel growth was obtained with 10 mM N and 292 mM sucrose in the medium, and a deficiency of one assimilate could not be overcome by a sufficiency of the other. Increasing the N supply led to increases in the kernel sink capacity (number of cells and starch granules in the endosperm), the activity of certain enzymes (soluble and bound invertases, sucrose synthase, and aspartate aminotransferase), starch, and the levels of N compounds (total-N, soluble protein, and free amino acids), and decreased the levels of C metabolites (sucrose and reducing sugars). Conversely, increasing the sucrose supply increased the level of endosperm C metabolites, free amino acids, and ADPG-PPase and alanine transaminase activities, but decreased the activity of soluble invertase and the concentrations of soluble protein and total-N. Thus, while C and N are interdependent and essential for accumulation of maximum kernel weight, they appear to regulate growth by different means. Nitrogen supply aids the establishment of kernel sink capacity and promotes the activity of enzymes relating to sucrose and nitrogen uptake, while sucrose regulates the activities of invertase and ADPG-PPase. (C) 1999 Annals of Botany Company.

In this work, a performance analysis of transmission schemes employing turbo trellis-coded modulation is presented. In general, the performance analysis of such schemes is guided by evaluating their error probability. The exact evaluation of this probability is very complex and computationally inefficient; a widely used alternative is the union bound on the error probability, because of its easy implementation and because it produces bounds that converge quickly. Since it is a union bound, some elements of the distance spectrum should be expurgated to obtain a tight bound. The main contribution of this work is that the proposed expurgation is carried out from puncturing at the symbol level rather than at the bit level, as in most works in the literature. The main reason for using symbol-level puncturing lies in the fact that the enumerating function of the turbo scheme is obtained directly from the complex signal sequences through the trellis, and not indirectly from binary sequences that require a further binary-to-complex mapping, as proposed by previous works. Thus, matrix algorithms can be applied starting from the adjacency matrix, which is obtained by calculating the distances of the complex sequences of the trellis. This work also presents two matrix algorithms, one for state reduction and one for the evaluation of the resulting transfer function. The results, comparing the bounds obtained using the proposed technique with some turbo codes from the literature, corroborate the proposition of this paper: the expurgated bounds obtained are quite tight, and the matrix algorithms are easily implemented in any programming language.
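As a generic illustration of how a union bound is evaluated once a distance spectrum is in hand (the spectrum below is made up, and the paper's symbol-level expurgation and matrix transfer-function algorithms are not reproduced), one sums multiplicity-weighted Gaussian tail terms over the spectrum:

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound(spectrum, rate, ebno_db):
    """Union bound on error probability over an AWGN channel from a
    distance spectrum {distance d: multiplicity A_d} at code rate `rate`."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return sum(a_d * q_function(math.sqrt(2.0 * d * rate * ebno))
               for d, a_d in spectrum.items())

# A hypothetical (made-up) distance spectrum, for illustration only.
spectrum = {10: 3, 12: 8, 14: 25}
for ebno_db in (1.0, 2.0, 3.0):
    print(ebno_db, union_bound(spectrum, rate=1 / 3, ebno_db=ebno_db))
```

Expurgating spectral lines that cannot occur (as the paper does via symbol-level puncturing) removes terms from this sum and therefore tightens the bound.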