975 results for Global Processing Speed


Relevance:

30.00%

Publisher:

Abstract:

Thermal stability of nanograined metals can be difficult to attain due to the large driving force for grain growth that arises from the significant boundary area constituted by the nanostructure. Kinetic approaches for stabilization of the nanostructure that are effective at low homologous temperatures often fail at higher homologous temperatures; thermodynamic approaches may offer higher-temperature stability. In this research, modest alloying of aluminum with solute (1 at.% Sc, Yb, or Sr) was examined as a means to thermodynamically stabilize a bulk nanostructure at elevated temperatures. After using melt-spinning and ball-milling to create an extended solid solution and a nanostructure with average grain size on the order of 30-45 nm, 1 h annealing treatments at 673 K (0.72 Tm), 773 K (0.83 Tm), and 873 K (0.94 Tm) were applied. The alloys remain nanocrystalline (<100 nm) as measured by Warren-Averbach Fourier analysis of x-ray diffraction peaks and direct observation of TEM dark-field micrographs, with the efficacy of stabilization ordered Sr>Yb>Sc. Disappearance of intermetallic phases in the Sr and Yb alloys in the x-ray diffraction spectra is observed to occur coincident with the stabilization after annealing, suggesting that precipitates dissolve and the boundaries are enriched with solute. Melt-spinning has also been shown to be an effective process to produce a class of ordered but non-periodic crystals called quasicrystals. However, many of the factors related to the creation of quasicrystals through melt-spinning are not optimized for specific chemistries and alloy systems. In a related but separate aspect of this research, melt-spinning was utilized to create metastable quasicrystalline Al6Mn in an α-Al matrix through rapid solidification of Al-8Mn (by mol) and Al-10Mn (by mol) alloys. Wheel speed of the melt-spinning wheel and orifice diameter of the tube reservoir were varied to determine their effect on the volume proportions of the resulting phases, using integrated areas of collected x-ray diffraction spectra. The data were then used to extrapolate parameters for the Al-10Mn alloy, which consistently produced Al6Mn quasicrystal with almost complete suppression of the equilibrium orthorhombic Al6Mn phase.
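The homologous temperatures quoted above can be checked directly; a quick sketch, assuming the melting point of pure aluminum, Tm ≈ 933 K (which modest alloying shifts only slightly):

```python
# Homologous temperature T/Tm for the three annealing treatments,
# assuming Tm of pure aluminum (933.47 K); alloying shifts this slightly.
TM_AL = 933.47  # K

for t_anneal in (673, 773, 873):
    print(t_anneal, "K ->", round(t_anneal / TM_AL, 2), "Tm")  # 0.72, 0.83, 0.94 Tm
```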


Bronchial epithelial cells play a pivotal role in airway inflammation, but little is known about posttranscriptional regulation of mediator gene expression during the inflammatory response in these cells. Here, we show that activation of human bronchial epithelial BEAS-2B cells by proinflammatory cytokines interleukin-4 (IL-4) and tumor necrosis factor alpha (TNF-alpha) leads to an increase in the mRNA stability of the key chemokines monocyte chemotactic protein 1 and IL-8, an elevation of the global translation rate, an increase in the levels of several proteins critical for translation, and a reduction of microRNA-mediated translational repression. Moreover, using the BEAS-2B cell system and a mouse model, we found that RNA processing bodies (P bodies), cytoplasmic domains linked to storage and/or degradation of translationally silenced mRNAs, are significantly reduced in activated bronchial epithelial cells, suggesting a physiological role for P bodies in airway inflammation. Our study reveals an orchestrated change among posttranscriptional mechanisms, which help sustain high levels of inflammatory mediator production in bronchial epithelium during the pathogenesis of inflammatory airway diseases.


Currently, dramatic changes are happening in the IS development industry. The incumbent system developers (hubs) are embracing partnerships with less well-established companies (spokes) acting in specific niches. This paper seeks to establish a better understanding of the motives for this strategy. Relying on existing work on strategic alliance formation, it is argued that partnering is particularly attractive if these small companies possess certain capabilities that are difficult to obtain through arrangements other than partnering. Again drawing on the literature, three categories of capabilities are identified: the capability to innovate within their niche, the capability to provide a specific functionality that can be integrated with the incumbents’ systems, and the capability to address novel markets. These factors are analyzed through a case study. The case represents a market leader in the global IS development industry which fosters a network of smaller partner firms. The study reveals that temporal dynamics between the identified factors play a dominant role in these networks. A cyclical partnership model is developed that attempts to explain the life cycle of a partnership within such a network.


Many observed time series of the global radiosonde or PILOT networks exist as fragments distributed over different archives. Identifying and merging these fragments can enhance their value for studies on the three-dimensional spatial structure of climate change. The Comprehensive Historical Upper-Air Network (CHUAN version 1.7), which was substantially extended in 2013, and the Integrated Global Radiosonde Archive (IGRA) are the most important collections of upper-air measurements taken before 1958. CHUAN (tracked) balloon data start in 1900, with higher numbers from the late 1920s onward, whereas IGRA data start in 1937. However, a substantial fraction of those measurements were not taken at synoptic times (preferably 00:00 or 12:00 GMT) and were recorded on altitude levels instead of standard pressure levels. To make them comparable with more recent data, the records have been brought to synoptic times and standard pressure levels using state-of-the-art interpolation techniques, employing geopotential information from the National Oceanic and Atmospheric Administration (NOAA) 20th Century Reanalysis (NOAA 20CR). From 1958 onward the European Re-Analysis archives (ERA-40 and ERA-Interim) available at the European Centre for Medium-Range Weather Forecasts (ECMWF) are the main data sources. These are easier to use, but pilot data still have to be interpolated to standard pressure levels. Fragments of the same records distributed over different archives have been merged, if necessary, taking care that the data remain traceable back to their original sources. Where possible, station IDs assigned by the World Meteorological Organization (WMO) have been allocated to the station records. For records which have never been identified by a WMO ID, a local ID above 100 000 has been assigned. The merged data set contains 37 wind records longer than 70 years and 139 temperature records longer than 60 years. It can be seen as a useful basis for further data processing steps, most notably homogenization and gridding, after which it should be a valuable resource for climatological studies. Homogeneity adjustments for wind using the NOAA 20CR as a reference are described in Ramella Pralungo and Haimberger (2014). Reliable homogeneity adjustments for temperature beyond 1958 using a surface-data-only reanalysis such as the NOAA 20CR as a reference have yet to be created. All the archives and metadata files are available in ASCII and netCDF format in the PANGAEA archive.
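As an illustration of the interpolation step described above, here is a minimal sketch, hypothetical and not the actual processing code (and ignoring the NOAA 20CR geopotential information), of bringing an altitude-level sounding onto standard pressure levels by linear interpolation in log-pressure:

```python
import math

# Hypothetical sketch: interpolate an altitude-level temperature sounding
# onto standard pressure levels, linearly in log(p), since many quantities
# vary roughly linearly with the logarithm of pressure.
def to_standard_levels(p_obs, t_obs, p_std):
    """p_obs: observed pressures (hPa, decreasing with height);
    t_obs: temperatures at those pressures; p_std: target levels."""
    out = {}
    pairs = list(zip(p_obs, t_obs))
    for p in p_std:
        for (p1, t1), (p2, t2) in zip(pairs, pairs[1:]):
            if p2 <= p <= p1:  # target level bracketed by two observations
                w = (math.log(p) - math.log(p1)) / (math.log(p2) - math.log(p1))
                out[p] = t1 + w * (t2 - t1)
                break
    return out

sounding_p = [1000.0, 900.0, 700.0, 450.0]   # hPa
sounding_t = [15.0, 9.0, -5.0, -25.0]        # degC
print(to_standard_levels(sounding_p, sounding_t, [850.0, 500.0]))
```

Levels outside the observed pressure range are simply left out rather than extrapolated, which mirrors the cautious handling such archives require.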


In the present study we introduce a novel task for the quantitative assessment of both the originality and the speed of individual associations. This 'BAG' (Bridge-the-Associative-Gap) task was used to investigate the relationships between creativity and paranormal belief. Twelve strong 'believers' and 12 strong 'skeptics' in paranormal phenomena were selected from a large student population (n > 350). Subjects were asked to produce single-word associations to word pairs. In 40 trials the two stimulus words were semantically indirectly related, and in 40 other trials the words were semantically unrelated. Separately for these two stimulus types, response commonalities and association latencies were calculated. The main finding was that for unrelated stimuli, believers produced associations that were more original (had a lower frequency of occurrence in the group as a whole) than those of the skeptics. For the interpretation of this result we propose a model of association behavior that captures both 'positive' psychological aspects (i.e., verbal creativity) and 'negative' aspects (susceptibility to unfounded inferences), and outline its relevance for psychiatry. This model suggests that believers adopt a looser response criterion than skeptics when confronted with 'semantic noise'. Such a signal-detection view of judgments about the presence or absence of loose semantic relations may help to elucidate the commonalities between creative thinking, paranormal belief and delusional ideation.
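The commonality measure described above can be sketched as follows; this is a hypothetical simplification of the scoring, in which an association counts as more original the lower its frequency of occurrence in the whole group's responses:

```python
from collections import Counter

# Hypothetical simplification of commonality scoring: originality of a
# response = 1 minus its relative frequency among the group's responses.
def originality(responses):
    freq = Counter(responses)
    n = len(responses)
    return {word: 1 - freq[word] / n for word in freq}

group = ["bridge", "bridge", "bridge", "water", "dream"]
scores = originality(group)
print(scores["dream"] > scores["bridge"])  # True: the rarer response is more original
```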


BACKGROUND: The inability to predict the therapeutic effect of a drug in individual pain patients prolongs the process of drug and dose finding until satisfactory pharmacotherapy can be achieved. Many chronic pain conditions are associated with hypersensitivity of the nervous system or impaired endogenous pain modulation. Pharmacotherapy often aims at influencing these disturbed nociceptive processes, so its effect might depend on the extent to which they are altered. Quantitative sensory testing (QST) can evaluate various aspects of pain processing and might therefore be able to predict the analgesic efficacy of a given drug. In the present study three drugs commonly used in the pharmacological management of chronic low back pain are investigated. The primary objective is to examine the ability of QST to predict pain reduction. As a secondary objective, the analgesic effects of these drugs and their effects on QST are evaluated. METHODS/DESIGN: In this randomized, double-blind, placebo-controlled cross-over study, patients with chronic low back pain are randomly assigned to imipramine, oxycodone or clobazam versus active placebo. QST is assessed at baseline and 1 and 2 h after drug administration. Pain intensity, side effects and patients' global impression of change are assessed at intervals of 30 min up to two hours after drug intake. Baseline QST is used as an explanatory variable to predict drug effect. The change in QST over time is analyzed to describe the pharmacodynamic effects of each drug on experimental pain modalities. Genetic polymorphisms are analyzed as co-variables. DISCUSSION: Pharmacotherapy is a mainstay of chronic pain treatment. Antidepressants, anticonvulsants and opioids are frequently prescribed in a "trial and error" fashion, without knowing, however, which drug best suits which patient. The present study addresses the important need to translate recent advances in pain research into clinical practice. Assessing the predictive value of central hypersensitivity and endogenous pain modulation could allow for the implementation of a mechanism-based treatment strategy in individual patients. TRIAL REGISTRATION: Clinicaltrials.gov, NCT01179828.
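The primary analysis, baseline QST as an explanatory variable for pain reduction, amounts to regressing drug effect on baseline QST. A minimal sketch with simulated numbers (not study data; the actual analysis also handles co-variables such as genetic polymorphisms):

```python
import random

# Simulated illustration (not study data): regress drug effect on baseline
# QST with ordinary least squares; a clearly non-zero slope would mean that
# baseline QST carries predictive information about pain reduction.
def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

random.seed(1)
baseline_qst = [random.gauss(50, 10) for _ in range(40)]   # e.g. pain thresholds
pain_reduction = [0.3 * (q - 50) + random.gauss(0, 2) for q in baseline_qst]
print(round(ols_slope(baseline_qst, pain_reduction), 2))   # close to the true slope 0.3
```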


We calculate the anomalous dimensions of operators with large global charge J in certain strongly coupled conformal field theories in three dimensions, such as the O(2) model and the supersymmetric fixed point with a single chiral superfield and a W = Φ³ superpotential. Working in a 1/J expansion, we find that the large-J sector of both examples is controlled by a conformally invariant effective Lagrangian for a Goldstone boson of the global symmetry. For both these theories, we find that the lowest state with charge J is always a scalar operator whose dimension Δ_J satisfies the sum rule J²Δ_J − (J²/2 + J/4 + 3/16)Δ_{J−1} − (J²/2 − J/4 + 3/16)Δ_{J+1} = 0.04067, up to corrections that vanish at large J. The spectrum of low-lying excited states is also calculable explicitly: for example, the second-lowest primary operator has spin two and dimension Δ_J + √3. In the supersymmetric case, the dimensions of all half-integer-spin operators lie above the dimensions of the integer-spin operators by a gap of order J^{+1/2}. The propagation speeds of the Goldstone waves and heavy fermions are 1/√2 and ±1/2 times the speed of light, respectively. These values, including the negative one, are necessary for the consistent realization of the superconformal symmetry at large J.
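The structure of the sum rule can be checked numerically. Assuming a large-J expansion Δ_J = c·J^(3/2) + d (c and d arbitrary here; this is a consistency sketch, not the paper's calculation), the left-hand side tends to −3d/8 as J grows, so the constant 0.04067 pins down the universal J⁰ term of Δ_J:

```python
from decimal import Decimal, getcontext

# Consistency sketch (not from the paper): with Delta_J = c*J**(3/2) + d,
# the left-hand side of the sum rule tends to -3*d/8 at large J.
getcontext().prec = 60  # the sum rule is a tiny difference of huge terms,
                        # so float64 would suffer catastrophic cancellation

def delta(J, c, d):
    return c * J * J.sqrt() + d          # c*J**(3/2) + d

def lhs(J, c, d):
    A = J * J / 2 + J / 4 + Decimal(3) / 16
    B = J * J / 2 - J / 4 + Decimal(3) / 16
    return J * J * delta(J, c, d) - A * delta(J - 1, c, d) - B * delta(J + 1, c, d)

c, d = Decimal(1), Decimal("-0.1")
print(lhs(Decimal(10) ** 8, c, d))  # approaches -3*d/8 = 0.0375
```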


Uncertainty information for global leaf area index (LAI) products is important for global modeling studies but is usually difficult to obtain systematically at a global scale. Here, we present a new method that cross-validates existing global LAI products and produces consistent uncertainty information. The method is based on a triple collocation error model (TCEM) that assumes errors among LAI products are not correlated. Global monthly absolute and relative uncertainties, at 0.05° spatial resolution, were generated for the MODIS, CYCLOPES, and GLOBCARBON LAI products, with reasonable agreement in terms of spatial patterns and biome types. CYCLOPES shows the lowest absolute and relative uncertainties, followed by GLOBCARBON and MODIS. Grasses, crops, shrubs, and savannas usually have lower uncertainties than forests, in association with the relatively larger forest LAI. With their densely vegetated canopies, tropical regions exhibit the highest absolute uncertainties but the lowest relative uncertainties, the latter of which tend to increase with higher latitudes. The estimated uncertainties of CYCLOPES generally meet the quality requirement (±0.5) proposed by the Global Climate Observing System (GCOS), whereas for MODIS and GLOBCARBON only non-forest biome types meet the requirement. Nevertheless, none of the products seems to be within the relative uncertainty requirement of 20%. Further independent validation and comparative studies are expected to provide a fair assessment of uncertainties derived from TCEM. Overall, the proposed TCEM is straightforward and could be automated for the systematic processing of real-time remote sensing observations to provide theoretical uncertainty information for a wider range of land products.
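The core idea of a triple collocation error model is that, for three products observing the same truth with independent zero-mean errors, the error variance of each product follows from pairwise covariances alone. A minimal sketch with synthetic data (illustrative, not the paper's implementation, which also handles rescaling between products):

```python
import random

# Illustrative triple collocation sketch: products x, y, z observe the same
# truth with independent zero-mean errors, so the error variance of x is
#   Var_err(x) = Var(x) - Cov(x,y) * Cov(x,z) / Cov(y,z)
def cov(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n

random.seed(0)
n = 100_000
truth = [random.gauss(3.0, 1.0) for _ in range(n)]      # synthetic "true LAI"
x = [t + random.gauss(0, 0.2) for t in truth]
y = [t + random.gauss(0, 0.3) for t in truth]
z = [t + random.gauss(0, 0.4) for t in truth]

err_var_x = cov(x, x) - cov(x, y) * cov(x, z) / cov(y, z)
print(round(err_var_x, 3))  # close to the true error variance 0.2**2 = 0.04
```

The same formula, with the roles of x, y, z permuted, yields the error variances of the other two products.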


At present there is a lack of knowledge on the interannual climate-related variability of zooplankton communities of the tropical Atlantic, central Mediterranean Sea, Caspian Sea, and Aral Sea, due to the absence of appropriate databases. In the mid latitudes, the North Atlantic Oscillation (NAO) is the dominant mode of atmospheric fluctuation over eastern North America, the northern Atlantic Ocean and Europe. Therefore, one of the issues that needs to be addressed through data synthesis is the evaluation of interannual patterns in species abundance and species diversity over these regions with regard to the NAO. The database has been used to investigate the ecological role of the NAO in interannual variations of mesozooplankton abundance and biomass along the zonal array of the NAO influence. The basic approach of the proposed research involved: (1) development of co-operation between experts and data holders in Ukraine, Russia, Kazakhstan, Azerbaijan, the UK, and the USA to rescue and compile the oceanographic data sets and release them on CD-ROM; (2) organization and compilation of a database based on FSU cruises to the above regions; (3) analysis of the basin-scale interannual variability of zooplankton species abundance, biomass, and species diversity.


Supply chain management works to bring the supplier, the distributor, and the customer into one cohesive process. The Supply Chain Council defined the supply chain as 'the flow and transformation of raw materials into products from suppliers through production and distribution facilities to the ultimate consumer', and Chopra and Meindl (2001) define supply chain management as involving 'the flows between and among stages in a supply chain to maximize total profitability'. After 1950, supply chain management got a boost, with the production and manufacturing sector receiving the most attention. Inventory became the responsibility of the marketing, accounting and production areas, while order processing was part of accounting and sales. Supply chain management has become one of the most powerful engines of business transformation: it is the one area where operational efficiency can be gained, reducing organizational costs and enhancing customer service. With the liberalization of world trade, globalization, and the emergence of new markets, many organizations have customers and competitors throughout the world, either directly or indirectly. Business communities are aware that global competitiveness is the key to the success of a business. Competitiveness is the ability to produce, distribute and provide products and services for the open market in competition with others. The supply chain, a critical link between supplier, producer and customer, has now emerged as an essential business process, a strategic lever, a potential value contributor and a differentiator for the success of any business. Supply chain management is the management of all internal and external processes or functions to satisfy a customer's order, from raw materials through conversion and manufacture to logistics and delivery. Whether goods are raw or processed, in wholesale or retail distribution, or in business or technology services, in everyday life, in business or in the household, directly or indirectly, the supply chain is ubiquitously associated with expanding socio-economic development. Supply chain growth drives competitive performance and supports strong growth at both the micro- and macroeconomic levels. Keeping the India vision at the core of the objective, the role of the supply chain is to take up socio-economic challenges, improve competitive advantage, develop strategies, build capabilities, enhance value propositions, adopt the right technology, collaborate with stakeholders and deliver environmentally sustainable outcomes with minimum resources.


This paper outlines the problems found in the parallelization of SPH (Smoothed Particle Hydrodynamics) algorithms using Graphics Processing Units. Results of several parallel GPU implementations are shown in terms of speed-up and scalability compared with sequential CPU codes. The most problematic stage in GPU-SPH algorithms is the one responsible for locating neighboring particles and building the vectors where this information is stored, since these specific algorithms raise many difficulties for data-level parallelization. Because neighbor location using linked lists does not expose enough data-level parallelism, two new approaches have been proposed to minimize bank conflicts in the writing and subsequent reading of the neighbor lists. The first strategy proposes an efficient CPU-GPU coordination, using GPU algorithms for those stages that allow a straightforward parallelization and sequential CPU algorithms for those instructions that involve some kind of vector reduction. This coordination provides a relatively orderly reading of the neighbor lists in the interactions stage, achieving a speed-up factor of x47 in that stage. However, since the construction of the neighbor lists is quite expensive, the overall speed-up achieved is x41. The second strategy seeks to maximize the use of the GPU in the neighbor-location process by executing a specific vector sorting algorithm that allows some data-level parallelism. Although this strategy has succeeded in improving the speed-up of the neighbor-location stage, the global speed-up of the interactions stage falls, due to inefficient reading of the neighbor vectors. Some changes to these strategies are proposed, aimed at maximizing the computational load of the GPU and exploiting the GPU texture units, in order to reach the maximum speed-up for such codes. Different practical applications have been added to the GPU codes mentioned. First, the classical dam-break problem is studied. Second, the wave impact of the sloshing fluid contained in LNG vessel tanks is simulated as a practical example of particle methods.
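The neighbor-location stage discussed above is typically built on a cell-linked list: particles are binned into cells of side h (the smoothing length) so that neighbor candidates come only from a particle's own and adjacent cells. A minimal serial sketch (illustrative only; the paper's point is precisely that the data-parallel GPU version of this is hard to do well):

```python
from collections import defaultdict

# Minimal 2D cell-linked-list neighbor search: bin particles into cells of
# side h, then search only the 3x3 block of cells around each particle.
def build_cells(positions, h):
    cells = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        cells[(int(x // h), int(y // h))].append(i)
    return cells

def neighbors(i, positions, cells, h):
    x, y = positions[i]
    cx, cy = int(x // h), int(y // h)
    result = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in cells.get((cx + dx, cy + dy), []):
                if j != i:
                    px, py = positions[j]
                    if (px - x) ** 2 + (py - y) ** 2 <= h * h:
                        result.append(j)
    return result

pos = [(0.1, 0.1), (0.15, 0.12), (0.9, 0.9)]
cells = build_cells(pos, 0.2)
print(neighbors(0, pos, cells, 0.2))  # [1] - only the nearby particle
```

In a GPU version, many threads append to the same cell's list concurrently, which is exactly where the bank conflicts discussed above arise.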


In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. As a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions, such as the number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and easily works with different image formats while respecting the real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
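The PU/Central Unit split can be sketched as follows; the class names and the dummy detection step are hypothetical, not the paper's API. Each PU handles acquisition and processing for one camera, while the Central Unit starts, supervises and collects from the PUs:

```python
import queue
import threading

# Hypothetical sketch of the PU / Central Unit split (not the paper's API):
# each Processing Unit handles acquisition + processing for one camera and
# pushes its results to the Central Unit's shared queue.
class ProcessingUnit(threading.Thread):
    def __init__(self, cam_id, frames, results):
        super().__init__()
        self.cam_id, self.frames, self.results = cam_id, frames, results

    def run(self):
        for frame in self.frames:                       # acquisition phase
            detections = [p for p in frame if p > 128]  # dummy 2D "detector"
            self.results.put((self.cam_id, len(detections)))  # processing result

results = queue.Queue()
frames = [[100, 200, 250], [50, 130]]                   # two dummy frames
pus = [ProcessingUnit(cam, frames, results) for cam in range(3)]
for pu in pus:      # Central Unit: start and supervise the PUs
    pu.start()
for pu in pus:
    pu.join()
print(results.qsize())  # 6: two frames from each of three cameras
```

Scaling to more cameras then means adding PUs (or whole hosts running PUs) without touching the Central Unit's collection logic.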


To evaluate the effect of speed limits on narrow streets in local residential areas in large cities, real traffic tests for pollutant emissions and fuel consumption have been carried out in Madrid city centre. Emission concentrations and car activity were simultaneously measured by a Portable Emissions Measurement System. Real-life tests carried out at different times and on different days were performed with a light vehicle with a turbo-diesel engine equipped with an oxidation catalyst, using different driving styles with a previously trained driver. The results show that by reducing the speed limit from 50 km/h to 30 km/h, using a normal driving style, the time taken for a given trip does not increase, but fuel consumption and NOx, CO and PM emissions are clearly reduced. Therefore, the main conclusion of this work is that reducing the speed limit in some narrow streets in residential and commercial areas of a city not only increases pedestrian safety, but also contributes to reducing the environmental impact of motor vehicles and to reducing fuel consumption. In addition, there is also a reduction in the greenhouse gas emissions resulting from the combustion of the fuel.


This paper addresses the issue of the practicality of global flow analysis in logic program compilation, in terms of the speed of the analysis, its precision, and the usefulness of the information obtained. To this end, design and implementation aspects are discussed for two practical abstract interpretation-based flow analysis systems: MA3, the MCC And-parallel Analyzer and Annotator, and Ms, an experimental mode inference system developed for SB-Prolog. The paper also provides performance data obtained from these implementations and, as an example of an application, a study of the usefulness of the mode information obtained in reducing run-time checks in independent and-parallelism. Based on the results obtained, it is concluded that the overhead of global flow analysis is not prohibitive, while the results of analysis can be quite precise and useful.
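The kind of information such an analysis derives can be illustrated with a toy fixpoint computation; this is a hypothetical sketch, not the MA3/Ms implementation. Groundness analysis is cast as a least fixpoint over rules of the form "if all body variables are ground, the head variable is ground":

```python
# Hypothetical sketch of groundness analysis as a least-fixpoint iteration:
# each rule (head, body) means "head is ground if every body var is ground".
def groundness_fixpoint(rules, initially_ground):
    ground = set(initially_ground)
    changed = True
    while changed:                       # iterate to the least fixpoint
        changed = False
        for head, body in rules:
            if head not in ground and all(v in ground for v in body):
                ground.add(head)
                changed = True
    return ground

rules = [("Z", {"X", "Y"}), ("W", {"Z"}), ("V", {"W", "U"})]
print(sorted(groundness_fixpoint(rules, {"X", "Y"})))  # ['W', 'X', 'Y', 'Z']
```

A compiler can use such derived mode information to skip run-time groundness/independence checks, which is exactly the application studied in the paper.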