978 results for multi-purpose trips
The TAMORA algorithm: satellite rainfall estimates over West Africa using multi-spectral SEVIRI data
Abstract:
A multi-spectral rainfall estimation algorithm has been developed for the Sahel region of West Africa with the purpose of producing accumulated rainfall estimates for drought monitoring and food security. Radar data were used to calibrate multi-channel SEVIRI data from MSG, and a probability of rainfall at several different rain rates was established for each combination of SEVIRI radiances. Radar calibrations from both Europe (the SatPrecip algorithm) and Niger (the TAMORA algorithm) were used. Ten-day estimates were accumulated from SatPrecip and TAMORA and compared with kriged gauge data and TAMSAT satellite rainfall estimates over West Africa. SatPrecip was found to produce large overestimates for the region, probably because of its non-local calibration. TAMORA was negatively biased for areas of West Africa with relatively high rainfall, but its skill was comparable to that of TAMSAT for the low-rainfall region climatologically similar to its calibration area around Niamey. These results confirm the importance of local calibration for satellite-derived rainfall estimates. As TAMORA shows no improvement in skill over TAMSAT for dekadal estimates, the extra cloud-microphysical information provided by multi-spectral data may not be useful in determining rainfall accumulations at a ten-day timescale. Work is ongoing to determine whether it shows improved accuracy at shorter timescales.
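The calibration step described here, establishing a probability of rainfall for each combination of satellite radiances from co-located radar data, can be sketched as a conditional frequency table. This is a minimal illustration with invented bin indices and rain rates, not the TAMORA implementation:

```python
# Sketch (not the TAMORA code): estimate P(rain rate > threshold) for each
# combination of discretised multi-channel satellite values, calibrated
# against co-located radar rain rates.
from collections import defaultdict

def calibrate(radiance_bins, radar_rates, threshold=1.0):
    """radiance_bins: one discretised multi-channel observation (tuple of
    channel bin indices) per co-located radar pixel.
    radar_rates: radar rain rates (mm/h) for the same pixels."""
    counts = defaultdict(lambda: [0, 0])  # bin combination -> [rainy, total]
    for bins, rate in zip(radiance_bins, radar_rates):
        counts[bins][1] += 1
        if rate > threshold:
            counts[bins][0] += 1
    return {b: rainy / total for b, (rainy, total) in counts.items()}

# Toy calibration: bin (0, 0) (e.g. cold cloud top) rains 2 times out of 3.
obs = [(0, 0), (0, 0), (0, 0), (1, 1)]
rates = [2.0, 3.0, 0.0, 0.0]
probs = calibrate(obs, rates)
```

In the real algorithm the bins would be built from SEVIRI channel radiances and the table would be applied to new imagery to produce rain-rate probabilities.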
Abstract:
PURPOSE: Multi-species probiotic preparations have been suggested as having a wide spectrum of application, although few studies have compared their efficacy with that of individual component strains at equal concentrations. We therefore tested the ability of 4 single probiotics and 4 probiotic mixtures to inhibit the urinary tract pathogens Escherichia coli NCTC 9001 and Enterococcus faecalis NCTC 00775. METHODS: We used an agar spot test to assess the ability of viable cells to inhibit pathogens, while a broth inhibition assay was used to assess inhibition by cell-free probiotic supernatants in both pH-neutralised and non-neutralised forms. RESULTS: In the agar spot test, all probiotic treatments showed inhibition: L. acidophilus was the most inhibitory single strain against E. faecalis, and L. fermentum the most inhibitory against E. coli. A commercially available mixture of 14 strains (Bio-Kult®) was the most effective mixture against E. faecalis, and the 3-lactobacillus mixture the most inhibitory against E. coli. Mixtures were not significantly more inhibitory than single strains. In the broth inhibition assays, all probiotic supernatants inhibited both pathogens when pH was not controlled, with only 2 treatments causing inhibition at neutral pH. CONCLUSIONS: Both viable cells of probiotics and supernatants of probiotic cultures were able to inhibit the growth of two urinary tract pathogens. Probiotic mixtures prevented the growth of urinary tract pathogens but were not significantly more inhibitory than single strains. Probiotics appear to produce metabolites that are inhibitory towards urinary tract pathogens. Probiotics display potential to reduce the incidence of urinary tract infections via inhibition of colonisation.
Abstract:
Purpose – Multinationals have always needed an operating model that works – an effective plan for executing their most important activities at the right levels of their organization, whether globally, regionally or locally. The choices involved in these decisions have never been obvious, since international firms have consistently faced trade‐offs between tailoring approaches for diverse local markets and leveraging their global scale. This paper seeks a more in‐depth understanding of how successful firms manage the global‐local trade‐off in a multipolar world. Design/methodology/approach – This paper utilizes a case study approach based on in‐depth senior executive interviews at several telecommunications companies including Tata Communications. The interviews probed the operating models of the companies we studied, focusing on their approaches to organization structure, management processes, management technologies (including information technology (IT)) and people/talent. Findings – Successful companies balance global‐local trade‐offs by taking a flexible and tailored approach toward their operating‐model decisions. The paper finds that successful companies, including Tata Communications, which is profiled in‐depth, are breaking up the global‐local conundrum into a set of more manageable strategic problems – what the authors call "pressure points" – which they identify by assessing their most important activities and capabilities and determining the global and local challenges associated with them. They then design a different operating model solution for each pressure point, and repeat this process as new strategic developments emerge. By doing so they not only enhance their agility, but they also continually calibrate that crucial balance between global efficiency and local responsiveness.
Originality/value – This paper takes a unique approach to operating model design, finding that an operating model is better viewed as several distinct solutions to specific “pressure points” rather than a single and inflexible model that addresses all challenges equally. Now more than ever, developing the right operating model is at the top of multinational executives' priorities, and an area of increasing concern; the international business arena has changed drastically, requiring thoughtfulness and flexibility instead of standard formulas for operating internationally. Old adages like “think global and act local” no longer provide the universal guidance they once seemed to.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include caches, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
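The final step, interpolating between measured benchmark results to predict performance for an unmeasured problem size, can be sketched as follows. The benchmark numbers are hypothetical, and this is only the interpolation idea, not the thesis model:

```python
# Sketch of a benchmark-driven prediction: linearly interpolate runtime at an
# arbitrary local problem size from a small set of measured benchmark points.
def interpolate(points, x):
    """Linearly interpolate y at x from (x, y) benchmark measurements;
    clamp to the endpoints outside the measured range."""
    pts = sorted(points)
    if x <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return pts[-1][1]

# Hypothetical benchmarks: (grid points per core, seconds per timestep)
bench = [(1000, 0.010), (4000, 0.050), (16000, 0.260)]
predicted = interpolate(bench, 8000)  # size between two measured points
```

A full model would combine such per-component predictions (array updates, halo exchanges) for a given decomposition and task-to-core mapping.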
Abstract:
For many years, drainage design was mainly about providing sufficient network capacity. This traditional approach has been successful with the aid of computer software and technical guidance. However, drainage design criteria have been evolving due to rapid population growth, urbanisation, climate change and increasing sustainability awareness. Sustainable drainage systems that bring benefits in addition to water management have been recommended as better alternatives to conventional pipes and storages. Although the concepts and good-practice guidance have been communicated to decision makers and the public for years, network capacity still remains a key design focus in many circumstances, while the additional benefits are generally considered secondary. Yet the picture is changing. The industry is beginning to realise that delivering multiple benefits should be given top priority, while the drainage service itself can be considered a secondary benefit instead. The shift in focus means the industry has to adapt to new design challenges. New guidance and computer software are needed to assist decision makers. For this purpose, we developed a new decision support system. The system consists of two main components: a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. Users can systematically quantify the performance, life-cycle costs and benefits of different drainage systems using the evaluation framework. The optimisation tool can help users determine combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we focus on the optimisation component of the decision support framework. The optimisation problem formulation, parameters and general configuration are discussed. We also examine the sensitivity of individual variables and the benchmark results obtained using common multi-objective optimisation algorithms.
The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
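At the core of any such multi-objective optimisation tool is the notion of Pareto dominance between candidate designs. The sketch below filters a set of candidate drainage designs down to the Pareto-optimal set when all objectives are minimised; the objective names and values are invented, not taken from the EngD project:

```python
# Minimal Pareto-front filter for minimisation problems.
def dominates(a, b):
    """True if design a is at least as good as b on every objective and
    strictly better on at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical (life-cycle cost, flood volume) for four design candidates
candidates = [(100, 5.0), (120, 3.0), (110, 5.5), (150, 3.0)]
front = pareto_front(candidates)
```

Algorithms such as NSGA-II evolve populations towards this front rather than enumerating candidates, but the dominance test is the same.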
Abstract:
This work discusses the implementation of the postponement strategy, a concept that has grown in importance in recent years. Postponement is an operational concept that consists of delaying the final configuration of products until customer orders are received. Despite the theoretical attractiveness and relevance of the topic, little is yet known about its implementation process, especially in the Brazilian business environment. This work investigated in depth the implementation of postponement in five well-regarded companies in Brazil, seeking to identify the reasons that led their executives to adopt this strategy, the facilitating agents and obstacles to implementation, and, finally, the extent to which postponement contributed to increased competitiveness.
Abstract:
This study investigates how large Brazilian retail companies act with regard to Corporate Social Responsibility (CSR), seeking to establish the stage they have reached and whether they take advantage of retail's characteristic geographic capillarity, direct contact with the community, strong bonds with customers, interaction between employees and customers, and physical proximity to non-governmental organisations and public institutions. To this end, management-related concepts were used, such as top-management commitment, the incorporation of CSR values into administration and strategic planning, and the autonomy and management of CSR in stores. Management was evaluated through the lens of Austin's collaboration continuum, and CSR practices through stakeholder theory, based on the dimensions of the Ethos Indicators of Social Responsibility. Kotler and Lee's theory was also employed to classify initiatives according to marketing concepts. An exploratory study was conducted with five large retail companies. The results indicate that in most companies the incorporation of these practices is recent. The companies differ in their CSR stage, and the characteristics specific to retail are not exploited to their full extent. Hypotheses are raised as to how these companies could take advantage of their store infrastructure to meet the needs of the local community.
Abstract:
This paper aims to analyze dual-purpose systems, focusing on total cost optimization; a superstructure is proposed to represent the cogeneration system and desalination technology alternatives for the synthesis process. The superstructure consists of mutually excluding components, gas turbines or conventional steam generators, with excluding alternatives for supplying fuel to each combustion system. A backpressure or a condensing/extraction steam turbine can also be selected for supplying process steam. Finally, one desalination unit, chosen from among electrically-driven or steam-driven reverse osmosis, multi-effect distillation and multi-stage flash, should be included. The analysis performed herein is based on energy and mass conservation equations, as well as the technological limiting equations of the equipment. The results for ten different commercial gas turbines revealed that electrically-driven reverse osmosis was always chosen, together with both natural gas and gasified biomass gas turbines. (C) 2009 Elsevier B.V. All rights reserved.
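A superstructure with mutually excluding alternatives per component slot can be illustrated by brute-force enumeration over a toy cost model. All names and costs below are invented for illustration; the paper's actual synthesis uses conservation equations and equipment constraints rather than fixed costs:

```python
# Toy superstructure synthesis: each slot has mutually exclusive alternatives;
# pick the combination with the lowest total (illustrative) cost.
from itertools import product

prime_movers = [("gas_turbine", 50.0), ("steam_generator", 40.0)]
steam_turbines = [("backpressure", 10.0), ("condensing_extraction", 15.0)]
desalination = [("RO_electric", 20.0), ("RO_steam", 22.0),
                ("multi_effect", 30.0), ("multistage_flash", 35.0)]

# Enumerate every feasible configuration and keep the cheapest.
best = min(product(prime_movers, steam_turbines, desalination),
           key=lambda combo: sum(cost for _, cost in combo))
names = [name for name, _ in best]
total = sum(cost for _, cost in best)
```

Real synthesis problems of this kind are usually posed as MILPs, with binary selection variables and the mass/energy balances as constraints, since enumeration does not scale.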
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The disadvantages generated by the acid etching of dentin, such as an increase in its permeability, in surface moisture and in the potential to denature the external dentinal collagen, the formation of a fragility zone, and the cytotoxicity of the adhesive monomers, all of which are aggravated by the depth of the dentin, have stimulated new and different philosophies for treating dentin. The purpose of the present study was therefore to investigate the effects of three dentin treatments (laser irradiation, acid etching and hypermineralization) on the shear bond strength of the SMP Plus bonding system. Sixty bovine incisors were extracted and randomly selected immediately after the animals' death. They were kept frozen (-18°C) for no longer than 14 days. After the buccal dentinal surface had been exposed, X-rays were taken to control the dentin thickness. The specimens were separated into two groups: (1) Control, kept in distilled water at 4°C; (2) Mineralized, kept in a hypermineralizing solution at 4°C for 14 days. Each group was divided into three sub-groups according to the type of dentin treatment used: group F followed the manufacturer's instructions (acid etching + primer + bond), group AL (acid etching + primer + bond + laser) and group LA (laser + acid etching + primer + bond). A standard composite resin cylinder (Z100-3M) was bonded to the dentinal surface, and the shear bond test was performed on an Instron 4301 universal testing machine with a 500 kg load cell at a crosshead speed of 0.5 mm/min. Analysis of variance (ANOVA) determined that the treatments influenced the shear bond strength values (p<0.05), with the following average shearing loads at failure: AL (9.96 MPa), F (7.28 MPa) and LA (4.87 MPa). The interaction between the two factors analyzed, group (control and mineralized) and treatment (F, AL, LA), also influenced the shear bond strength (p<0.05). The highest values were obtained...
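The one-way ANOVA used for such between-treatment comparisons boils down to an F statistic: the ratio of between-group to within-group variance. A minimal sketch with made-up shear bond strengths (not the study's measurements):

```python
# One-way ANOVA F statistic: between-group mean square over within-group
# mean square; a large F indicates the group means differ.
def f_statistic(groups):
    k = len(groups)                      # number of treatments
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical shear bond strengths (MPa) for three treatment sub-groups
F = f_statistic([[9.5, 10.2, 10.1], [7.0, 7.5, 7.3], [4.6, 5.0, 5.1]])
```

In practice the F value is compared against the F distribution with (k-1, n-k) degrees of freedom to obtain the p-value (libraries such as scipy.stats provide this directly).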
Abstract:
It is well known that the deposition of gaseous pollutants and aerosols plays a major role in the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to the environmental damage of cultural heritage, in the case of cement mortars, commonly used in 20th-century architecture, the deterioration due to the impact of air multi-pollutants, especially the formation of black crusts, is still not well explored, making this a challenging area of research. This work centers on cement mortar-environment interactions, focusing on the diagnosis of damage to the modern built heritage due to air multi-pollutants. For this purpose three sites, exposed in different urban areas in Europe, were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland); Chiesa dell'Autostrada del Sole, Florence (Italy); and Casa Galleria Vichi, Florence (Italy). The sampling sessions were performed taking into account the height from ground level and the degree of protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (i.e. principal component analysis), and an enrichment factor for cement mortars was calculated for the first time. The results obtained from the experimental activity performed on the damage layers indicate that gypsum, formed by the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at Centennial Hall and Casa Galleria Vichi.
By contrast, gypsum was not identified in the samples collected at Chiesa dell'Autostrada del Sole. This is attributed to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results demonstrate a correlation between the location of a building and the composition of its damage layer: Centennial Hall is mainly subject to the impact of pollutants emitted by the nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
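The enrichment factor mentioned above is commonly defined as the ratio of an element's concentration, normalised to a conservative reference element, in the sample to the same ratio in the unweathered background material. A minimal sketch with made-up concentrations (the reference element choice and numbers are illustrative, not the study's):

```python
# Enrichment factor: EF = (C_x / C_ref)_sample / (C_x / C_ref)_background.
# EF >> 1 suggests the element was deposited from external (e.g. atmospheric)
# sources rather than inherited from the substrate material.
def enrichment_factor(c_sample, c_ref_sample, c_bg, c_ref_bg):
    return (c_sample / c_ref_sample) / (c_bg / c_ref_bg)

# Hypothetical sulphur concentrations in a black crust vs the mortar substrate,
# normalised to a reference element present in both at similar levels.
ef_S = enrichment_factor(c_sample=12.0, c_ref_sample=3.0,
                         c_bg=0.5, c_ref_bg=3.0)
```

Here the strongly enriched sulphur (EF of 24) would point to atmospheric deposition, consistent with gypsum formation in sheltered areas.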
Abstract:
This thesis deals with distributed control strategies for the cooperative control of multi-robot systems. Specifically, distributed coordination strategies are presented for groups of mobile robots. The formation control problem is initially solved by exploiting artificial potential fields. The purpose of the presented formation control algorithm is to drive a group of mobile robots into a completely arbitrarily shaped formation. Robots are initially controlled to create a regular polygon formation. A bijective coordinate transformation is then exploited to extend the scope of this strategy and obtain arbitrarily shaped formations. For this purpose, artificial potential fields are specifically designed, and robots are driven to follow their negative gradient. Artificial potential fields are subsequently exploited to solve the coordinated path tracking problem, making the robots autonomously spread along predefined paths and move along them in a coordinated way. The formation control problem is then solved with a consensus-based approach. Specifically, weighted graphs are used both to define the desired formation and to implement collision avoidance. As expected for consensus-based algorithms, this control strategy is experimentally shown to be robust to the presence of communication delays. The global connectivity maintenance issue is then considered. Specifically, an estimation procedure is introduced to allow each agent to compute its own estimate of the algebraic connectivity of the communication graph in a distributed manner. This estimate is then exploited to develop a gradient-based control strategy that ensures that the communication graph remains connected as the system evolves. The proposed control strategy is developed initially for single-integrator kinematic agents, and is then extended to Lagrangian dynamical systems.
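The discrete-time consensus update underlying such approaches has each agent move toward a weighted average of its neighbours' states, x_i <- x_i + eps * sum_j a_ij (x_j - x_i). A minimal sketch on a toy graph (not the thesis controller; graph, gain and states are invented):

```python
# Discrete-time consensus on scalar states over an undirected graph.
def consensus_step(x, adjacency, eps=0.2):
    """One update: each agent moves toward its neighbours' states.
    Converges to the average for eps < 2 / lambda_max(Laplacian)."""
    return [xi + eps * sum(a * (xj - xi) for a, xj in zip(row, x))
            for xi, row in zip(x, adjacency)]

# Path graph on three agents: 1 -- 2 -- 3
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
x = [0.0, 3.0, 6.0]
for _ in range(200):
    x = consensus_step(x, A)
# states converge to the average of the initial states (3.0)
```

Convergence rate is governed by the algebraic connectivity (the second-smallest Laplacian eigenvalue) of the communication graph, which is why the thesis's distributed estimation of that quantity matters for connectivity maintenance.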
Abstract:
This thesis proposes an integrated, holistic approach to the study of neuromuscular fatigue, in order to encompass all the causes and consequences underlying the phenomenon. Starting from the metabolic processes occurring at the cellular level, the reader is guided towards the physiological changes at the motoneuron and motor unit level, and from there to the more general biomechanical alterations. Chapter 1 reports the various definitions of fatigue spanning several contexts. In Chapter 2, the electrophysiological changes in terms of motor unit behaviour and the descending neural drive to the muscle are studied extensively, as well as the biomechanical adaptations they induce. Chapter 3 reports a study based on the observation of temporal features extracted from sEMG signals, which highlighted the need for a more robust and reliable indicator during fatiguing tasks. Therefore, in Chapter 4, a novel bi-dimensional parameter is proposed. The study on sEMG-based indicators also opened a scenario on the neurophysiological mechanisms underlying fatigue. For this purpose, in Chapter 5, a protocol designed for the analysis of motor-unit-related parameters during prolonged fatiguing contractions is presented. In particular, two methodologies have been applied to multichannel sEMG recordings of isometric contractions of the Tibialis Anterior muscle: the state-of-the-art technique for sEMG decomposition, and a coherence analysis on MU spike trains. The importance of a multi-scale approach is finally highlighted in the context of the evaluation of cycling performance, where fatigue is one of the limiting factors. The last chapter of this thesis can be considered a paradigm: physiological, metabolic, environmental, psychological and biomechanical factors all influence the performance of a cyclist, and only when these are considered together in an integrative way is it possible to derive a clear model and make correct assessments.