914 results for Using Lean tools
Abstract:
Urban public green spaces are highly important in the urban context. They influence the quality of life of populations in multiple ways, providing environmental, social, and economic benefits. To assess the availability of these spaces in the city of Bragança, analyses were carried out using indicators, supported by the ArcGIS 9.3 and QGIS 2.14.0-Essen software packages, which made it possible to evaluate the supply of these spaces across their different typologies and size categories. The following indicators were applied: Percentage of green spaces, Green spaces per capita, Average distance, Green Area to Building Footprint Area index, and Green Area to Covered Area index. Subsequently, surveys were administered to assess the perceptions and attitudes of a sample of the Bragança population, with descriptive and statistical analyses performed in the SPSS 17 software, describing the population's current relationship with green spaces and using non-parametric tests to identify differences between subgroups of the sample, in an analysis centred on two levels: the urban scale and the neighbourhood scale. To assess possible future changes, realistic scenarios were tested: one corresponding to the introduction of green spaces on land owned by the Municipality, and another considering the expansion of the green areas foreseen in the 2010 Bragança Urbanization Plan. The results identified relevant differences in the city's supply of green spaces. Applying the indicators showed that the larger green spaces are concentrated in the central zone of the city, revealing a clear imbalance in the introduction of new spaces in urban expansion zones. The surveys showed that respondents with greater availability of green spaces in their neighbourhood of residence gave more favourable answers regarding accessibility and the visual and landscape appearance of their neighbourhoods. The scenario analysis indicates that, in both cases, the introduction of new green spaces would improve the supply and distribution of green spaces in the city, rebalancing the concentration in the central zone and improving accessibility for the entire population.
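The abstract names the indicators but not their formulas; the first two reduce to simple ratios over mapped areas. A minimal sketch of that arithmetic, with invented areas and population (the study's GIS layers are not reproduced here):

```python
# Hypothetical illustration of two of the indicators named above;
# areas in m^2 and the population count are invented example values.

green_spaces_m2 = [12_000, 45_000, 3_500, 80_000]   # mapped green-space polygons
urban_area_m2 = 2_150_000                            # total urban perimeter area
population = 23_000                                  # resident population

total_green_m2 = sum(green_spaces_m2)

# Percentage of green spaces: share of the urban area covered by green space.
pct_green = 100.0 * total_green_m2 / urban_area_m2

# Green spaces per capita: m^2 of green space per resident.
green_per_capita = total_green_m2 / population

print(f"Green space share: {pct_green:.1f}%")
print(f"Green space per capita: {green_per_capita:.1f} m2/inhabitant")
```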
Abstract:
Land suitability evaluation is used to establish land zonings for agricultural activities. Geographic information systems (GIS) are useful for integrating the different attributes necessary to define apt and non-apt lands. The main objective of the present study was to describe procedures for defining land suitability using GIS tools, soil maps, and soil profile data, emphasizing procedures to define soil attributes. The study area was the Córrego Espraiado watershed, Ribeirão Preto-SP, located on the recharge area of the Guarani Aquifer, with approximately 4,130 ha and a predominance of sugar cane cultivation. The project database was developed using the GIS Idrisi 32. The land suitability evaluation considered the intensive agricultural production system predominant in the watershed, adjusted for the vulnerability of the recharge areas and for the GIS methodology. Numerical terrain models (NTM) were constructed for cation exchange capacity, base saturation, clay content, and silt+clay content using kriging (a geostatistical interpolator), and for aluminum saturation using the inverse square distance method. Boolean operations for handling geographic fields (thematic maps and NTM) to produce information plans are described, and a land suitability map obtained with GIS tools is presented, indicating that 85% of the watershed lands are apt for annual crops.
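The Boolean overlay step described above amounts to intersecting per-criterion masks on a common raster grid. A minimal NumPy sketch, with invented surfaces and illustrative thresholds (not the study's actual suitability limits):

```python
import numpy as np

# Hypothetical rasters on a common grid (values invented); in the study these
# came from kriged surfaces (CEC, base saturation, clay, silt+clay) and an
# inverse-square-distance surface for aluminum saturation.
rng = np.random.default_rng(0)
cec = rng.uniform(2, 15, (100, 100))        # cation exchange capacity, cmolc/dm3
base_sat = rng.uniform(20, 90, (100, 100))  # base saturation, %
clay = rng.uniform(10, 70, (100, 100))      # clay content, %

# Boolean overlay: a cell is "apt" only if every criterion is met.
# Thresholds below are illustrative, not the study's actual limits.
apt = (cec >= 5) & (base_sat >= 50) & (clay >= 15)

print(f"Apt fraction: {100 * apt.mean():.1f}% of the watershed grid")
```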
Abstract:
Human activities are altering greenhouse gas concentrations in the atmosphere and causing global climate change. The impacts of human-induced climate change have become an increasingly important issue in recent years. The objective of this work was to develop a database of climate information for future scenarios using Geographic Information System (GIS) tools. Future scenarios focused on the 2020s, 2050s, and 2080s (scenarios A2 and B2) were obtained from the General Circulation Models (GCM) available at the Data Distribution Centre of the Third Assessment Report (TAR) of the Intergovernmental Panel on Climate Change (IPCC). The TAR comprises six GCMs with different spatial resolutions (ECHAM4: 2.8125×2.8125º, HadCM3: 3.75×2.5º, CGCM2: 3.75×3.75º, CSIROMk2b: 5.625×3.214º, and CCSR/NIES: 5.625×5.625º). The monthly means of the climate variables were obtained by averaging the available models using GIS spatial analysis tools (arithmetic operation). Maps of monthly means of mean temperature, minimum temperature, maximum temperature, rainfall, relative humidity, and solar radiation were produced at a spatial resolution of 0.5° × 0.5° latitude and longitude. Elaborating the maps with GIS tools allowed the spatial distribution of the future climate scenarios to be evaluated. This database is currently being used in Embrapa projects on the impacts of climate change on plant disease.
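The "arithmetic operation" overlay described above is a cell-by-cell mean across the model grids. A minimal sketch, assuming the GCM fields have already been regridded to the common 0.5° lattice (all values invented):

```python
import numpy as np

# Hypothetical monthly mean-temperature grids from several GCMs, already
# regridded to a common 0.5 deg x 0.5 deg global lattice (values invented).
rng = np.random.default_rng(1)
models = [rng.normal(25, 2, (360, 720)) for _ in range(6)]  # 6 GCM fields

# The ensemble map is the cell-by-cell arithmetic mean across models,
# the same "arithmetic operation" overlay the abstract describes in GIS.
ensemble_mean = np.mean(np.stack(models), axis=0)

print(ensemble_mean.shape)  # (360, 720): one value per 0.5-degree cell
```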
Abstract:
Master's dissertation—Universidade de Brasília, Instituto de Ciências Biológicas, Programa de Pós-Graduação em Biologia Microbiana, 2016.
Abstract:
Several modern-day cooling applications require the incorporation of mini/micro-channel shear-driven flow condensers, and several design challenges must be overcome to meet those requirements. The difficulty in developing effective design tools for shear-driven flow condensers is exacerbated by the lack of a bridge between physics-based modeling of condensing flows and the current, popular approach based on semi-empirical heat transfer correlations. One of the primary contributors to this disconnect is that typical heat transfer correlations eliminate the dependence of the heat transfer coefficient on the method of cooling employed on the condenser surface, when that may very well not be justified. This is in direct contrast to direct physics-based modeling approaches, where the thermal boundary conditions have a direct and strong impact on the heat transfer coefficient values. Typical heat transfer correlations instead introduce vapor quality as one of the variables on which the heat transfer coefficient depends. This study shows how, under certain conditions, a heat transfer correlation from direct physics-based modeling can be equivalent to typical engineering heat transfer correlations without making the same a priori assumptions. Another factor that raises doubts about the validity of heat transfer correlations is the opacity associated with the application of flow regime maps for internal condensing flows. It is well known that flow regimes strongly influence heat transfer rates. However, several heat transfer correlations ignore flow regimes entirely and present a single correlation for all flow regimes. This is believed to be inaccurate, since one would expect significant differences in the heat transfer correlations for different flow regimes. Several other studies present a heat transfer correlation for a particular flow regime but ignore the method by which the extent of that flow regime is established. This thesis provides a definitive answer (in the context of stratified/annular flows) to: (i) whether a heat transfer correlation can always be independent of the thermal boundary condition and represented as a function of vapor quality, and (ii) whether a heat transfer correlation can be obtained independently for a flow regime without knowing the flow regime boundary (even if the boundary is represented through a separate and independent correlation). To obtain the results required to answer these questions, this study uses two numerical simulation tools: the approximate but highly efficient Quasi-1D simulation tool and the exact but more expensive 2D Steady simulation tool. Using these tools and approximate values of flow regime transitions, a deeper understanding of the current state of knowledge in flow regime maps and heat transfer correlations for shear-driven internal condensing flows is obtained. The ideas presented here can be extended to other flow regimes of shear-driven flows, and analogous correlations can be obtained for internal condensers in gravity-driven and mixed-driven configurations.
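For reference, an example of the kind of "typical engineering heat transfer correlation" the abstract contrasts with physics-based modeling is Shah's 1979 correlation for in-tube condensation, which expresses the two-phase coefficient purely through vapor quality and reduced pressure, with no explicit dependence on the thermal boundary condition (the thesis's own correlations are not given in the abstract):

```latex
% Shah (1979) correlation for film condensation inside tubes:
% h_TP: two-phase heat transfer coefficient, x: vapor quality,
% p_r: reduced pressure, h_LO: liquid-only coefficient (Dittus-Boelter form).
h_{TP} = h_{LO}\left[(1-x)^{0.8} + \frac{3.8\,x^{0.76}(1-x)^{0.04}}{p_r^{0.38}}\right],
\qquad
h_{LO} = 0.023\,\mathrm{Re}_{LO}^{0.8}\,\mathrm{Pr}_l^{0.4}\,\frac{k_l}{D}
```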
Abstract:
Modern System-on-a-Chip (SoC) systems have grown rapidly, offering increased processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints of energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if the hardware acceleration method is used to accelerate the element that incurs performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed, and the trade-off among these three factors is compared and balanced; different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on the Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system achieves a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the Co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
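The whole-system speedups reported above depend on how much of the runtime the accelerated hotspot accounts for, which is exactly what the profiling step establishes; Amdahl's law bounds the gain. A small sketch with invented profiling numbers:

```python
def amdahl_speedup(hotspot_fraction: float, hotspot_speedup: float) -> float:
    """Overall speedup when only the profiled hotspot is accelerated."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / hotspot_speedup)

# Hypothetical numbers: if profiling shows the hotspot takes 90% of cycles
# and the FPGA accelerator runs it 20x faster, the whole system gains ~6.9x.
print(f"{amdahl_speedup(0.90, 20.0):.1f}x")
```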
Abstract:
Hazardous materials are substances that, if not regulated, can pose a threat to human populations and to environmental health, safety, or property when transported in commerce. About 1.5 million tons of hazardous material shipments are transported by truck in the US annually, with a steady increase of approximately 5% per year. The objective of this study was to develop a routing tool for hazardous material transport that facilitates reduced environmental impacts and fewer transportation difficulties, while still finding paths that remain compelling for shipping carriers in terms of trucking cost. The study started with the identification of inhalation hazard impact zones and explosion protective areas around the locations of hypothetical hazardous material releases, considering different parameters (i.e., chemical characteristics, release quantities, atmospheric conditions, etc.). Results showed that depending on the release quantity, the chemical, and the atmospheric stability (a function of wind speed, meteorology, sky cover, and time and location of accidents, among other factors), the consequences of these incidents can differ. The study was extended by selecting other evaluation criteria for further investigation, because health risk would not be the only concern in route selection. Transportation difficulties (i.e., road blockage and congestion) were incorporated as an important factor due to their indirect impact/cost on the users of transportation networks. Trucking costs were also considered one of the primary criteria in the selection of hazardous material paths; otherwise the suggested routes would not be convincing for the shipping companies. The final criterion was the proximity of public places to the routes. The approach evolved from a simple framework into an efficient GIS-based tool able to investigate the transportation network of any given study area and capable of generating the best routing options for cargos. The suggested tool uses a multi-criteria decision-making method that considers the priorities of the decision makers in choosing the cargo routes. Comparison of the routing options based on each criterion, and of the overall suitability of each path with regard to all the criteria, showed that tools similar to the one proposed by this study can provide decision makers with insights in the area of hazardous material transport. The tool presents the probable consequences of each candidate path in an easily understandable way, as maps and tables, which makes the trade-offs of costs and risks considerably simpler to weigh; in some cases, slightly compromising on trucking cost may drastically decrease the probable health risk and/or traffic difficulties. This will not only reward the community by making cities safer places to live, but can also benefit shipping companies by allowing them to advertise as environmentally friendly carriers.
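As an illustration of the multi-criteria decision-making step, a weighted-sum ranking over candidate routes is sketched below; the criteria follow the abstract, but the weights, scores, and route names are invented, and the study's actual MCDM method may differ:

```python
# Hypothetical weighted-sum scoring of candidate hazmat routes.
# Criterion values are normalized to [0, 1], lower is better; the weights
# stand in for the decision makers' priorities.
routes = {
    "Route A": {"health_risk": 0.8, "congestion": 0.3, "trucking_cost": 0.4, "proximity": 0.7},
    "Route B": {"health_risk": 0.4, "congestion": 0.6, "trucking_cost": 0.5, "proximity": 0.3},
    "Route C": {"health_risk": 0.2, "congestion": 0.7, "trucking_cost": 0.9, "proximity": 0.2},
}
weights = {"health_risk": 0.4, "congestion": 0.2, "trucking_cost": 0.25, "proximity": 0.15}

# The best route minimizes the weighted sum of its criterion values.
scores = {name: sum(weights[c] * v for c, v in crit.items())
          for name, crit in routes.items()}
best = min(scores, key=scores.get)
print(scores, "->", best)
```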
Abstract:
BACKGROUND: Lactococcus garvieae is a bacterial pathogen that affects several animal species in addition to humans. Despite the widespread distribution and emerging clinical significance of L. garvieae in both veterinary and human medicine, there is an almost complete lack of knowledge about the genetic content of this microorganism. In the present study, the genomic content of L. garvieae CECT 4531 was analysed using bioinformatics tools and microarray-based comparative genomic hybridization (CGH) experiments. Lactococcus lactis subsp. lactis IL1403 and Streptococcus pneumoniae TIGR4 were used as reference microorganisms. RESULTS: The combination and integration of in silico analyses and in vitro CGH experiments, performed against the reference microorganisms, allowed the establishment of an inter-species hybridization framework with a detection threshold based on a sequence similarity of ≥ 70%. With this threshold value, 267 genes were identified as having an analogue in L. garvieae, most of which (n = 258) are documented here for the first time in this pathogen. Most of the genes are related to ribosomal, sugar metabolism, or energy conversion systems. Some of the identified genes, such as als and mycA, could be involved in the pathogenesis of L. garvieae infections. CONCLUSIONS: In this study, we identified 267 genes that are potentially present in L. garvieae CECT 4531, some of which could be involved in the pathogenesis of L. garvieae infections. These results provide the first insight into the genome content of L. garvieae.
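The detection rule described above is a simple threshold test: a gene is called an analogue when its cross-species sequence similarity is at least 70%. A minimal sketch (als and mycA are named in the abstract; the other gene names and all percentages are invented):

```python
# Hypothetical best cross-species similarity (%) per candidate gene,
# standing in for the hybridization/in-silico evidence used in the study.
hits = {"als": 83.5, "mycA": 77.1, "rpsA": 91.0, "xyzQ": 52.3}

THRESHOLD = 70.0  # detection threshold from the inter-species framework

# Keep only genes whose similarity clears the threshold.
present = [gene for gene, similarity in hits.items() if similarity >= THRESHOLD]
print(present)  # genes counted among the analogues detected in L. garvieae
```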
Abstract:
Adults of most marine benthic and demersal fish are site-attached, with the dispersal of their larval stages ensuring connectivity among populations. In this study we aimed to infer spatial and temporal variation in the population connectivity and dispersal of a marine fish species, using genetic tools and comparing the results with oceanographic transport. We focused on an intertidal rocky reef fish, the shore clingfish Lepadogaster lepadogaster, along the southwest Iberian Peninsula in 2011 and 2012. We predicted high levels of self-recruitment and distinct populations, due to the species' short pelagic larval duration and because all of its developmental stages have previously been found near adult habitats. Genetic analyses based on microsatellites countered our prediction, and a biophysical dispersal model showed that oceanographic transport was a good explanation for the observed patterns. Adult sub-populations separated by up to 300 km of coastline displayed no genetic differentiation, revealing a single connected population with larvae potentially dispersing over hundreds of kilometres. Despite this, parentage analysis performed on recruits from one focal site within the Marine Park of Arrábida (Portugal) revealed self-recruitment levels of 2.5% and 7.7% in 2011 and 2012, respectively, suggesting that both long- and short-distance dispersal play an important role in the replenishment of these populations. Population differentiation and patterns of dispersal, which were highly variable between years, could be linked to the variability inherent in local oceanographic processes. Overall, our measures of connectivity based on genetic and oceanographic data highlight the relevance of long-distance dispersal in determining the degree of connectivity, even in species with short pelagic larval durations.
Abstract:
The current workplace demands new forms of literacy that go beyond the ability to decode print. These involve not only the competence to operate digital tools, but also the ability to create, represent, and share meaning in different modes and formats; the ability to interact, collaborate, and communicate effectively using digital tools; and the capacity to engage critically with technology in developing one's knowledge, skills, and full participation in civic, economic, and personal matters. This essay examines the application of the ecology of resources (EoR) model to delivering language learning outcomes (in this case, English) through blended classroom environments that use contextually available resources. The author proposes implementing the EoR model in blended learning environments to create authentic and sustainable learning environments for skilling courses. Applying the EoR model to Indian skilling instruction contexts, the article discusses how English language and technology literacy can be delivered using contextually available resources through a blended classroom environment. This would facilitate not only the acquisition of language and digital literacy outcomes but also, to a certain extent, a consequent gain in content literacy, ensuring satisfactory achievement of communication/language literacy and technological literacy as well as active social participation, lifelong learning, and learner autonomy.
Abstract:
With canal management modernization, water savings, and water delivery quality in view, the study presents two automatic canal control approaches of the PI (Proportional and Integral) type: the distant and the local downstream control modes. The two PI controllers are defined, tuned, and tested using a hydraulic unsteady flow simulation model particularly suitable for canal control studies, with the PI control parameters tuned using optimization tools. The simulations are carried out for a Portuguese prototype canal, and the PI controllers are analyzed and compared considering demand-oriented canal operation. The paper presents and analyzes the responses of the two control modes for five different offtake types: gate-controlled weir, gate-controlled orifice, weir with and without adjustable height, and automatic flow-adjustable offtake. The simulation results are compared using water volume performance indicators (considering the demanded, supplied, and effective water volumes) and a time indicator defined as the time during which the demanded discharges are effectively delivered. Regarding water savings, the simulation results for the five offtake types show that the local downstream control gives the best results (no operational water losses) and that the distant downstream control presents worse results in connection with the automatic flow-adjustable offtakes. Considering the water volume and time performance indicators, the best results are obtained for the automatic flow-adjustable offtakes and the worst for the gate-controlled orifices, followed by the weir with adjustable height.
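A discrete PI control law of the kind tuned in the study has a compact form: the gate command is a weighted sum of the water-level error and its running integral. A minimal sketch with invented gains and a crude first-order stand-in for the canal pool (the study itself used a full unsteady flow simulation model):

```python
# Minimal discrete PI controller sketch; gains, setpoint, and the
# first-order "canal pool" response below are all invented.
Kp, Ki, dt = 0.8, 0.15, 1.0   # proportional gain, integral gain, time step (s)
setpoint = 2.0                # target downstream water depth (m)

depth, integral = 1.5, 0.0
for step in range(50):
    error = setpoint - depth
    integral += error * dt
    gate_command = Kp * error + Ki * integral   # PI control law
    # Crude first-order response of the pool to the gate command:
    depth += 0.1 * gate_command * dt

print(f"depth after 50 steps: {depth:.3f} m")  # converges toward the setpoint
```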
Abstract:
Today's software industry faces increasingly complicated challenges in a world where software is almost ubiquitous in our daily lives. Consumers want products that are reliable, innovative, and rich in functionality, yet also affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one of the reasons why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers, while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight methods, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement methods. Third, knowledge dissemination within large companies is improved through methods that place collaboration at the centre. The agile movement emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that the greater part of the daily work consists of creating something new that did not exist before. Every software developer must be an expert in her field and spends a large part of her working day creating solutions to problems she has never solved before. Although this has been a well-known fact for decades, many software projects are still managed as if they were factory production lines. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile working methods have proven to work well in the contexts they were created for, i.e., small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, making it easier to develop and adapt the methods to the specific demands of that context. Second, resistance to change can be more easily overcome by introducing cultural changes gently and by giving the target group direct first-hand contact with the new methods. Relevant product measurement methods can help software development teams improve their working methods.
For teams working with agile and Lean methods, a good set of measurement methods can be decisive for decision-making when prioritizing the backlog of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support concerning refactoring, i.e., the continuous quality improvement of a program's code and design. The decision to refactor can be difficult to make, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way of measuring the design quality of systems developed according to the model-driven paradigm, and we also construct a way of integrating this metric into agile and Lean working methods. An important part of any process improvement initiative is disseminating knowledge about the new software process. This applies regardless of the kind of process being introduced, whether plan-driven or agile. We suggest that methods based on collaboration in the creation and further development of the process are a good way to support knowledge dissemination, and with that suggestion in mind we provide an overview of the process authoring tools on the market.
Abstract:
The main purpose of this paper is to present the architecture of an automated system that allows monitoring and tracking, in real time (online), the possible occurrence of faults and electromagnetic transients observed in primary power distribution networks. Through the interconnection of this automated system to the utility operation center, it will be possible to provide an efficient tool to assist decision-making by the Operation Center. In short, the aim is to provide all the tools necessary to identify, almost instantaneously, the occurrence of faults and transient disturbances in the primary power distribution system, and to determine their respective origin and probable location. The compiled results from the application of this automated system show that the developed techniques provide accurate results, identifying and locating several occurrences of faults observed in the distribution system.
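The abstract does not disclose the detection technique, so the sketch below only illustrates the general idea of online disturbance flagging: compute a per-cycle RMS voltage and flag cycles that deviate sharply from nominal. The waveform, sag depth, and threshold are all simulated/invented:

```python
import math

# Minimal stand-in for online disturbance detection: flag a fault when the
# per-cycle RMS voltage deviates sharply from nominal. Not the paper's method.
f, fs, v_nom = 60.0, 3840.0, 1.0          # grid frequency (Hz), sample rate (Hz), nominal RMS (p.u.)
samples_per_cycle = int(fs / f)           # 64 samples per 60 Hz cycle

def cycle_rms(cycle):
    return math.sqrt(sum(v * v for v in cycle) / len(cycle))

# Simulate 10 cycles of a sine wave, with a voltage sag starting at cycle 6.
wave = [(0.4 if n >= 6 * samples_per_cycle else 1.0) * v_nom * math.sqrt(2)
        * math.sin(2 * math.pi * f * n / fs)
        for n in range(10 * samples_per_cycle)]

for k in range(10):
    cyc = wave[k * samples_per_cycle:(k + 1) * samples_per_cycle]
    if abs(cycle_rms(cyc) - v_nom) > 0.2 * v_nom:   # 20% deviation threshold
        print(f"disturbance flagged at cycle {k}")
```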
Abstract:
This study investigated the organic and inorganic constituents of healthy leaves and of Candidatus Liberibacter asiaticus (CLas)-inoculated leaves of citrus plants. The bacterium CLas is one of the causal agents of citrus greening (or Huanglongbing), and its effect on citrus leaves was investigated using laser-induced breakdown spectroscopy (LIBS) combined with chemometrics. The information obtained from the LIBS spectral profiles, analysed with chemometrics, proved promising for the construction of predictive models to distinguish healthy from infected plants. The major constituents, together with the macro- and microconstituents, were relevant for differentiating the sample conditions. The models were then applied to different inoculation times (from 1 to 8 months) and correctly classified 82-97% of the diseased samples at a 95% significance level. The novelty of this method lies in fingerprinting healthy and diseased plants based on their organic and inorganic contents.
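The LIBS-plus-chemometrics pipeline described above is, schematically, dimensionality reduction followed by a supervised classifier. A minimal stand-in using scikit-learn (an assumed library choice; the paper's spectra, preprocessing, and actual chemometric model are not given in the abstract):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in data: 40 "healthy" and 40 "diseased" spectra with 500
# channels each; the small mean shift mimics a composition change from CLas.
rng = np.random.default_rng(2)
healthy = rng.normal(0.0, 1.0, (40, 500))
diseased = rng.normal(0.3, 1.0, (40, 500))
X = np.vstack([healthy, diseased])
y = np.array([0] * 40 + [1] * 40)

# Compress the spectra, then fit a simple discriminant classifier.
scores = PCA(n_components=10).fit_transform(X)
model = LinearDiscriminantAnalysis().fit(scores, y)
print(f"training accuracy: {model.score(scores, y):.2f}")
```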
Abstract:
The complex design and development of a planar multilayer phased array antenna in microstrip technology can be simplified using two commercially available design tools: 1) Ansoft Ensemble and 2) HP-EEsof Touchstone. In the approach presented here, Touchstone is used to design the RF switches and phase shifters, whose scattering parameters are incorporated into Ensemble simulations using its black-box tool. With this approach, Ensemble can fully analyze the performance of the radiating and beamforming layers of a phased array prior to manufacturing. The strategy is demonstrated in a design example of a 12-element linearly-polarized circular phased array operating at L band, and a comparison between theoretical and experimental results of the array is presented.
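For the radiating layer, the starting point of such an analysis is the array factor of the circular arrangement. A small sketch for a 12-element circular array, assuming a 1.5 GHz L-band frequency and a 0.8-wavelength radius (neither is given in the abstract) and ignoring element patterns and mutual coupling:

```python
import numpy as np

# Illustrative azimuth-plane array factor for a 12-element circular array.
# Frequency and radius are assumptions, not the paper's design values.
c, f = 3e8, 1.5e9              # speed of light (m/s), assumed L-band frequency (Hz)
lam = c / f
k = 2 * np.pi / lam            # free-space wavenumber
N, radius = 12, 0.8 * lam      # element count, assumed array radius
phi_n = 2 * np.pi * np.arange(N) / N      # element positions on the circle
steer = np.deg2rad(30)                    # commanded beam direction

phi = np.deg2rad(np.arange(360))
# Phase each element so its contribution adds coherently toward `steer`.
af = np.array([abs(np.exp(1j * k * radius *
                          (np.cos(p - phi_n) - np.cos(steer - phi_n))).sum())
               for p in phi])
print(f"beam peak at {af.argmax()} deg (max |AF| = {af.max():.1f}, N = {N})")
```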