924 results for Input and outputs


Relevance:

80.00%

Publisher:

Abstract:

One of the existing business models is the family business. This paper examines the structure of family businesses, with the purpose of analyzing the input and output processes of goods and management behavior in decision-making, aiming at the professionalization of family business administration. The work is a case study of a construction materials shop; it is applied research by nature and exploratory by purpose, using interviews, observation, and analysis of company data. During the analysis of the company's processes, some shortcomings were noted. Given this fact, a comparison with the theories found was made, and proposals for improvements to be accomplished within the company structure were generated.

Relevance:

80.00%

Publisher:

Abstract:

This paper presents the design of an automatic pet feeder using an Arduino Uno as the control unit. Studies of this controller made it possible to create a device with an interface capable of receiving user input and using it to activate the feeder at the defined hours. Stepper motors, sensors, a keypad, and a display were used to assemble the equipment, all working together for its operation. The project goal was reached, and the prototype developed indicates that the Arduino can be used in various applications that simplify daily tasks.
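The scheduling logic described in the abstract can be sketched as follows. This is a minimal illustration in Python (the prototype itself runs Arduino C++ with stepper motors, sensors, a keypad, and a display); the function name and data layout are assumptions, not taken from the paper:

```python
# Hypothetical sketch of the feeder's scheduling logic (illustrative only;
# the actual prototype is implemented in Arduino C++).

def due_feeding(defined_hours, last_fed_hour, current_hour):
    """Return True if the feeder should be activated at current_hour.

    defined_hours: set of hours of the day entered by the user via the keypad.
    last_fed_hour: hour of the most recent activation, to avoid double feeding.
    """
    return current_hour in defined_hours and current_hour != last_fed_hour

# Example: feedings defined at 07:00 and 19:00
assert due_feeding({7, 19}, last_fed_hour=7, current_hour=19)
assert not due_feeding({7, 19}, last_fed_hour=19, current_hour=19)
```

On real hardware, the same check would run inside the Arduino `loop()`, with the current hour read from a clock source and a stepper motor dispensing the food when the check passes.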

Relevance:

80.00%

Publisher:

Abstract:

The pharmaceutical industry was consolidated in Brazil in the 1930s and has since become increasingly competitive. The implementation of the Toyota Production System, which aims at lean production, has therefore become common among companies in the segment. The main efficiency indicator currently used is Overall Equipment Effectiveness (OEE). Using the fuzzy DEA-BCC model, this paper analyzes the efficiency of the production lines of a pharmaceutical company in the Paraíba Valley, compares the values obtained by the model with those calculated by the OEE, identifies the machines most sensitive to variation in the input data, and develops a ranking of effectiveness among the machinery. The analysis shows that the agreement between the two methods is approximately 57%, and the line considered the most effective by the Toyota Production System is not the same as the one found by this paper.
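The OEE indicator referred to above is conventionally defined as the product of availability, performance, and quality rates; a minimal sketch with illustrative factor values:

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness as the product of its three factors,
    each expressed as a fraction between 0 and 1."""
    return availability * performance * quality

# Example: 90% availability, 85% performance, 98% quality
print(round(oee(0.90, 0.85, 0.98), 4))  # -> 0.7497
```

The DEA-BCC comparison in the paper works on efficiency scores rather than this closed-form product, which is one reason the two rankings can disagree.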

Relevance:

80.00%

Publisher:

Abstract:

Nutrient distributions observed at some depths along the continental shelf from 27°05′S (Brazil) to 39°31′S (Argentina) in winter 2003 and summer 2004, related to salinity, dissolved oxygen (mL L-1), and oxygen saturation (%) data, showed remarkable influences of freshwater discharge over the coastal region and in front of the La Plata estuary. In the southern portion of the study area, different processes were verified: upwelling caused by ocean dynamics typical of shelf-break areas, eddies related to surface dynamics, and regeneration processes confirmed by the increase of nutrients and the decrease of dissolved oxygen and oxygen saturation. High silicate concentrations in surface waters were related to low salinities (minimum of 21.22 in winter and 21.96 in summer), confirming the importance of freshwater inputs in this region, especially in winter. Silicate concentrations ranged from 0.00 to 83.52 µM during winter and from 0.00 to 41.16 µM during summer. Phosphate concentrations worked as a secondary tracer of terrestrial input, varying from 0.00 to 3.30 µM in winter and from 0.03 to 2.26 µM in summer; in shallow waters, however, phosphate indicated the freshwater influence more clearly. The most important information given by nitrate concentrations was the presence of water from SACW upwelling, which represents a new source of nutrients for marine primary production. Nitrate maxima reached 41.96 µM in winter and 33.10 µM in summer. At depths of around 800 m, high nitrate, phosphate, and silicate concentrations were related to Malvinas Current Waters, Subantarctic Shallow Waters, and Antarctic Intermediate Waters (AAIW). Dissolved oxygen varied from 3.41 to 7.06 mL L-1 in winter and from 2.65 to 6.85 mL L-1 in summer. The percentage of dissolved oxygen saturation varied between 48% and 113% in winter and from 46% to 135% in summer.
The most important primary production was verified in the summer, and undersaturation was mainly observed below 50 m depth and at some points near the coast. The anti-correlation between nutrients and dissolved oxygen in regions of evident undersaturation also revealed important potential sites of remineralization processes. The nutrient behaviour illustrates some of the processes that occur over the Southwestern South Atlantic continental shelf and at its land-sea interfaces between Mar del Plata and Itajaí.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to quantify the input and removal of nutrients in sugarcane cropping systems irrigated, or not, with sewage treatment plant effluent (STPE), with and without the addition of phosphogypsum, as well as to evaluate the effects of these cropping systems on the nutritional status of the plants. Treatments without irrigation and with irrigation at 100 and 150% of the crop's water requirement were evaluated. The phosphogypsum treatments were applied in a third-harvest area irrigated with STPE since planting. Evaluations were carried out over two harvests. The treatments did not affect stalk yields. The treatment with STPE and phosphogypsum showed a synergistic effect on the nitrogen and sulfur content of the plants. STPE benefited plant phosphorus nutrition, but did not improve potassium or sulfur nutrition. Iron, zinc, and manganese nutrition was not influenced by the input of these micronutrients through the STPE. The phosphorus and nitrogen supplied by irrigation with STPE should be considered in fertilization recommendations; however, the potassium, sulfur, iron, zinc, and manganese in the effluent are not efficient sources of these nutrients for the plants.

Relevance:

80.00%

Publisher:

Abstract:

This paper presents the development of a mathematical model to optimize the management and operation of the Brazilian hydrothermal system. The system consists of a large set of individual hydropower plants and a set of aggregated thermal plants. The energy generated in the system is carried by an interconnected transmission network, so it can be delivered to centers of consumption throughout the country. The proposed optimization model is capable of handling different types of constraints, such as interbasin water transfers, water supply for various purposes, and environmental requirements. Its overall objective is to produce energy to meet the country's demand at minimum cost. Called HIDROTERM, the model integrates a database with basic hydrological and technical information to run the optimization model and provides an interface to manage the input and output data. The optimization model uses the General Algebraic Modeling System (GAMS) package and can invoke different linear as well as nonlinear programming solvers. It was applied to the Brazilian hydrothermal system, one of the largest in the world, which is divided into four subsystems with 127 active hydropower plants. Preliminary results under different scenarios of inflow, demand, and installed capacity demonstrate the efficiency and utility of the model. From this and other case studies in Brazil, the results indicate that the methodology developed is suitable for different applications, such as operation planning, capacity expansion, operational rule studies, and trade-off analysis among multiple water users. DOI: 10.1061/(ASCE)WR.1943-5452.0000149. (C) 2012 American Society of Civil Engineers.
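The cost-minimization idea behind such a model can be illustrated with a toy merit-order dispatch. The actual HIDROTERM formulation is an LP/NLP solved through GAMS with many more constraints; the plant names, costs, and capacities below are invented:

```python
def dispatch(demand, plants):
    """Greedy merit-order dispatch: fill demand from the cheapest plant first.

    plants: list of (name, marginal_cost, capacity) tuples.
    Returns {name: generation}. With constant marginal costs and a single
    demand constraint, this greedy order coincides with the LP optimum.
    """
    schedule = {}
    remaining = demand
    for name, cost, capacity in sorted(plants, key=lambda p: p[1]):
        generation = min(capacity, remaining)
        schedule[name] = generation
        remaining -= generation
    if remaining > 1e-9:
        raise ValueError("installed capacity cannot meet demand")
    return schedule

# Invented two-plant example: cheap hydro is used first, thermal covers the rest
plants = [("hydro", 1.0, 60.0), ("thermal", 5.0, 80.0)]
print(dispatch(100.0, plants))  # -> {'hydro': 60.0, 'thermal': 40.0}
```

The real problem adds reservoir dynamics, transmission limits, and water-transfer constraints, which is why general LP/NLP solvers are needed rather than a greedy rule.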

Relevance:

80.00%

Publisher:

Abstract:

Model predictive control (MPC) applications in the process industry usually deal with process systems that show time delays (dead times) between the system inputs and outputs. In many industrial applications of MPC, integrating outputs resulting from liquid level control or recycle streams also need to be considered as controlled outputs. Conventional MPC packages can be applied to time-delay systems, but the stability of the closed-loop system will depend on the tuning parameters of the controller and cannot be guaranteed even in the nominal case. In this work, a state-space model based on the analytical step response model is extended to the case of integrating systems with time delays. This model is applied to the development of two versions of a nominally stable MPC, designed for the practical scenario in which one has targets for some of the inputs and/or outputs that may be unreachable, and zone control (or interval tracking) for the remaining outputs. The controller is tested through simulation of a multivariable industrial reactor system. (C) 2012 Elsevier Ltd. All rights reserved.
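The kind of plant the controller is built for can be illustrated with a minimal simulation of an integrating process with input dead time; the gain, delay, and input values are illustrative, not from the paper:

```python
# Discrete-time integrating process with input dead time:
#   y[k+1] = y[k] + gain * u[k - delay], with u[k] = 0 for k < 0.
# This is the qualitative behavior (a ramp that starts only after the
# dead time) that makes naive MPC tuning delicate.

def simulate(gain, delay, u, y0=0.0):
    """Simulate the delayed integrator for the given input sequence u."""
    y = [y0]
    for k in range(len(u)):
        uk = u[k - delay] if k >= delay else 0.0
        y.append(y[-1] + gain * uk)
    return y

# Unit step input applied from k=0; the output stays flat for `delay` samples,
# then ramps without bound (the integrating behavior).
y = simulate(gain=0.5, delay=3, u=[1.0] * 8)
print(y)  # -> [0.0, 0.0, 0.0, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
```

Because the output never settles under a constant input, the paper's state-space extension of the step-response model is what restores a finite prediction structure for the MPC.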

Relevance:

80.00%

Publisher:

Abstract:

The role of the amygdala in the mediation of fear and anxiety has been extensively investigated. However, how the amygdala functions during the organization of the anxiety-like behaviors generated in the elevated plus maze (EPM) is still under investigation. The basolateral (BLA) and central (CeA) nuclei are the main input and output stations of the amygdala. In the present study, we ethopharmacologically analyzed the behavior of rats subjected to the EPM, together with the tissue content of the monoamines dopamine (DA) and serotonin (5-HT) and their metabolites in the nucleus accumbens (NAc), dorsal hippocampus (DH), and dorsal striatum (DS), in animals injected with saline or midazolam (20 and 30 nmol/0.2 µL) into the BLA or CeA. Injections of midazolam into the CeA, but not the BLA, caused clear anxiolytic-like effects in the EPM. These treatments did not cause significant changes in 5-HT or DA content in the NAc, DH, or DS of animals tested in the EPM. The data suggest that the anxiolytic-like effects of midazolam in the EPM rely on GABA-benzodiazepine mechanisms in the CeA, but not the BLA, and do not appear to depend on the 5-HT and DA mechanisms prevalent in limbic structures.

Relevance:

80.00%

Publisher:

Abstract:

In recent years, the energy question has taken on a central role in the global debate in relation to four main factors: the non-renewability of natural resources, the exponential growth of consumption, economic interests, and the protection of our planet's environmental and climatic balance. It is therefore necessary to change the model of energy production and consumption, especially in cities, where energy consumption is most concentrated. For these reasons, recourse to Renewable Energy Sources (RES) is now a necessary, appropriate, and urgent measure in urban planning as well. To improve the overall energy performance of the city system, policies for governing transformations must move beyond a "building-centric" operational logic and encompass, beyond the single building, aggregations of buildings and their relations and interactions in terms of material and energy inputs and outputs. The wholesale replacement of the existing building stock with new hyper-technological buildings is not feasible. How, then, can planning regulations and practice be redefined to generate energy-efficient building fabrics? This research proposes integrating the emerging energy planning of the territory with the more consolidated urban planning rules, to generate "energy saving" urban fabrics that add the energy-environmental performance of the context to that of the individual buildings, in an overall energy balance. After describing and comparing the main RES available today, the study suggests a methodology for a preliminary assessment of the mix of technologies and RES best suited to each site configured as an "energy district".
The results of this process provide the basic elements for preparing the actions needed to integrate energy matters into urban plans through the application of equalization principles in the definition of performance requirements at the settlement scale, which are indispensable for a correct transition to the design of urban "objects" and "systems".

Relevance:

80.00%

Publisher:

Abstract:

This study, carried out in collaboration with Hera, is an analysis of waste management in Bologna. The research was conducted on several levels: a strategic level, which aims to identify new waste collection methods as a function of the characteristics of the city's territory; an analytical level, concerning the improvement of the supporting software applications; and an environmental level, concerning the calculation of atmospheric emissions from waste collection and transport vehicles. First of all, it was necessary to study Bologna and the current state of its waste collection services. It is by combining these components that changes have been made in the waste management sector over the last three years. The following chapters concern the software applications supporting these activities: Siget and Optit. Siget is the service management program currently used for all activities connected with waste collection. It is made up of different modules, but is limited to data management. The experimentation with Optit added to this data management the possibility of displaying the data on maps and associating a routing algorithm with them. The data stored in Siget represented the starting point, the input, and reaching all the collection points was the final objective. The last chapter concerns the study of the environmental impact of these waste collection routes. This analysis, based on empirical evaluation and on an Excel implementation of the CORINAIR formulas, gives a snapshot of the service in 2010. On this aspect, Optit provided its added value by also implementing the emission-calculation formulas in its algorithm.
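The emission calculation mentioned above can be sketched in its simplest (Tier-1-style) form, emissions = distance × emission factor, which is a strong simplification of the CORINAIR methodology actually implemented in the study; the distance and factor values below are invented:

```python
def route_emissions(distance_km, ef_g_per_km):
    """Simplest-tier estimate: emissions (g) = distance (km) * emission
    factor (g/km). The CORINAIR methodology refines this with vehicle
    class, fuel, and average-speed dependencies; this sketch omits them."""
    return distance_km * ef_g_per_km

# Hypothetical collection round: 42 km with an illustrative CO2 factor of 800 g/km
print(route_emissions(42.0, 800.0))  # -> 33600.0 grams of CO2
```

Embedding such a formula in the routing algorithm, as Optit did, lets each candidate route be scored on emissions as well as distance.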

Relevance:

80.00%

Publisher:

Abstract:

In this report a new automated optical test for the next generation of photonic integrated circuits (PICs) is provided through the design and assessment of a test-bed. After a brief analysis of the critical problems of current optical tests, the main test features are defined: automation and flexibility, a relaxed alignment procedure, a speed-up of the entire test, and data reliability. After studying various solutions, the test-bed components are defined as a lens array, a photo-detector array, and a software controller. Each device is studied and calibrated, and the spatial resolution and robustness against interference at the photo-detector array are evaluated. The software is programmed to manage both the PIC input and the photo-detector array output, as well as the data analysis. The test is validated by analysing a state-of-the-art 16-port PIC: the waveguide locations, current versus power, and the time-spatial power distribution are measured, as well as the optical continuity of an entire path of the PIC. Complexity, alignment tolerance, and measurement time are also discussed.

Relevance:

80.00%

Publisher:

Abstract:

This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while fulfilling the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of the language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1). The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2). Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximum use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. One of the central elements of this work is therefore the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora.
To name four major challenges in constructing such a system: a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity-management system; b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment; c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input; and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. The work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars, and learning by unification; the postulation of the Learn-Alpha design rule is then presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha. The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a review of the conclusions, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
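The lexicon-update schema G + L + S → L' can be illustrated with a toy example in which observed analyses contribute verb valency frames that are merged into a new lexicon. The data and frame labels below are invented, and the actual system operates on HPSG structures rather than flat labels:

```python
# Toy instance of G + L + S -> L': the structures S produced by parsing
# yield lexical facts (here, verb valency frames) that are merged into a
# fresh copy of the lexicon L, giving an improved lexicon L'.

def update_lexicon(lexicon, analyses):
    """Merge valency frames observed in analyses into a copy of the lexicon.

    lexicon:  {verb: set of valency frame labels}
    analyses: iterable of (verb, frame) pairs extracted from parses
    """
    new_lex = {verb: set(frames) for verb, frames in lexicon.items()}
    for verb, frame in analyses:
        new_lex.setdefault(verb, set()).add(frame)
    return new_lex

L = {"give": {"NP NP"}}                         # prior lexicon
S = [("give", "NP PP[to]"), ("sleep", "intrans")]  # facts observed in parses
L2 = update_lexicon(L, S)
print(sorted(L2["give"]))  # -> ['NP NP', 'NP PP[to]']
```

A Learn-Alpha-style system goes beyond this additive merge: it must also be able to retract frames that later evidence shows were falsely acquired, which is what distinguishes revision from mere accumulation.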

Relevance:

80.00%

Publisher:

Abstract:

In Germany, oral vitamin K antagonist therapy predominantly uses the active substance phenprocoumon (PPC), and most patients are managed by their general practitioner. A descriptive, non-interventional study examined the current state of care for PPC patients in the outpatient sector. The aim was to evaluate the quality and effectiveness of the existing standard therapy. In view of the introduction of the new oral anticoagulants (NOACs), the investigation of PPC therapy is of particular interest. Following the "throughput model", input and outcome parameters were to be analysed.
In a clinical study, 50 PPC patients treated in the outpatient setting were each observed retrospectively over a period of 3 years. For this purpose, 10 patients per practice were recruited in 5 general practices in Rhineland-Palatinate. A document analysis was performed on the basis of the patient records. Self-medication was recorded with a questionnaire created for this purpose.
In the study population, a median of 3 comorbidities was determined. The median weekly dose was 4.0 tablets of 3 mg PPC. Patients were treated with a median of 15 additional active substances, one of which was taken as self-medication. Over the entire observation period, a median of 57 physician visits per patient took place that were attributable to the phenprocoumon therapy. INR (international normalized ratio) measurements (median 47) were the most frequent reason for the visits, so that a 3-week measurement rhythm was achieved by 97% of the total population. "Stable" INR adjustment was reached after a median of 94 days. The percentage of INR values within the target range (INR (%)) reached international benchmark values, indicating good quality of care. Closer analysis, however, revealed large inter-individual variation.
During "stable" INR adjustment, better results were achieved than over the entire observation period. Three patients (6%) never reached "stable" INR adjustment within the 3 years. Evaluation for the extended target range (target range ± 0.2) yielded better INR (%) results than for the target range itself. The time in the INR target range (TTR (%)) reached 75%, higher than the INR (%) in the target range at 70%. The patient population tended to be under- rather than over-treated (median "under-INR" 18%, "over-INR" 8%). Illnesses and vaccinations were the most important of the numerous factors causing INR shifts to values outside the target range. Patients taking co-medication with high interaction potential achieved worse results in the INR quality indicators than patients without potentially interacting co-medication (Mann-Whitney U test; p=0.003 for TTR (%), p=0.008 for INR (%)). In intervals of "stable" INR adjustment, the difference was notable only for TTR (%) (Mann-Whitney U test; p=0.015). For the extended target range, the differences were not notable for either INR quality indicator. A total of 41 adverse events were observed, 24 of them (59%) in the phase of "stable" INR adjustment (21 minor bleedings, 1 major bleeding, 2 thromboembolic events (TE)). Four minor bleedings each (19%) were judged to be possibly or certainly causally related to the VKA therapy when a time window of 3 days between the INR measurement and the occurrence of the adverse event was examined. One TE was judged certainly causal. Of a total of 5 hospital stays, 3 were caused by bleedings and 2 by TE. Furthermore, within the 3-day window, 4 INR shifts to values outside the target range were judged to be in certain or possible causal relation to an interaction with prescribed co-medication. An INR shift was found after 49% of the observed influenza vaccinations, leading in about 60% of cases to a subtherapeutic INR value. Overall, the clinical outcome was not optimal.
The outcome in terms of health-related quality of life was determined retrospectively-prospectively using the SF-36 questionnaire. Compared with the general population, the patients showed a loss of quality of life on the physical level and a gain on the psychological level. The humanistic outcome thus met or exceeded expectations.
Overall, the results indicated that the quality and effectiveness of anticoagulation therapy with PPC in the outpatient sector require further optimization. Better outcomes can be achieved with intensified care models.
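The TTR (%) indicator reported above is typically computed with a Rosendaal-style linear interpolation between INR measurements; the abstract does not state the exact method used, so this is a generic sketch with invented measurement data:

```python
# Rosendaal-style time-in-therapeutic-range (TTR): the INR is assumed to
# change linearly between consecutive measurements, and the fraction of
# interpolated days inside the target range is reported.

def ttr(days, inrs, low=2.0, high=3.0):
    """Fraction of interpolated days with low <= INR <= high.

    days: measurement days (ascending), inrs: INR values on those days.
    """
    in_range = 0
    total = 0
    for i in range(len(days) - 1):
        span = days[i + 1] - days[i]
        for d in range(span):
            # linearly interpolated INR on day days[i] + d
            inr = inrs[i] + (inrs[i + 1] - inrs[i]) * d / span
            total += 1
            if low <= inr <= high:
                in_range += 1
    return in_range / total

# Invented example: INR rises from 1.5 through 2.5 to 3.5 over 20 days
print(round(ttr([0, 10, 20], [1.5, 2.5, 3.5]), 2))  # -> 0.55
```

Because TTR weights each day rather than each measurement, it can diverge from the simple fraction of in-range measurements (INR (%)), matching the 75% vs. 70% gap reported above.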

Relevance:

80.00%

Publisher:

Abstract:

Tropospheric ozone is known as an important oxidant and as a precursor of highly reactive radicals. It is one of the most important greenhouse gases and, at high concentrations at the Earth's surface, is toxic to all living organisms. Although most tropospheric ozone is produced photochemically, a considerable fraction is of stratospheric origin and is transported into the troposphere along tropopause folds in cyclones. This transport of air masses from the stratosphere into the troposphere (STT) can lead to brief, strong increases of ozone at the ground and can influence the chemistry of the troposphere in the long term. Quantifying this ozone input and identifying the processes responsible is subject to large uncertainties and is a topic of current research.
Because of their coarse resolution, global models cannot capture the details of these STT processes. This work therefore uses the model system MECO(n), which couples the regional atmospheric chemistry and climate model COSMO/MESSy with the global chemistry-climate model ECHAM5/MESSy (EMAC). A unified process parameterization enables consistent, simultaneous simulations at different model resolutions. A submodel newly developed as part of this work allows the initialization of artificial passive tracers as a function of various variables. With a stratospheric tracer released in this way, ozone of stratospheric origin can be distinguished from ozone that was produced photochemically.
In a case study, the exchange processes at a tropopause fold are examined from both the Eulerian and the Lagrangian perspective. The analysis of the STT processes shows that air masses from the stratosphere reach the troposphere through turbulent and diabatic processes at the edge of the tropopause fold and are subsequently transported down to the ground.
In the simulations, these descending stratospheric air masses lead to ozone increases at the ground that can be evaluated against observational data. It is shown that the results of the finer-resolution model instance agree well with the measurements.
A Lagrangian analysis allows mixing time scales for STT processes to be determined. It is shown that air parcels that remain in the troposphere for more than ten hours influence it through the input of their stratospheric tracer properties and are therefore not negligible. A further study sheds light on the effectiveness of mixing at tropopause folds: almost the entire air mass located in the tropopause fold at a given time reaches the troposphere within two days.

Relevance:

80.00%

Publisher:

Abstract:

This study evaluated how applicable European Life Cycle Inventory (LCI) data are to assessing the environmental impacts of the life cycle of Brazilian triple superphosphate (TSP). The LCI data used for the comparison were local Brazilian LCI data, European LCI data in their original version from the ecoinvent database, and a modified version of the European LCI data adapted to better account for the Brazilian situation. We compared the three established datasets at the level of the inventory as well as in terms of their environmental impacts, i.e. at the level of Life Cycle Impact Assessment (LCIA). The analysis showed that the European LCIs (both the original and the modified one) considered a broader spectrum of background processes and environmental flows (inputs and outputs). Nevertheless, TSP production showed similar values in all three cases for the consumption of the main raw materials. The LCIA results obtained for the datasets also showed important differences: the European data in general lead to much higher environmental impacts than the Brazilian data. These differences can basically be explained by the methodological differences underlying the data. The small differences at the LCI level for selected inputs and outputs between the Brazilian and the European LCIs from ecoinvent indicate that the latter can be regarded as applicable for characterizing Brazilian TSP.