961 results for Combustion process


Relevance:

100.00%

Publisher:

Abstract:

Fuel cells are electrochemical energy conversion devices that convert fuel and oxidant electrochemically into electrical energy, water and heat. Compared to traditional electricity generation technologies, which use combustion processes to convert fuel into heat and then into mechanical energy, fuel cells convert the chemical energy of hydrogen and oxygen into electrical energy without intermediate conversion steps and with higher efficiency. For fuel cells to become an achievable and useful technology, it is first necessary to develop an economical and efficient way to produce hydrogen. Molecular hydrogen is always found combined with other chemical compounds in nature, so it must be isolated. In this paper, the technical, economic and ecological aspects of hydrogen production by biogas steam reforming are presented. An economic feasibility calculation was performed to evaluate the attractiveness of the process by analyzing the investment, operation and maintenance costs of the biogas steam reformer; the hydrogen production cost reached 0.27 US$/kWh, with a payback period of 8 years. An ecological efficiency of 94.95%, which is a good value, was obtained. These analyses showed that this type of hydrogen production is an environmentally attractive route. © 2013 Elsevier Ltd.
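
To make the economics concrete, a simple (undiscounted) payback calculation divides the investment by the annual net cash flow. A minimal Python sketch follows; every input figure except the paper's 0.27 US$/kWh production cost is a made-up assumption, not data from the study:

```python
# Hypothetical payback-period sketch for a hydrogen production plant.
# All input figures are illustrative assumptions, not data from the study.

def payback_period(investment: float, annual_revenue: float,
                   annual_om_cost: float) -> float:
    """Simple (undiscounted) payback: investment / annual net cash flow."""
    net_cash_flow = annual_revenue - annual_om_cost
    if net_cash_flow <= 0:
        raise ValueError("project never pays back")
    return investment / net_cash_flow

# Made-up example: a reformer producing 500 MWh of H2 per year, valued at
# the paper's production cost of 0.27 US$/kWh.
energy_kwh = 500_000            # assumed annual H2 energy output (kWh)
revenue = energy_kwh * 0.27     # US$/year at 0.27 US$/kWh
print(payback_period(investment=216_000, annual_revenue=revenue,
                     annual_om_cost=108_000))  # -> 8.0 years
```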

Relevance:

80.00%

Publisher:

Abstract:

Protein aggregation has become a widely accepted marker of many polyQ disorders, including Machado-Joseph disease (MJD), and is often used as a readout for disease progression and the development of therapeutic strategies. The lack of good platforms to rapidly quantify protein aggregates in a wide range of disease animal models prompted us to generate a novel image processing application that automatically identifies and quantifies the aggregates in a standardized and operator-independent manner. We propose here a novel image processing tool to quantify the protein aggregates in a Caenorhabditis elegans (C. elegans) model of MJD. Confocal microscopy images were obtained from animals of different genetic conditions. The image processing application was developed using MeVisLab as a platform to process, analyse and visualize the images obtained from those animals. All segmentation algorithms were based on pixel intensity levels. The quantification of the area or number of aggregates per total body area, as well as the number of aggregates per animal, were shown to be reliable and reproducible measures of protein aggregation in C. elegans. The results obtained were consistent with the levels of aggregation observed in the images. In conclusion, this novel image processing application allows the non-biased, reliable and high-throughput quantification of protein aggregates in a C. elegans model of MJD, which may contribute to a significantly improved assessment of treatment effectiveness for this group of disorders.
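
As a rough illustration of intensity-based segmentation of aggregates, here is a hedged Python/scipy sketch; the study's pipeline is built in MeVisLab, so this stand-in and its threshold value are assumptions for illustration only:

```python
# Hedged sketch: count bright aggregates in a confocal image by intensity
# thresholding, as a stand-in for the MeVisLab pipeline described above.
import numpy as np
from scipy import ndimage

def quantify_aggregates(image: np.ndarray, body_mask: np.ndarray,
                        threshold: float = 0.6):
    """Return (number of aggregates, aggregate area / total body area)."""
    # Normalize intensities to [0, 1]; keep only pixels inside the animal.
    span = image.max() - image.min()
    img = (image - image.min()) / (span if span > 0 else 1.0)
    aggregates = (img > threshold) & body_mask.astype(bool)
    # Label connected components; each label is one putative aggregate.
    _, n_aggregates = ndimage.label(aggregates)
    area_fraction = aggregates.sum() / max(int(body_mask.sum()), 1)
    return n_aggregates, area_fraction
```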

Relevance:

80.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then to decompose a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions; nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to ensure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets; in any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than the volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select the spectral vectors that best represent the smaller convex cone containing the data. The remaining pixels are rejected when their spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis of a lower-dimensional subspace using a modified Gram-Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex, and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of this projection. The algorithm iterates until the number of endmembers is exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Sections 19.3 and 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
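
A hedged NumPy sketch of the projection-and-extreme step just described; this is a simplified reading of the VCA idea, not the authors' reference implementation, and it omits the signal-subspace estimation and dimensionality reduction:

```python
# Hedged sketch of VCA-style endmember extraction: repeatedly project the
# data onto a direction orthogonal to the span of the endmembers found so
# far, and take the most extreme pixel as the next endmember.
import numpy as np

def extract_endmembers(R: np.ndarray, p: int, seed: int = 0) -> np.ndarray:
    """R: (bands, pixels) data matrix; p: number of endmembers to find."""
    rng = np.random.default_rng(seed)
    bands, _ = R.shape
    E = np.zeros((bands, p))
    for i in range(p):
        if i == 0:
            f = rng.standard_normal(bands)
        else:
            # Orthogonal projector onto the complement of span(E[:, :i]).
            A = E[:, :i]
            P = np.eye(bands) - A @ np.linalg.pinv(A)
            f = P @ rng.standard_normal(bands)
        f /= np.linalg.norm(f)
        # The pixel with the most extreme projection is the new endmember.
        proj = f @ R
        E[:, i] = R[:, np.argmax(np.abs(proj))]
    return E
```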

Relevance:

80.00%

Publisher:

Abstract:

Diffusion Kurtosis Imaging (DKI) is a fairly new magnetic resonance imaging (MRI) technique that tackles the non-Gaussian motion of water in biological tissues by taking into account the restrictions imposed by tissue microstructure, which are not considered in Diffusion Tensor Imaging (DTI), where water diffusion is assumed to be purely Gaussian. As a result, DKI provides more accurate information on biological structures and is able to detect important abnormalities that are not visible in standard DTI analysis. This work concerns the development of a tool for DKI computation to be implemented as an OsiriX plugin. Since OsiriX runs under Mac OS X, the program is written in Objective-C and also makes use of Apple's Cocoa framework. The whole program is developed in the Xcode integrated development environment (IDE). The plugin implements a fast heuristic constrained linear least squares algorithm (CLLS-H) for estimating the diffusion and kurtosis tensors, and offers the user the possibility to choose which maps are to be generated, covering not only standard DTI quantities such as Mean Diffusion (MD), Radial Diffusion (RD), Axial Diffusion (AD) and Fractional Anisotropy (FA), but also the DKI metrics Mean Kurtosis (MK), Radial Kurtosis (RK) and Axial Kurtosis (AK). The plugin was subjected to both a qualitative and a semi-quantitative analysis, which yielded convincing results. A more accurate validation process is still being developed, after which, with a few minor adjustments, the plugin shall become a valid option for DKI computation.
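
For context, the standard DTI scalar maps named above are simple functions of the diffusion tensor's eigenvalues. The Python sketch below only restates those standard definitions; the plugin itself is written in Objective-C, so this is illustrative, not the plugin's code:

```python
# Illustrative computation of standard DTI scalar metrics from the three
# eigenvalues of a single voxel's diffusion tensor.
import numpy as np

def dti_metrics(D: np.ndarray) -> dict:
    """D: 3x3 symmetric diffusion tensor for one voxel."""
    ev = np.linalg.eigvalsh(D)[::-1]      # eigenvalues, l1 >= l2 >= l3
    md = ev.mean()                        # Mean Diffusion (MD)
    ad = ev[0]                            # Axial Diffusion (AD)
    rd = (ev[1] + ev[2]) / 2.0            # Radial Diffusion (RD)
    # Fractional Anisotropy (FA): normalized dispersion of the eigenvalues.
    fa = np.sqrt(1.5 * np.sum((ev - md) ** 2) / np.sum(ev ** 2))
    return {"MD": md, "AD": ad, "RD": rd, "FA": fa}
```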

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this study is to examine how the risk management perspective has been taken into account in the client acceptance processes of large audit firms operating in Finland. Client acceptance is the first phase of the audit process. Thematic interviews were used as the data collection strategy, and three auditors working in different roles with respect to client acceptance were interviewed. The literature review covers literature on both the audit process and risk management. The study found that the audit firms examined have continuously developed and formalized their client acceptance processes with the risk management perspective in mind. Automated information systems have taken on an increasingly significant role, and the decision-making path related to client acceptance has also been reconsidered and reformed. The study also reinforced the view that the concept of engagement risk presented in previous research describes the risk assessments made by Finnish auditors in the client acceptance phase as well. Risk factors affecting the audit firm's reputation, as well as matters related to the client's key personnel, were considered the most significant. Auditors were found to consider both the negative sides of risk and the positive opportunities associated with it. Properly identifying and assessing client-related risks was felt to be more important than actual active risk management measures. Regarding the role of client acceptance in risk management, it was regarded as a so-called first line of defense, which can be used to guard especially against risks that cannot be influenced by audit procedures at a later stage. In such cases, clients with excessively high risk are simply not accepted, rather than attempting to design risk-mitigating measures that would allow the client to be accepted.

Relevance:

80.00%

Publisher:

Abstract:

Product approval in the construction sector changes when the EU Construction Products Regulation enters into force on 1 July 2013, making CE marking mandatory for a large share of construction products. The change requires action in more than 4,000 Finnish companies, which must obtain CE marking for their products. The Construction Products Regulation affects all actors in the sector. This master's thesis was carried out as action research; based on the findings obtained through thematic interviews, a tool for managing product approval, the YIT product approval cards, was developed. The cards were drawn up on the basis of product approval theory to serve the whole of YIT Rakennus Oy and were tailored to the company's needs by applying the results of the action research. The YIT product approval card file aims to develop the company's working practices by involving the organization's personnel and assigning them responsibility. The effects of product approval on the different phases of a construction project were studied through observational discussion and by analyzing the results of the thematic interviews. The YIT product approval card file as a management tool was tested mainly through interviews, since the sample cards did not yet allow broader testing in practice. As results of the work, the drafting process of the YIT product approval cards is presented and the content of the cards is examined in more detail.

Relevance:

80.00%

Publisher:

Abstract:

As the competitive situation becomes more intense, companies must strive for growth and the ability to renew themselves. Innovation and the development of services take on an important role in this situation. Involving users in the service design process creates additional value and a chance to stand out from the competition. The goal of the target company is to produce high-quality services with the customers' needs in mind. This is why the company wants to develop its services systematically and aims to collect information about the service needs of both its employees and customers, which can be used to develop new and appealing service packages. The aim of the study is to find out what the target company's current service process is like and how it can be developed further. The work is a case study in which the development process of a new service is followed through according to the approach, process and tools of service design. Information is collected through focused interviews and a client survey. Participant observation gave a good overview of how employees of the target company react to the development of services. The study also found that the service path enables the collection of ideas for the practical planning and development of the service process.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this Master's thesis is to study Business Process Re-engineering methods for making sales processes more efficient. The theoretical framework of the study is built around sales management, sales processes, Business Process Management and Business Process Re-engineering. IT systems are also an essential area for the study, and their role is described both in sales processes and in connection with Business Process Re-engineering methods. The thesis reviews previous research material and academic literature in the areas mentioned above. The goal is to find earlier studies on improving the efficiency of sales processes and on the role of BPR in these cases. The effect of sales management on an efficient sales process is also examined, as are the various roles of IT systems in efficient sales processes. The empirical part of the thesis is a qualitative case study in a company in the financial sector. The study is conducted by interviewing sales personnel and managers. In addition, other material related to the company's sales process is analyzed. The results of the case study are mirrored against earlier academic research, with the aim of finding solutions for how BPR methods can be used to make the company's sales process more efficient.

Relevance:

80.00%

Publisher:

Abstract:

The first part of this thesis studied the capacity of amino acids and enzymes to catalyze the hydrolysis and condensation of tetraethoxysilane and phenyltrimethoxysilane. Selected amino acids were shown to accelerate the hydrolysis and condensation of tetraethoxysilane at ambient temperature and pressure and at neutral pH (pH 7 ± 0.02). The nature of the side chain of the amino acid was important in promoting hydrolysis and condensation. Several proteases were shown to have a capacity to hydrolyze tri- and tetra-alkoxysilanes under the same mild reaction conditions. The second part of this thesis employed an immobilized Candida antarctica lipase B (Novozym-435, N435) to produce siloxane-containing polyesters, polyamides, and polyester amides under solvent-free conditions. Enzymatic activity was shown to be temperature dependent, increasing until enzyme denaturation became the dominant process, which typically occurred between 120 and 130 °C. The residual activity of N435 was, on average, greater than 90% when used in the synthesis of disiloxane-containing polyesters, regardless of the polymerization temperature, except at the very highest temperatures, 140-150 °C. A study of the thermal tolerance of N435 determined that, over ten reaction cycles, there was a decrease in the initial rate of polymerization with each consecutive use of the catalyst. No change in the degree of monomer conversion after a 24 hour reaction cycle was found.

Relevance:

80.00%

Publisher:

Abstract:

This thesis addresses the process of valorization of the South Zone (Zona Sul) of the city of Rio de Janeiro through songs that focused on this space and through chronicles, especially those published in the book O Rio de Janeiro em prosa & verso, organized by Manuel Bandeira and Carlos Drummond de Andrade. To account for the complex relations between music and the city, and between literary production and the city, this work was organized into five chapters. The first chapter focuses on the commemorations of the fourth centenary of the city of Rio de Janeiro, held in 1965, that year's carnival dedicated to the occasion, and some publications motivated by the event. The second deals with the occupation of the Rio urban space known as the Zona Sul, especially through the chronicles contained in the book cited above. The following chapter highlights the importance of nightclubs, bars, casinos and other spaces of cultural sociability in Rio's Zona Sul as places of artistic formation and experience, where important networks of solidarity and affiliation were woven. The fourth chapter presents some songs that gave priority to the city of Rio, the literary-musical representations of the Zona Sul, especially Copacabana as one of the Rio neighborhoods most favored by this repertoire. Finally, Bossa Nova is shown as an indelible moment in the debate over Rio's urban identity, its international projection, and how Rio was presented in song lyrics.

Relevance:

80.00%

Publisher:

Abstract:

This paper analyzes the distribution of money holdings in a commodity money search-based model with intermediation. Introducing heterogeneity of costs into the Kiyotaki and Wright (1989) model, Cavalcanti and Puzzello (2010) give rise to a non-degenerate distribution of money. We extend this model further by introducing intermediation in the trading process. We show that the distribution of money matters for savings decisions. This gives rise to a fixed point problem for the saving function that makes it difficult to find the optimal solution. Through some examples, we show that this friction shrinks the distribution of money. In contrast to the Cavalcanti and Puzzello (2010) model, the optimal solution may not assign the entire surplus to the consumer. At the end of the paper, we present a strong result: for a sufficiently large number of intermediaries, the distribution of money is degenerate.
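
Fixed point problems of this kind can in principle be attacked by successive approximation. A generic Python sketch follows; the operator T below is a placeholder contraction on a grid of money holdings, not the paper's actual saving functional:

```python
# Generic successive-approximation sketch for a fixed point problem
# s = T(s). T below is a placeholder contraction, not the paper's operator.
import numpy as np

def solve_fixed_point(T, s0: np.ndarray, tol: float = 1e-10,
                      max_iter: int = 10_000) -> np.ndarray:
    s = s0
    for _ in range(max_iter):
        s_new = T(s)
        if np.max(np.abs(s_new - s)) < tol:
            return s_new
        s = s_new
    raise RuntimeError("fixed point iteration did not converge")

# Toy example: T is a contraction, so iteration converges to its fixed
# point, which here is 0.5 * grid.
grid = np.linspace(0.0, 1.0, 11)
s_star = solve_fixed_point(lambda s: 0.5 * s + 0.25 * grid, np.zeros(11))
```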

Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Computer Science - IBILCE

Relevance:

80.00%

Publisher:

Abstract:

This thesis deals with legal questions surrounding rating portals on the Internet. Its central topics are the permissibility of publishing the ratings submitted by users, in light of potentially conflicting data protection provisions and the personality rights of those affected. Furthermore, the legal remedies available to those affected are discussed, and in this context the liability risks of forum operators are examined. The thesis covers both online marketplaces such as eBay, on which both the rater and the rated party are registered and have in principle consented to the rating procedure (closed portals), and portals on which ratings can be submitted freely, often under a pseudonym and without prior registration, on anything from product characteristics and services to personal characteristics of the rated party (open portals).

Introduction and Part One: After an introduction and an overview of the problem, the first part briefly presents the various types of rating portals. The thesis distinguishes between so-called closed portals (transaction-based portals such as eBay or Amazon) on the one hand and open portals (product rating portals, hotel rating portals and service rating portals) on the other.

Part Two: In the second part, the thesis examines whether the publication of user-submitted ratings on open portals is permitted at all, or whether the personality rights of those affected, and in particular the right to informational self-determination in the form of the data protection provisions, render the free submission of ratings impermissible. In this context, claims of the affected parties for deletion or removal under § 35(2) sentence 2 no. 1 BDSG and §§ 1004 in conjunction with 823(1) BGB (general personality right) are examined in detail. With regard to data protection law, the thesis concludes that the ratings constitute personal data subject to the data protection provisions, and that publication of the ratings is in principle impermissible under the general prohibition subject to permission that governs German data protection law. Against the background of this legal situation, which no longer does justice to the actual circumstances and interests at stake on the Internet, the author then discusses whether the data protection provisions must in these cases be limited by the constitutionally guaranteed freedoms of information. After a detailed discussion of the legal situation, addressing the particularities of the individual portals, the thesis concludes that the permissibility of publishing the ratings depends on a balancing of interests in each individual case. As a general rule, however: if the rated activity or person is in any event accessible to a broad public with respect to the rated characteristic, publication of the data appears unobjectionable. By contrast, a claim for deletion or removal must be affirmed for ratings concerning activities or characteristics of the rated party that bear no relation to them as a public figure.

The thesis then addresses the personality rights of those affected and the resulting claims for removal and injunctive relief under §§ 1004(1), 823(1) BGB, but ultimately denies the applicability of these bases of claim because the special statutory provisions of the Federal Data Protection Act (BDSG) take precedence. Finally, this part briefly addresses the permissibility of rating legal persons, which is affirmed in principle.

Part Three: Insofar as the second part concludes that the publication of ratings is permissible, the third part asks what possibilities the law offers the rated party to take action against negative ratings. Claims under data protection, tort, contract and competition law are examined. One focus of this part is the presentation of current case law on the question of when a rating constitutes an assertion of fact or a value judgment, and the different consequences that follow for the affected party's claim for injunctive relief. Ratings that constitute expressions of opinion enjoy the strong protection of the freedom of expression; here, the limits of permissibility are essentially only abusive criticism and insult. Assertions of fact, by contrast, are subject to stricter standards. In this context, the thesis examines whether contractual relationships between the parties involved (raters, rated parties and portal operators) restrict the freedom of expression, which is affirmed at least for the closed portals.

Part Four: The fourth part of the thesis deals with excessively favorable ratings, namely competition law claims in cases of covert self-ratings. Such self-ratings, which are disseminated on rating portals under the cover of pseudonymity as an advertising tool for image improvement without revealing the true author, are in principle impermissible under competition law.

Part Five: The final part of the thesis addresses the responsibility of portal operators for unlawful ratings. It is first established that the ratings submitted by users constitute third-party content, so that the liability privileges of § 11(1) TDG and § 9(1) MDStV apply, under which forum operators are not responsible for unlawful ratings at least as long as they have no knowledge of them. Since, according to the case law of the Federal Court of Justice (Bundesgerichtshof), this liability privilege does not extend to interferer liability (Störerhaftung), the scope of the monitoring duties imposed on forum operators under interferer liability is discussed. The thesis concludes that in cases where the identity of the author is known to the addressee of the rating, the obligations of the forum operators are limited to removing or blocking the unlawful rating. If the identity of the rater is unknown, the forum operators are liable as joint interferers, and the affected party is entitled to injunctive relief against the forum operators as well.

Relevance:

80.00%

Publisher:

Abstract:

Heavy metals are present in industrial waste. These metals can generate a large environmental impact, contaminating water, soil and plants. The chemical action of heavy metals has attracted environmental interest. In this context, this study aimed to test the performance of electrochemical technologies for removing and quantifying heavy metals. Firstly, the electroanalytical technique of stripping voltammetry with a glassy carbon (GC) electrode was standardized in order to use this method for the quantification of metals during their removal by the electrocoagulation (EC) process. Analytical curves were evaluated to ensure reliable determination and quantification of Cd2+ and Pb2+, separately or in a mixture. Meanwhile, the EC process was developed using an electrochemical cell in continuous flow (EFC) for removing Pb2+ and Cd2+. These experiments were performed using parallel Al plates 10 cm in diameter (≈63.5 cm2). The optimization of conditions for removing Pb2+ and Cd2+, dissolved in 2 L of solution at 151 L h-1, was studied by applying different current values for 30 min. Cd2+ and Pb2+ concentrations were monitored during electrolysis using stripping voltammetry. The results showed that the removal of Pb2+ was effective when the EC process was used, with removals of 98% in 30 min. This behavior depends on the applied current, which implies an increase in power consumption. The results also verified that the stripping voltammetry technique is quite reliable in determining the Pb2+ concentration when compared with measurements obtained by the atomic absorption (AA) method. In view of this, the second objective of this study was to evaluate the removal of Cd2+ and Pb2+ (mixture solution) by EC. Increasing removal efficiency with current was confirmed, with 93% of Cd2+ and 100% of Pb2+ removed after 30 min. The increase in current promotes the oxidation of the sacrificial electrodes and, consequently, an increased amount of coagulant, which influences the removal of heavy metals in solution. Adsorptive voltammetry is a fast, reliable, economical and simple way to determine Cd2+ and Pb2+ during their removal; it is more economical than the methods normally used, which require toxic and expensive reagents. Our results demonstrate the potential use of electroanalytical techniques to monitor the course of environmental interventions. Thus, the two techniques applied in combination can be a reliable way to monitor environmental impacts due to the pollution of aquatic ecosystems by heavy metals.
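
For illustration, the removal efficiencies quoted above are simply relative concentration drops over the course of the electrolysis. A sketch with made-up voltammetry readings follows; only the final 98% figure echoes the abstract:

```python
# Hedged sketch: removal efficiency from concentration readings taken
# during electrolysis (e.g., by stripping voltammetry). Values are made up.

def removal_efficiency(c0: float, ct: float) -> float:
    """Percent of the metal ion removed, from initial and current conc."""
    return 100.0 * (c0 - ct) / c0

# Made-up Pb2+ readings (mg/L) at 0, 10, 20 and 30 min of electrolysis.
readings = [50.0, 18.0, 4.0, 1.0]
for t, ct in zip((0, 10, 20, 30), readings):
    print(f"t = {t:2d} min: {removal_efficiency(readings[0], ct):5.1f}% removed")
```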

Relevance:

80.00%

Publisher:

Abstract:

A type of macro drainage solution widely used in urban areas with a predominance of closed catchments (basins without an outlet) is the implementation of detention and infiltration reservoirs (DIR). This type of solution has the main function of storing surface runoff and promoting soil infiltration and, consequently, aquifer recharge. The practice aims to avoid floods in the low-lying areas of the drainage basin. Catchment waterproofing reduces distributed groundwater recharge in urban areas, as is the case in the city of Natal, RN. However, the advantage of a DIR is to concentrate the runoff and to promote aquifer recharge in an amount that can surpass the distributed natural recharge. In this paper, we propose studying a small urban drainage catchment, named the Experimental Mirassol Watershed (EMW), in Natal, RN, whose outlet is a DIR. The rainfall-runoff transformation processes, water accumulation in the DIR and the process of infiltration and percolation through the soil profile down to the unconfined aquifer were modeled; from observations of rainfall events, water levels in the DIR and measurements of the free aquifer water level, together with the determination of parameter values, it was possible to calibrate and model these combined processes. The mathematical modeling was carried out with two numerical models. We used the rainfall-runoff model developed by RIGHETTO (2014) and, in addition, developed a one-dimensional model to simulate soil infiltration, percolation, soil water redistribution and groundwater, in a system combined with the reservoir water balance. Continuous simulation was run over a period of eighteen months at time intervals of one minute. The drainage basin was discretized into block units and street reaches, and the soil profile into vertical cells 2 cm deep down to a total depth of 30 m. The generated hydrographs were transformed into inlet volumes to the DIR, and the water balance was then carried out at these time intervals, considering infiltration and percolation of water through the soil profile. As a result, we were able to evaluate the water storage process in the DIR, as well as the infiltration of water, its redistribution into the soil and the recharge of the groundwater aquifer, in a continuous temporal simulation. We found that the DIR performs well in storing excess drainage water and contributes to the local aquifer recharge process (Dunas/Barreiras Aquifer).
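
A hedged sketch of the kind of per-time-step reservoir water balance described above; the inflow series, reservoir area and constant-rate infiltration law are placeholder assumptions, not the study's calibrated model:

```python
# Hedged sketch of a minute-by-minute water balance for a detention and
# infiltration reservoir (DIR): storage += inflow - infiltration each step.

def simulate_dir(inflow_m3, area_m2=2000.0, k_infil=1e-5, dt_s=60.0):
    """Return the storage trajectory (m3) for a series of inflow volumes."""
    storage, trajectory = 0.0, []
    for q_in in inflow_m3:                       # inflow volume per step (m3)
        infiltration = k_infil * area_m2 * dt_s  # simple constant-rate law
        storage = max(0.0, storage + q_in - infiltration)
        trajectory.append(storage)
    return trajectory

# Toy storm: 60 minutes of inflow followed by a dry recession period.
inflow = [5.0] * 60 + [0.0] * 240
levels = simulate_dir(inflow)
print(f"peak storage: {max(levels):.1f} m3, final: {levels[-1]:.1f} m3")
```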