936 results for Stirling engines.
Abstract:
NDIR is proposed for monitoring of air pollutants emitted by ship engines. Careful optical filtering overcomes the challenge of optical detection of NO2 in humid exhaust gas, despite spectroscopic overlap with the water vapour band.
Abstract:
The present systematic review was performed to assess consumer purchasing behaviour towards fish and seafood products in the wide context of developed countries. The Web of Science, Scopus, ScienceDirect and Google Scholar engines were used to search the existing literature, and a total of 49 studies were identified for inclusion. These studies investigated consumer purchasing behaviour towards a variety of fish and seafood products, in different countries and by means of different methodological approaches. In particular, the review identifies and discusses the main drivers of and barriers to fish consumption, as well as consumers' preferences regarding the most relevant attributes of fish and seafood products, providing useful insights for both practitioners and policy makers. Finally, the main gaps in the existing literature and possible trajectories for future research are also discussed.
Abstract:
Newborn and Hyatt recently tested the chess-engine CRAFTY on two test sets of 16 positions each. The performance of an engine making greater use of endgame tables (EGTs) is compared with that of CRAFTY. Reference is made to three other articles in which composed positions have been used to test a variety of chess engines. Supporting data here comprises two pgn files, an annotated-pgn file and the original data worksheets used.
Abstract:
This article is concerned with the risks associated with the monopolisation of information that is available from a single source only. Although there is a longstanding consensus that sole-source databases should not receive protection under the EU Database Directive, and there are legislative provisions to ensure that lawful users have access to a database’s contents, Ryanair v PR Aviation challenges this assumption by affirming that the use of non-protected databases can be restricted by contract. Owners of non-protected databases can contractually exclude lawful users from taking the benefit of statutorily permitted uses, because such databases are not covered by the legislation that declares this kind of contract null and void. We argue that this judgment is not consistent with the legislative history and can have a profound impact on the functioning of the digital single market, where new information services, such as meta-search engines or price-comparison websites, base their operation on the systematic extraction and re-utilisation of materials available from online sources. This is an issue that the Commission should address in a forthcoming evaluation of the Database Directive.
Abstract:
This article is concerned with the liability of search engines for algorithmically produced search suggestions, such as through Google’s ‘autocomplete’ function. Liability in this context may arise when automatically generated associations have an offensive or defamatory meaning, or may even induce infringement of intellectual property rights. The increasing number of cases that have been brought before courts all over the world raises questions about the conflict between the fundamental freedoms of speech and access to information, on the one hand, and the personality rights of individuals (under a broader right of informational self-determination), on the other. In the light of the recent judgment of the Court of Justice of the European Union (EU) in Google Spain v AEPD, this article concludes that many requests for removal of suggestions including private individuals’ information will be successful on the basis of EU data protection law, even absent prejudice to the person concerned.
Abstract:
This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. 
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, and this result was in almost exact agreement with the response calculated based on the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing involved and by co-emitted species in the chosen emission basket. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, where unforced variability and sea ice responses cause relatively strong temperature fluctuations that may counteract (and, thus, mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, with the largest warming reduction, 0.44 (0.39–0.49) K, found over the Arctic. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr−1 (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
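The metric-based estimate described above boils down to a simple aggregation: the 20-year temperature response is approximated by summing, over the SLCP species, an absolute GTP-type coefficient times the emission change. A minimal sketch of that bookkeeping follows; all coefficients and emission changes are invented placeholders, not values from the ECLIPSE project.

```python
# Minimal sketch of metric-based temperature-response bookkeeping: sum over
# species of an absolute GTP-type coefficient times the emission change
# (MIT minus CLE). All numbers are placeholders, NOT ECLIPSE values.

agtp20_K_per_Tg = {      # absolute GTP at H = 20 yr, K per Tg emitted (placeholders)
    "CH4": 4e-4,
    "BC": 3e-3,
    "OC": -2e-4,         # co-emitted cooling species (placeholder)
}

delta_emissions_Tg = {   # MIT minus CLE emission change, Tg/yr (placeholders)
    "CH4": -180.0,
    "BC": -5.0,
    "OC": -8.0,
}

delta_T = sum(agtp20_K_per_Tg[s] * delta_emissions_Tg[s] for s in agtp20_K_per_Tg)
print(f"Metric-based 20-year temperature change from the mitigation: {delta_T:+.3f} K")
```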
Abstract:
Emission of fine particles by mobile sources has been a matter of great concern due to its potential risk both to human health and the environment. Although there is no evidence that a single component may be responsible for the adverse health outcomes, it is postulated that the metal particle content is one of the most important factors, mainly in relation to oxidative stress. Data concerning the amount and type of metal particles emitted by automotive vehicles using Brazilian fuels are limited. The aim of this study was to identify inhalable particles (PM10) and their trace metal content emitted by two light-duty vehicles, one fueled with ethanol and the other with gasoline blended with 22% anhydrous ethanol (gasohol); the vehicles were tested on a chassis dynamometer. The elemental composition of the samples was evaluated by the particle-induced X-ray emission technique. The experiment showed that total emission factors ranged from 2.5 to 11.8 mg/km in the gasohol vehicle and from 1.2 to 3.0 mg/km in the ethanol vehicle. The majority of particles emitted were in the fine fraction (PM2.5), in which Al, Si, Ca, and Fe corresponded to 80% of the total weight. PM10 emissions from the ethanol vehicle were about threefold lower than those of the gasohol vehicle. The elevated amount of fine particulate matter is an aggravating factor, considering that these particles, and consequently the associated metals, readily penetrate deep into the respiratory tract, producing damage to the lungs and other tissues.
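The "about threefold lower" statement can be read against the midpoints of the reported emission-factor ranges; the short check below is only an illustrative reading of those ranges, not the authors' own calculation.

```python
# Midpoints of the reported total emission-factor ranges (mg/km); the ratio is
# an illustrative reading of the ranges, not the paper's own calculation.
gasohol_mid = (2.5 + 11.8) / 2   # 7.15 mg/km
ethanol_mid = (1.2 + 3.0) / 2    # 2.10 mg/km
print(round(gasohol_mid / ethanol_mid, 1))  # 3.4, i.e. roughly threefold
```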
Abstract:
The endosperm of seeds of Sesbania virgata (Cav.) Pers. accumulates galactomannan as a cell wall storage polysaccharide. It is hydrolysed by three enzymes, one of them being alpha-galactosidase. A large number of protein bodies is found in the cytoplasm of the endospermic cells, and these are thought to play the major role as a nitrogen reserve in this seed. The present work aimed at understanding how the production of the enzymes that degrade storage compounds is controlled. We performed experiments with the addition of inhibitors of transcription (actinomycin D and alpha-amanitin) and translation (cycloheximide) during and after germination. In order to follow the progress of storage mobilisation, we measured fresh mass, protein content and alpha-galactosidase activity. All the inhibitors tested had little effect on seed germination and seedling development. Actinomycin D and cycloheximide caused a slight inhibition of storage protein degradation and concomitantly led to an elevation of alpha-galactosidase activity. Although alpha-amanitin showed some effect on seedling development at later stages, it produced the same effect as the other inhibitors and did not change the course of galactomannan degradation. Our data suggest that some of the proteases may be synthesised de novo, whereas alpha-galactosidase seems to be present in the endosperm cells, probably as an inactive polypeptide in the protein bodies, and is probably activated by proteolysis when these organelles are disassembled. This evidence suggests a connection between storage protein and carbohydrate mobilisation in seeds of S. virgata, which would ensure a balanced flux of carbon and nitrogen to the developing seedling.
Abstract:
Brumadoite, ideally Cu2Te(6+)O4(OH)4·5H2O, is a new mineral from the Pedra Preta mine, Serra das Eguas, Brumado, Bahia, Brazil. It occurs as microcrystalline aggregates both on and, rarely, pseudomorphous after coarse-grained magnesite, associated with mottramite and quartz. Crystals are platy, subhedral, 1-2 μm in size. Brumadoite is blue (near RHS 114B), has a pale blue streak and a vitreous lustre. It is transparent to translucent and does not fluoresce. The empirical formula is (Cu2.90Pb0.04Ca0.01)Σ2.95(Te(6+)0.93Si0.05)Σ0.98O3.92(OH)3.84·5.24H2O. Infrared spectra clearly show both (OH) and H2O. Microchemical spot tests using a KI solution show that brumadoite has tellurium in the 6+ state. The mineral is monoclinic, P2₁/m or P2₁. Unit-cell parameters refined from X-ray powder data are a 8.629(2) Å, b 5.805(2) Å, c 7.654(2) Å, β 103.17(2)°, V 373.3(2) Å³, Z = 2. The eight strongest X-ray powder-diffraction lines [d in Å, (I), (hkl)] are: 8.432, (100), (100); 3.162, (66), (-202); 2.385, (27), (220); 2.291, (22), (-112); 1.916, (11), (312); 1.666, (14), (-422, 114); 1.452, (10), (323, 040); 1.450, (10), (422, 403). The name is for the type locality, Brumado, Bahia, Brazil. The new mineral species has been approved by the CNMNC (IMA 2008-028).
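The refined cell parameters and the reported cell volume are mutually consistent: for a monoclinic cell, V = a·b·c·sin β, which the short check below reproduces from the values quoted above.

```python
# Consistency check of the reported monoclinic cell volume, V = a*b*c*sin(beta).
import math

a, b, c = 8.629, 5.805, 7.654      # cell edges in angstroms
beta = math.radians(103.17)        # monoclinic angle
V = a * b * c * math.sin(beta)
print(f"V = {V:.1f} Å³")           # ~373.3, matching the reported value
```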
Abstract:
The diffusion of concentrating solar power (CSP) systems is currently taking place at a much slower pace than that of photovoltaic (PV) power systems. This is mainly because of the higher present cost of solar thermal power plants, but also because of the time needed to build them. Although the economic attractiveness of the different concentrating technologies varies, PV power still dominates the market. The price of CSP is expected to drop significantly in the near future, and widespread installation will follow. The main aim of this project is the creation of different relevant case studies on solar thermal power generation and a comparison between them. The purpose of this detailed comparison is the techno-economic appraisal of a number of CSP systems and the understanding of their behaviour under various boundary conditions. The CSP technologies that will be examined are the Parabolic Trough, the Molten Salt Power Tower, the Linear Fresnel Mirrors and the Dish Stirling. These systems will be appropriately sized and simulated. All of the simulations aim at the optimization of the particular system, which involves two main objectives: achieving the lowest possible levelized cost of electricity and maximizing the annual energy output (kWh). The project also aims to identify the factors that most affect the results and, more specifically, how much each contributes to cost reduction or power generation. Photovoltaic systems will also be simulated under the same boundary conditions to facilitate a comparison between the PV and CSP systems. Last but not least, the system that performs best in each case study will be determined.
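Since the levelized cost of electricity is one of the two optimization targets named above, a minimal sketch of that figure of merit may help; the formula below is the standard annualized-cost form, and the sample inputs are generic assumptions rather than values from the project.

```python
# Minimal sketch of the levelized cost of electricity (LCOE) figure of merit
# targeted by the optimizations; standard annualized-cost form, with sample
# inputs that are generic assumptions rather than project results.
def lcoe(capex, annual_opex, annual_energy_kwh, discount_rate, lifetime_years):
    """LCOE = (capex * capital recovery factor + annual O&M) / annual energy."""
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / (
        (1 + discount_rate) ** lifetime_years - 1
    )
    return (capex * crf + annual_opex) / annual_energy_kwh

# Illustrative, roughly trough-scale inputs (all placeholders):
print(lcoe(capex=4.5e8, annual_opex=8e6, annual_energy_kwh=2.2e8,
           discount_rate=0.07, lifetime_years=25))  # ~0.21 $/kWh
```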
Abstract:
Background: Voice processing in real time is challenging. A drawback of previous work on Hypokinetic Dysarthria (HKD) recognition is the requirement of controlled settings in a laboratory environment. A personal digital assistant (PDA) has been developed for home assessment of PD patients. The PDA offers sound processing capabilities, which allow for developing a module for recognition and quantification of HKD. Objective: To compose an algorithm for assessment of PD speech severity in the home environment based on a review synthesis. Methods: A two-tier review methodology is utilized. The first tier focuses on real-time problems in speech detection. In the second tier, acoustic features that are robust to medication changes in Levodopa-responsive patients are investigated for HKD recognition. Keywords such as 'Hypokinetic Dysarthria' and 'speech recognition in real time' were used in the search engines. IEEE Xplore produced the most useful search hits compared to Google Scholar, ELIN, EBRARY, PubMed and LIBRIS. Results: Vowel and consonant formants are the most relevant acoustic parameters for reflecting PD medication changes. Since the relevant speech segments (consonants and vowels) contain a minority of the speech energy, intelligibility can be improved by amplifying the voice signal using amplitude compression. Pause detection and peak-to-average power ratio calculations for voice segmentation produce rich voice features in real time. Voice segmentation can be enhanced by introducing the zero-crossing rate (ZCR): consonants have a high ZCR, whereas vowels have a low ZCR. The wavelet transform is found promising for voice analysis since it represents non-stationary voice signals over time using scale and translation parameters; in this way, voice intelligibility in the waveforms can be analyzed in each time frame. Conclusions: This review evaluated HKD recognition algorithms to develop a tool for PD speech home assessment using modern mobile technology. An algorithm that tackles real-time constraints in HKD recognition based on the review synthesis is proposed. We suggest that speech features may be further processed using wavelet transforms and used with a neural network for detection and quantification of speech anomalies related to PD. Based on this model, patients' speech can be automatically categorized according to UPDRS speech ratings.
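The two frame-level features the review highlights, zero-crossing rate for consonant/vowel separation and peak-to-average power ratio for pause/voice segmentation, are simple to compute; a minimal sketch follows, with the frame length and sampling rate as illustrative assumptions.

```python
# Minimal sketch of the frame-level features named in the review: the
# zero-crossing rate (high for consonants, low for vowels) and the
# peak-to-average power ratio used for pause/voice segmentation.
# Frame length and sampling rate are illustrative assumptions.
import numpy as np

def zero_crossing_rate(frame: np.ndarray) -> float:
    """Fraction of adjacent sample pairs whose signs differ."""
    return float(np.mean(np.signbit(frame[:-1]) != np.signbit(frame[1:])))

def peak_to_average_power(frame: np.ndarray) -> float:
    """Ratio of peak instantaneous power to mean power within the frame."""
    power = frame.astype(float) ** 2
    return float(power.max() / (power.mean() + 1e-12))

def frame_features(signal: np.ndarray, fs: int = 16000, frame_ms: int = 25):
    """Yield (ZCR, peak-to-average power) for consecutive non-overlapping frames."""
    n = int(fs * frame_ms / 1000)
    for start in range(0, len(signal) - n + 1, n):
        frame = signal[start:start + n]
        yield zero_crossing_rate(frame), peak_to_average_power(frame)
```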
Abstract:
Each year, search engines like Google, Bing and Yahoo complete trillions of search queries online. Students are especially dependent on these search tools because of their popularity, convenience and accessibility. However, what students are unaware of, by choice or naiveté, is the amount of personal information that is collected during each search session, how that data is used and who is interested in their online behavior profile. Privacy policies are frequently updated in favor of the search companies, but they are lengthy and often skimmed briefly or ignored entirely, with little thought about how personal web habits are being exploited for analytics and marketing. As an Information Literacy instructor and a member of the Electronic Frontier Foundation, I believe in the importance of educating college students, and web users in general, that they have a right to privacy online. Class discussions on the topic of web privacy have yielded an interesting perspective on internet search usage. Students are unaware of how their online behavior is recorded and have consistently expressed their hesitancy to use tools that disguise or delete their IP address because of the stigma that it may imply they have something to hide or are engaging in illegal activity. Additionally, students fear they will have to surrender the convenience of uber-connectivity in their applications to maintain their privacy. The purpose of this lightning presentation is to provide educators with a lesson plan highlighting and simplifying the privacy terms of the three major search engines: Google, Bing and Yahoo. The presentation focuses on what data these search engines collect about users, how that data is used, and alternative search solutions, like DuckDuckGo, for increased privacy. Students will directly benefit from this lesson because informed internet users can protect their data, feel safer online and become more effective web searchers.
Abstract:
The theoretical rationale of behavioural finance rests on two main pillars: limits to arbitrage and investor irrationality. Among the known deviations from rationality, one was of particular interest for this study: the availability bias. This bias arises in situations where people estimate the frequency of a class or the probability of an event by the ease with which instances or occurrences can be recalled. The advent of the internet made it possible to verify the availability bias on a large scale through the analysis of search data; that is, if a given stock is searched for more than others, we can infer that it is more available in the collective memory of investors. On the other hand, the behavioural finance literature has a more pragmatic branch, which studies strategies capable of delivering abnormal returns, above what is expected under the efficient market hypothesis. For the purposes of this study, the momentum effect stands out, whereby the group of stocks with the best performance over the last J months tends to perform better over the next K months. The purpose of this study was to verify the possibility of obtaining returns above those identified by the momentum effect by segmenting the portfolios with the highest and lowest availability bias. The results obtained were positive and statistically significant in the selected sample. The combined momentum-and-availability strategy produced, for J=6 and K=6, average monthly returns of 2.82% with a t-statistic of 3.14, whereas the momentum-only strategy, for the same formation and holding periods, generated average monthly returns of only 1.40% with a t-statistic of 1.22.
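The portfolio construction described above (J-month formation, K-month holding, conditioned on an availability proxy such as search volume) can be sketched roughly as follows; the breakpoints, the proxy and the equal weighting are assumptions for illustration, not the study's own code.

```python
# Rough sketch of a J/K momentum construction combined with an availability
# proxy (e.g. search volume). Breakpoints, proxy and equal weighting are
# illustrative assumptions, not the study's actual methodology.
import pandas as pd

def momentum_availability(returns: pd.DataFrame, availability: pd.DataFrame,
                          J: int = 6, K: int = 6) -> pd.Series:
    """returns, availability: month x stock DataFrames on the same index.
    Returns the K-month long-short holding-period return per formation month."""
    formation = (1 + returns).rolling(J).apply(lambda w: w.prod(), raw=True) - 1
    out = {}
    for t in range(J - 1, len(returns) - K):
        past = formation.iloc[t].dropna()
        avail = availability.iloc[t]
        winners = past[past >= past.quantile(0.9)].index
        losers = past[past <= past.quantile(0.1)].index
        # keep the high-availability winners and the low-availability losers
        winners = [s for s in winners if avail.get(s, 0) >= avail.quantile(0.7)]
        losers = [s for s in losers if avail.get(s, 0) <= avail.quantile(0.3)]
        hold = (1 + returns.iloc[t + 1:t + 1 + K]).prod() - 1  # per-stock K-month return
        if winners and losers:
            out[returns.index[t]] = hold[winners].mean() - hold[losers].mean()
    return pd.Series(out)
```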
Abstract:
With the rapid growth of the Internet, the amount of available information has become enormous, making it difficult to find what is really essential. Just as many efforts have been made to disseminate relevant information, sites of dubious quality have also appeared. Search sites, although still limited, have come to play an increasingly important role, and some of them are now leaders in number of visits. The evaluation of the quality of websites is today a topic of study for many researchers around the world. Several evaluation models exist, with focuses alternating between the assessment of technological aspects and the measurement of the quality perceived by the user or consumer. In this research, we chose to analyse the applicability of the "WebQual" instrument, created by professors Stuart Barnes (Victoria University of Wellington, New Zealand) and Richard Vidgen (University of Bath, England), which favours evaluating the quality of the surveyed sites from the perspective of the user or consumer. The WebQual approach is applied to three international search sites that have localized Portuguese versions in Brazil: Altavista, Yahoo! and Google. This is a novel use of the WebQual approach, previously tested by its creators in the evaluation of sites of online bookshops, universities, online auctions and WAP sites.