Abstract:
Electronic energy transfer (EET) rate constants between a naphthalene donor and an anthracene acceptor in [ZnL4a](ClO4)2 and [ZnL4b](ClO4)2 were determined by time-resolved fluorescence, where L4a and L4b are the trans and cis isomers of 6-((anthracen-9-yl-methyl)amino)-6,13-dimethyl-13-((naphthalen-1-yl-methyl)amino)-1,4,8,11-tetraazacyclotetradecane, respectively. These isomers differ in the relative disposition of the appended chromophores with respect to the macrocyclic plane. The trans isomer has an energy transfer rate constant (k_EET) of 8.7 × 10^8 s^-1, whereas that of the cis isomer is significantly faster (2.3 × 10^9 s^-1). Molecular modeling was used to determine the likely distribution of conformations in CH3CN solution for these complexes in an attempt to identify any distance or orientation dependency that may account for the differing rate constants observed. The calculated conformational distributions, together with analysis by 1H NMR, for the [ZnL4a]2+ trans complex in the common trans-III N-based isomer gave a calculated Förster rate constant close to that observed experimentally. For the [ZnL4b]2+ cis complex, the experimentally determined rate constant may be attributed to a combination of trans-III and trans-I N-based isomeric forms of the complex in solution.
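For context, the distance and orientation dependence that the modeling sets out to capture can be summarised by the standard Förster point-dipole expression (a general textbook relation, not an equation quoted from this abstract), where τ_D is the donor fluorescence lifetime, R the donor–acceptor separation, and R_0 the Förster radius, which in turn depends on the orientation factor κ², the donor quantum yield Φ_D, the refractive index n, and the spectral overlap integral J(λ):

\[
  k_{\mathrm{EET}} = \frac{1}{\tau_{\mathrm{D}}}\left(\frac{R_{0}}{R}\right)^{6},
  \qquad
  R_{0}^{6} \propto \kappa^{2}\,\Phi_{\mathrm{D}}\,n^{-4}\,J(\lambda)
\]

The R^-6 dependence is why a conformational distribution that changes the average donor–acceptor separation or the relative orientation of the chromophores can account for the different rate constants of the two isomers.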
Abstract:
Although postal questionnaires, personal interviewing, and telephone interviewing are the main methods of survey-based research, there is increasing use of e-mail as a data collection medium. However, little, if any, published research in the West in general, and in Turkey in particular, has investigated the e-mail survey technique from a pure survey research perspective. Attempting to develop a framework for assessing e-mail as a data collection means, the purpose of this study is to explore the e-mail-based questionnaire technique from complementary angles. To this end, sample representativeness, data quality, response rates, and the advantages and disadvantages of e-mail surveying are discussed.
Abstract:
The State Reform processes, combined with the emergence and use of Information and Communication Technology (ICT), gave rise to electronic government policies and initiatives in Brazil. This paper examines Brazilian e-government by investigating the institutional design it assumed in the state's public sphere, and how it contributed to outcomes related to e-gov possibilities. The analyses were carried out from an interpretivist perspective, making use of Institutional Theory. From the analyses of interviews with relevant actors in the public sphere, such as state secretaries and presidents of public ICT companies, the conclusions point towards low institutionalization of e-gov policies. The institutional design of Brazilian e-gov limits the use of ICT to provide integrated public services, to broaden participation and transparency, and to improve the management of public policies.
Abstract:
Many organisations make extensive use of electronic linkages to facilitate their trading exchanges with partners such as suppliers, distributors and customers. This research explores how the use of inter-organisational systems (IOS) both affects, and is affected by, the relationships between trading partners. In doing this, it brings together two existing but distinct perspectives and literatures: the rational view informed by IOS research, and the behavioural or relationship perspective embodied in the inter-organisational relationships (IOR) literature. The research was undertaken in the European paper industry by means of six dyadic case studies. The dyads studied covered both traditional electronic data interchange systems and newer e-marketplace environments. A framework that integrates the two perspectives of interest was derived from the existing literature. The framework was used to analyse the case studies and enabled the inter-relationship between IOS use and IOR to be explained.
Abstract:
This paper presents a Multi-Agent Market simulator designed for developing new agent market strategies based on a complete understanding of buyer and seller behaviors, preference models and pricing algorithms, considering user risk preferences and game theory for scenario analysis. This tool studies negotiations based on different market mechanisms and on time- and behavior-dependent strategies. The results of the negotiations between agents are analyzed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies. The system also includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions, and of taking other agents' reactions into account.
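As a rough illustration of the experience-based adaptation described above, the sketch below shows a seller agent that adjusts its asking price from the outcomes of past negotiations; the class, method names and accept/reject market loop are hypothetical simplifications and do not reflect the simulator's actual interface.

# Minimal sketch of an experience-based seller agent; names and the
# toy accept/reject loop are illustrative assumptions, not the simulator's API.
import random


class AdaptiveSeller:
    """Toy seller that learns an asking price from past negotiation outcomes."""

    def __init__(self, initial_price: float, step: float = 0.05):
        self.price = initial_price
        self.step = step          # fractional price adjustment per round
        self.history = []         # (price, accepted) pairs -- the agent's experience

    def propose(self) -> float:
        return self.price

    def observe(self, accepted: bool) -> None:
        # Adapt the asking price to market conditions after each negotiation.
        self.history.append((self.price, accepted))
        if accepted:
            self.price *= 1 + self.step   # the offer sold: probe a higher price
        else:
            self.price *= 1 - self.step   # rejected: concede towards the market


# Toy market loop: a buyer accepts any price below its private valuation.
seller = AdaptiveSeller(initial_price=10.0)
for _ in range(50):
    buyer_valuation = random.uniform(6.0, 12.0)
    seller.observe(accepted=seller.propose() <= buyer_valuation)

print(f"Asking price learned after 50 rounds: {seller.propose():.2f}")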
Abstract:
This paper presents a Multi-Agent Market simulator designed for analyzing agent market strategies based on a complete understanding of buyer and seller behaviors, preference models and pricing algorithms, considering user risk preferences and game theory for scenario analysis. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions, and of taking other agents' reactions into account.
Abstract:
This paper presents an agent-based simulator designed for analyzing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering user risk preferences. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions. In the simulated market, agents interact in several different ways and may join together to form coalitions. In this paper we address multi-agent coalitions to analyse Distributed Generation in Electricity Markets.
Abstract:
With the increasing importance of commerce conducted across the Internet, it is becoming evident that within a few years the Internet will host a large number of interacting software agents. A vast number of them will be economically motivated and will negotiate a variety of goods and services. It is therefore important to consider the economic incentives and behaviours of economic software agents, and to use all available means to anticipate their collective interactions. This paper addresses this concern by presenting a multi-agent market simulator designed for analysing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering risk preferences. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions. The results of the negotiations between agents are analysed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies.
Abstract:
Master's degree in Electrical and Computer Engineering.
Abstract:
Final Master's project for obtaining the degree of Master in Informatics and Computer Engineering.
Abstract:
In studies assessing the effects of a given exposure variable on a specific outcome of interest, confounding may create the mistaken impression that the exposure variable is producing the outcome of interest, when in fact the observed effect is due to an existing confounder. However, quantitative techniques are rarely used to determine the potential influence of unmeasured confounders. Sensitivity analysis is a statistical technique that makes it possible to quantify the impact of an unmeasured confounding variable on the association of interest being assessed. The purpose of this study was to make it feasible to apply two sensitivity analysis methods available in the literature, developed by Rosenbaum and Greenland, using an electronic spreadsheet. This should make it easier for researchers to include this quantitative tool in the set of procedures commonly used in the result-validation stage.
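As a rough sketch of the kind of calculation such a spreadsheet performs, the snippet below applies the standard external-adjustment (bias-factor) formula for a single binary unmeasured confounder on the relative-risk scale; the function and parameter names are illustrative, and this generic formula is not necessarily the exact layout of the Rosenbaum or Greenland worksheets described in the study.

# Generic external adjustment for one binary unmeasured confounder (sketch);
# names and example numbers are illustrative assumptions.
def adjusted_rr(rr_observed: float, rr_cd: float,
                p_exposed: float, p_unexposed: float) -> float:
    """Relative risk adjusted for a single binary unmeasured confounder.

    rr_observed : exposure-outcome relative risk actually estimated
    rr_cd       : assumed confounder-outcome relative risk
    p_exposed   : assumed prevalence of the confounder among the exposed
    p_unexposed : assumed prevalence of the confounder among the unexposed
    """
    bias = (p_exposed * (rr_cd - 1) + 1) / (p_unexposed * (rr_cd - 1) + 1)
    return rr_observed / bias


# How strong would an unmeasured confounder need to be to explain an observed RR of 2.0?
for rr_cd in (1.5, 2.0, 3.0, 5.0):
    print(rr_cd, round(adjusted_rr(2.0, rr_cd, p_exposed=0.6, p_unexposed=0.2), 2))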
Abstract:
Industry has increasingly been undergoing changes in its production processes. Today, more than ever, production facilities must be as efficient as possible, seeking to rationalize energy use while reducing costs. The objective of this dissertation is therefore the energy audit of the rubber sheet plant and the optimization of the painting sector at the company Monteiro Ribas. Energy audits have been widely used to detect energy waste. The optimization explores potential changes and the application of energy-efficiency technologies. The aim is to curb energy consumption without affecting production, since the company is classified as an energy-intensive consumer. Monteiro Ribas consumes natural gas, steam and electricity; steam is the most consumed form of energy, followed by electricity and finally natural gas, in the proportions of 55%, 41% and 4%, respectively. The optimization made it possible to study the influence of several variables on annual energy consumption and thus to put forward improvement proposals. One of the proposals analysed was the possibility of thermally insulating some valves, which would lead to savings of 79,263.4 kWh/year. The installation of electronic ballasts was also proposed, which would reduce electricity consumption by 29,509.92 kWh/year. Among the machines used in the painting sector, the IRK 6 oven was found to be one of the most energy-intensive pieces of equipment. The influence of the circulation speed of the rubber sheets through this machine was therefore analysed, as well as a change in its power achieved by reducing the number of cassettes installed in this oven.
Oxidative leaching of metals from electronic waste with solutions based on quaternary ammonium salts
Abstract:
The treatment of waste electrical and electronic equipment (WEEE) is a problem that receives ever more attention. Inadequate treatment results in harmful products ending up in the environment. This project investigates the possibilities of an alternative route for recycling metals from printed circuit boards (PCBs) obtained from discarded computers. The process is based on aqueous solutions composed of an etchant, either 0.2 M CuCl2.2H2O or 0.2 M FeCl3.6H2O, and a quaternary ammonium salt (quat) such as choline chloride or chlormequat. These solutions are reminiscent of deep eutectic solvents (DES) based on quats. DES are quite similar to ionic liquids (ILs) and are likewise used as alternative solvents with a great diversity of physical properties, making them attractive replacements for hazardous, volatile solvents (e.g. VOCs). A notable difference between genuine DES and ILs and the solutions used in this project is the addition of rather large quantities of water. It is shown that the presence of water has considerable advantages for the leaching of metals, while the properties typical of DES remain. The oxidizing capacity of Cu(II) stems from the existence of a stable Cu(I) species in quat-based DES, and the leaching is thus driven by the activity of the Cu(II)/Cu(I) redox couple. The advantage of Fe(III) in combination with DES is that the Fe(III)/Fe(II) redox couple becomes reversible, which is not the case in pure water. This opens perspectives for regeneration of the etching solution. In this project the leaching of copper was studied as a function of gradually increasing water content from 0 to 100 wt%, with the same concentration of copper chloride or iron(III) chloride, at room temperature and at 80 °C. The solutions were also tested on real PCBs. At room temperature, a maximum leaching effect for copper was obtained with 30 wt% choline chloride and 0.2 M CuCl2.2H2O. The leaching effect is stronger still at 80 °C, but these solutions are of course more energy consuming. For aluminium, tin, zinc and lead, leaching was faster at 80 °C. Iron and nickel dissolved easily at room temperature. The solutions were not able to dissolve gold, silver, rhodium and platinum.
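The role of the two redox couples can be summarised by the generic reactions below (written without the chloro-complex speciation, which the abstract does not detail): in the chloride-rich quat solutions Cu(I) is stabilised, so metallic copper can be oxidised by Cu(II) through comproportionation, while the reversible Fe(III)/Fe(II) couple is what opens the door to regenerating the etchant.

\[
  \mathrm{Cu^{2+} + Cu^{0} \;\longrightarrow\; 2\,Cu^{+}}
  \qquad
  \mathrm{Fe^{3+} + e^{-} \;\rightleftharpoons\; Fe^{2+}}
\]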
Abstract:
The potential of the electrocardiographic (ECG) signal as a biometric trait has been ascertained in the literature over the past decade. The inherent characteristics of the ECG make it an interesting biometric modality, given its universality, intrinsic aliveness detection, continuous availability, and inbuilt hidden nature. These properties enable the development of novel applications, where non-intrusive and continuous authentication are critical factors. Examples include, among others, electronic trading platforms, the gaming industry, and the auto industry, in particular for car sharing programs and fleet management solutions. However, there are still some challenges to overcome in order to make the ECG a widely accepted biometric. In particular, the questions of uniqueness (inter-subject variability) and permanence over time (intra-subject variability) are still largely unanswered. In this paper we focus on the uniqueness question, presenting a preliminary study of our biometric recognition system, testing it on a database encompassing 618 subjects. We also performed tests with subsets of this population. The results reinforce that the ECG is a viable trait for biometrics, having obtained an Equal Error Rate of 9.01% and an Error of Identification of 15.64% for the entire test population.
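For readers unfamiliar with the metric, the sketch below shows one generic way to estimate an Equal Error Rate (EER) from genuine and impostor similarity scores by sweeping a decision threshold until the false-acceptance and false-rejection rates meet; the synthetic score distributions and function name are illustrative and do not reflect the recognition pipeline or database used in the paper.

# Generic EER estimation from match scores (sketch); the data below is synthetic.
import numpy as np


def equal_error_rate(genuine: np.ndarray, impostor: np.ndarray) -> float:
    """Estimate the EER by sweeping a threshold over all observed scores."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])  # false acceptance rate
    frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejection rate
    idx = int(np.argmin(np.abs(far - frr)))                        # point where FAR ~= FRR
    return float((far[idx] + frr[idx]) / 2)


# Toy example with synthetic similarity scores (higher = more alike).
rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.8, 0.10, 1000)
impostor_scores = rng.normal(0.5, 0.15, 1000)
print(f"EER ~ {equal_error_rate(genuine_scores, impostor_scores):.2%}")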
Abstract:
Final Master's project for obtaining the degree of Master in Mechanical Engineering.