906 results for Counter-trading


Relevance: 10.00%

Abstract:

The author presents a synopsis of post-Paleozoic igneous activity in continental Portugal. The subvolcanic massifs of Sintra, Sines and Monchique and the basaltic complex of Lisbon-Mafra are interpreted. The large network of dikes and sills occurring north of the Tagus River in the Lisbon-Torres Vedras region, as well as the dikes of Algarve and those of diapiric formations, are studied and compared. The doleritic dikes cutting the Hesperic Massif and the Great Dike of Alentejo are also studied. The author attempts a petrological and geochemical correlation among these post-Paleozoic igneous rocks; to this end, more than 350 chemical analyses are used to elaborate several diagrams, from which some general conclusions are derived. A correlation between the origin of these igneous rocks, the opening of the North Atlantic, and the counter-clockwise rotation of Iberia is also attempted.

Relevance: 10.00%

Abstract:

It is widely assumed that scheduling real-time tasks becomes more difficult as their deadlines get shorter. With shorter deadlines, however, tasks potentially compete less with each other for processors, which can produce more contention-free slots, at which the number of competing tasks is smaller than or equal to the number of available processors. This paper presents a policy (called the CF policy) that utilizes such contention-free slots effectively. The policy can be employed by any work-conserving, preemptive scheduling algorithm, and we show that any algorithm extended with it dominates the original algorithm in terms of schedulability. We also present improved schedulability tests for algorithms that employ this policy, based on the observation that interference from tasks is reduced when their executions are postponed to contention-free slots. Finally, using the properties of the CF policy, we derive the counter-intuitive claim that shortening task deadlines can help improve the schedulability of task systems. We present heuristics that effectively reduce task deadlines for better schedulability without performing any exhaustive search.
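The contention-free slot notion above can be illustrated with a minimal sketch: a slot is contention-free when the tasks competing in it are no more numerous than the available processors. The function name, task names, and the toy timeline below are made up for illustration; they are not the paper's actual task model.

```python
# Hedged sketch of the contention-free (CF) slot notion: a slot is
# contention-free when the tasks ready in it are no more numerous than
# the available processors. Task names and the timeline are illustrative.

def contention_free_slots(ready_tasks_per_slot, num_processors):
    """Indices of slots where competition does not exceed processor capacity."""
    return [
        slot
        for slot, ready in enumerate(ready_tasks_per_slot)
        if len(ready) <= num_processors
    ]

# Toy timeline on a 2-processor platform: tasks ready in each of 5 slots.
timeline = [["t1", "t2", "t3"], ["t1", "t2"], ["t3"], [], ["t1", "t2", "t3", "t4"]]
print(contention_free_slots(timeline, 2))  # [1, 2, 3]
```

Postponing work into slots 1-3 is what the CF policy exploits: in those slots, pending tasks face no competition beyond the platform's capacity.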

Relevance: 10.00%

Abstract:

There is no single definition of a long-term memory process. Such a process is generally defined as a series whose correlogram decays slowly, or whose spectrum is infinite at frequency zero. It is also said that a series with this property is characterized by long-range dependence and long non-periodic cycles, that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decline of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of financial asset price time series. First, the lack of consistency among existing results calls for new studies using several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about the identification of the most adequate general theoretical market model for modelling the diffusion of these series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it intends to provide additional knowledge for the long-term memory debate, examining the behaviour of the daily return series of the main EURONEXT stock indices.
On the other hand, it intends to contribute to the improvement of the capital asset pricing model (CAPM), considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of using long-maturity treasury bonds (OTs) as an alternative in computing market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial conditions of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price-diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in these markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are carried out using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH). In terms of a single conclusion from all methods about the nature of dependence in the stock market in general, the empirical results are inconclusive: the degree of long-term memory, and thus any classification, depends on each particular market.
Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies it refutes the random-walk hypothesis with i.i.d. increments, which underlies the weak form of the EMH. Accordingly, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long-term memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and rendering the CAPM inadequate). The advantage of the FCML and the FSML is that the long-term memory measure, defined by H, is an adequate reference for expressing risk in models applicable to data series following i.i.d. processes as well as processes with non-linear dependence. These formulations thus encompass the EMH as a possible particular case.
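The rescaled-range (R/S) estimation of the Hurst exponent H mentioned above can be sketched as follows. This is a minimal illustration of the classical procedure only; the window sizes, the synthetic i.i.d. series, and the function names are assumptions made for the example, not the dissertation's data or exact method: the average R/S statistic is computed over windows of increasing size n, and H is estimated as the slope of log(R/S) against log(n).

```python
import math
import random
import statistics

def rescaled_range(window):
    """R/S statistic of one window: range of the cumulative mean-adjusted
    deviations, divided by the window's standard deviation."""
    mean = statistics.fmean(window)
    cum, z = 0.0, []
    for x in window:
        cum += x - mean
        z.append(cum)
    s = statistics.pstdev(window)
    return (max(z) - min(z)) / s if s > 0 else 0.0

def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate H as the least-squares slope of log(avg R/S) vs log(n)."""
    xs, ys = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        avg_rs = statistics.fmean(rescaled_range(c) for c in chunks)
        xs.append(math.log(n))
        ys.append(math.log(avg_rs))
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )

random.seed(0)
iid_returns = [random.gauss(0.0, 1.0) for _ in range(1024)]
h = hurst_rs(iid_returns)
print(round(h, 2))  # near 0.5 for an i.i.d. series; H > 0.5 indicates persistence
```

For an i.i.d. series the estimate hovers around 0.5 (with a known small-sample upward bias), matching the gBm case in which the EMH holds; persistent series push H above 0.5.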

Relevance: 10.00%

Abstract:

Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the degree of Master in Strategic Management of Public Relations.

Relevance: 10.00%

Abstract:

Due to the growing complexity and adaptability requirements of real-time embedded systems, which often exhibit unrestricted inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand. This paper proposes an iterative refinement approach for a service's QoS configuration that takes into account services' inter-dependencies and quality constraints, trading off the achieved solution's quality against the cost of computation. Extensive simulations demonstrate that the proposed anytime algorithm is able to quickly find a good initial solution and effectively optimises the rate at which the quality of the current solution improves as the algorithm is given more time to run. The added benefits of the proposed approach clearly surpass its reduced overhead.
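The anytime behaviour described above, a quick initial solution that keeps improving while time remains, can be sketched generically. This is a hypothetical hill-climbing illustration under a toy QoS model (three services with levels 0-5 and an additive stand-in utility), not the paper's actual algorithm:

```python
import time

def anytime_optimise(initial, neighbours, utility, time_budget):
    """Anytime refinement: keep a valid best-so-far solution and improve it
    while time remains; stopping early still yields a usable configuration."""
    best, best_u = initial, utility(initial)
    deadline = time.monotonic() + time_budget
    while time.monotonic() < deadline:
        improved = False
        for cand in neighbours(best):
            u = utility(cand)
            if u > best_u:
                best, best_u = cand, u   # greedy uphill step
                improved = True
                break
        if not improved:                 # local optimum reached
            break
    return best, best_u

# Toy model: three services, each with QoS levels 0..5.
def neighbours(cfg):
    """Raise one service's QoS level by one step."""
    for i in range(len(cfg)):
        if cfg[i] < 5:
            yield cfg[:i] + (cfg[i] + 1,) + cfg[i + 1:]

def utility(cfg):
    """Stand-in additive quality function."""
    return sum(cfg)

solution, quality = anytime_optimise((0, 0, 0), neighbours, utility, 0.05)
print(solution, quality)  # (5, 5, 5) 15, given enough time
```

The key property is that interrupting the loop at any point still returns a feasible configuration, and solution quality is non-decreasing in the time allowed, which is the trade-off the paper exploits.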

Relevance: 10.00%

Abstract:

If war is the continuation of politics by other means, then communication, in a broad sense, has certainly always been present in it and has played a key role. Wars involve not only violence but also persuasion, counter-information, conviction and ideological struggle. In modern wars, the media have been a fundamental element in mobilizing nations shaped by dynamics of uprooting and deterritorialization towards a joint effort of popular support for the state's military action. This article focuses on the transformations that communication and media theory and research themselves underwent in the period between the two World Wars of the twentieth century. This is a decisive historical context for understanding how the institutionalization of the field of communication in a central country such as the USA occurred under social and political conditions that shaped its epistemological profile and theoretical positions and configured power within the university scientific system itself, with repercussions that continue to be felt in diverse and complex ways.

Relevance: 10.00%

Abstract:

This Thesis describes the application of automatic learning methods to a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set). Counter-Propagation Neural Networks (CPNNs) gave similar results.
The use of supervised learning techniques improved the results: to 77% of correct assignments when an ensemble of ten FFNNs was used, and to 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms. Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified.
In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to identify reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number levels, respectively. Experiments on the classification of reactions from the main reactants and products were performed with RFs; EC numbers were assigned at the class, subclass and sub-subclass levels with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways.
Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway: a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, Ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES for use in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs, and a remarkable ability of the NN models to interpolate between distant curves and accurately reproduce potentials for molecular simulations was shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems, the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task.
The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform as well as, or even better than, analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
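The reaction-descriptor idea central to the Thesis, the difference between product and reactant bond-type fingerprints, can be sketched with toy bond counts. Note the real MOLMAP positions bonds on a Kohonen SOM trained on physico-chemical and topological bond features; the bond labels and the "reaction" below are illustrative bookkeeping, not actual chemistry:

```python
from collections import Counter

def bond_fingerprint(bond_types):
    """Toy per-molecule fingerprint: counts of each bond-type label.
    (The real MOLMAP maps bonds onto a Kohonen SOM instead of raw labels.)"""
    return Counter(bond_types)

def reaction_descriptor(reactants, products, alphabet):
    """Difference fingerprint: positive entries are bonds made,
    negative entries are bonds broken during the reaction."""
    r = sum((bond_fingerprint(m) for m in reactants), Counter())
    p = sum((bond_fingerprint(m) for m in products), Counter())
    return [p[b] - r[b] for b in alphabet]

alphabet = ["C-C", "C=O", "C-O", "O-H", "C-H"]
# Illustrative bookkeeping only: one O-H bond is consumed overall.
reactants = [["C=O", "O-H"], ["C-O", "O-H"]]
products = [["C=O", "C-O"], ["O-H"]]
print(reaction_descriptor(reactants, products, alphabet))  # [0, 0, 0, -1, 0]
```

Reactions with similar patterns of bonds broken and made yield nearby descriptor vectors, which is what allows a SOM to cluster them and compare the clusters with EC classes.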

Relevance: 10.00%

Abstract:

The potential of the electrocardiographic (ECG) signal as a biometric trait has been ascertained in the literature over the past decade. The inherent characteristics of the ECG make it an interesting biometric modality, given its universality, intrinsic aliveness detection, continuous availability, and inbuilt hidden nature. These properties enable the development of novel applications, where non-intrusive and continuous authentication are critical factors. Examples include, among others, electronic trading platforms, the gaming industry, and the auto industry, in particular for car sharing programs and fleet management solutions. However, there are still some challenges to overcome in order to make the ECG a widely accepted biometric. In particular, the questions of uniqueness (inter-subject variability) and permanence over time (intra-subject variability) are still largely unanswered. In this paper we focus on the uniqueness question, presenting a preliminary study of our biometric recognition system, testing it on a database encompassing 618 subjects. We also performed tests with subsets of this population. The results reinforce that the ECG is a viable trait for biometrics, having obtained an Equal Error Rate of 9.01% and an Error of Identification of 15.64% for the entire test population.
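The Equal Error Rate reported above is the operating point at which the false-accept rate (impostors accepted) equals the false-reject rate (genuine users rejected). A minimal sketch of its computation from match scores follows; the scores below are made up for illustration and are not the paper's data:

```python
def equal_error_rate(genuine, impostor):
    """Sweep the accept threshold over all observed scores and return the
    rate at the point where false accepts and false rejects are closest."""
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # impostors let in
        frr = sum(s < t for s in genuine) / len(genuine)     # genuines shut out
        if best is None or abs(far - frr) < abs(best[0] - best[1]):
            best = (far, frr)
    return (best[0] + best[1]) / 2

# Made-up similarity scores (higher = more likely a genuine match).
genuine = [0.9, 0.8, 0.85, 0.7, 0.6]
impostor = [0.2, 0.3, 0.4, 0.55, 0.65]
print(equal_error_rate(genuine, impostor))  # 0.2
```

A lower EER means the genuine and impostor score distributions overlap less; the 9.01% figure above is this quantity measured over the 618-subject database.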

Relevance: 10.00%

Abstract:

The dye-sensitized solar cell (DSSC) is a promising solution to global energy and environmental problems because it is clean, low-cost, efficient, durable, and easy to fabricate. However, enhancing the efficiency of the DSSC is still an important issue. Here we devise a bifacial DSSC based on a transparent polyaniline (PANI) counter electrode (CE). Because sunlight irradiates the cell simultaneously from the front and rear sides, more dye molecules are excited and more carriers are generated, which enhances the short-circuit current density and therefore the overall conversion efficiency. The photoelectric properties of PANI can be improved by modifying it with 4-aminothiophenol (4-ATP). The bifacial DSSC with the 4-ATP/PANI CE achieves a light-to-electric energy conversion efficiency of 8.35%, an increase of ~24.6% compared to the same DSSC irradiated from the front only. This new concept, along with the promising results, provides a new approach for enhancing the photovoltaic performance of solar cells.

Relevance: 10.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the degree of Master in Environmental Engineering, specialization in Environmental Management and Systems.

Relevance: 10.00%

Abstract:

This paper characterizes work accidents at Portuguese industrial cleaning companies, operating in the service sector, through the application of the ESAW methodology. Data were codified based on the analysis of 748 accident claims to insurance companies (≥ 1 working day lost) in 3 large industrial cleaning companies for the period 2001-2003. Slipping and falling at the same level was the main deviation from the normal working process at the moment of the accident (25% of the accidents); uncoordinated movements were the second cause of accidents (14%); falls of persons to a lower level were the third cause (~10%), including falls from stairs (~7%) and falls from ladders and mobile ladders (~2%). Globally, body movement under or with physical stress, including lifting, carrying, putting down, bending down, twisting, turning, treading badly, twisting a leg or ankle, and slipping without falling, caused 17% of the accidents. Lower limbs were injured in ~25% of the accidents, hands and fingers in ~14%, the eyes in ~4%, and the back in ~9%. An incidence rate of 3,580 accidents/100,000 employees was found for the sector (2003 data).

Relevance: 10.00%

Abstract:

This paper presents a methodology to establish the investment and trading strategies of a power generation company. These strategies are integrated into the ITEM-Game simulator in order to test their results when played against defined strategies used by other players. The developed strategies focus on investment decisions, although trading strategies are also implemented to obtain base-case results. Two cases are studied, considering three players with the same trading strategy. In case 1, all players also have the same investment strategy, driven by a target market share. In case 2, player 1 has an improved investment strategy with a target share twice that of players 2 and 3. The results put in evidence the influence of CO2 and fuel prices on the company's investment decisions. The influence of the budget constraint, which may prevent a player from taking the desired investment decision, is also observed.

Relevance: 10.00%

Abstract:

In recent years the electricity industry has undergone a restructuring process. Among the aims of this process was an increase in competition, especially in the generation activity, where firms would have an incentive to become more efficient. However, the competitive behavior of generating firms might jeopardize the expected benefits of electricity industry liberalization. The present paper proposes a conjectural variations model to study the competitive behavior of generating firms acting in liberalized electricity markets. The model computes a parameter that represents the degree of competition of each generating firm in each trading period. In this regard, the proposed model provides a powerful methodology for regulatory and competition authorities to monitor the competitive behavior of generating firms. As an application of the model, a study of the day-ahead Iberian electricity market (MIBEL) was conducted to analyze the impact of the integration of the Portuguese and Spanish electricity markets on the behavior of generating firms, taking into account the hourly results of the months of June and July of 2007. The advantages of the proposed methodology over other methodologies used to address market power, namely the Residual Supply Index and the Lerner Index, are highlighted. (C) 2014 Elsevier Ltd. All rights reserved.
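The two market-power measures named above have simple closed forms, which helps explain why the paper benchmarks against them. A sketch with illustrative (non-MIBEL) numbers:

```python
def lerner_index(price, marginal_cost):
    """Lerner index (P - MC) / P: 0 under perfect competition, rising
    towards 1 as a firm prices further above marginal cost."""
    return (price - marginal_cost) / price

def residual_supply_index(rival_capacity, demand):
    """RSI: rivals' capacity over demand; below 1 the firm is pivotal,
    i.e. demand cannot be met without it."""
    return rival_capacity / demand

# Illustrative hourly figures (EUR/MWh and MW), not MIBEL data.
print(lerner_index(60.0, 45.0))              # 0.25
print(residual_supply_index(9_000, 10_000))  # 0.9 -> the firm is pivotal
```

Both indices are static snapshots; the paper's conjectural-variations parameter instead tracks each firm's competitive behavior period by period.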

Relevance: 10.00%

Abstract:

Traditional vertically integrated power utilities around the world have evolved from monopoly structures to open markets that promote competition among suppliers and provide consumers with a choice of services. Market forces drive the price of electricity and reduce the net cost through increased competition. Electricity can be traded in organized markets or through forward bilateral contracts. This article focuses on bilateral contracts and describes some important features of an agent-based system for bilateral trading in competitive markets. Special attention is devoted to the negotiation process, demand response in bilateral contracting, and risk management. The article also presents a case study on forward bilateral contracting: a retailer agent and a customer agent negotiate a 24-hour rate tariff. © 2014 IEEE.

Relevance: 10.00%

Abstract:

The Janssen-Cilag proposal for a risk-sharing agreement regarding bortezomib received a welcome signal from NICE. The Office of Fair Trading report included risk-sharing agreements as an available tool for the National Health Service. Nonetheless, recent discussions have somewhat neglected the economic fundamentals underlying risk-sharing agreements. We argue here that risk-sharing agreements, although attractive due to the principle of paying by results, also entail risks. Too many patients may be put under treatment even with a low success probability, and prices are likely to be adjusted upward in anticipation of future risk-sharing agreements between the pharmaceutical company and the third-party payer. One available instrument is a verification cost per patient treated, which makes it possible to obtain the first-best allocation of patients to the new treatment under the risk-sharing agreement. Overall, the welfare effects of risk-sharing agreements are ambiguous, and care must be taken with their use.