37 results for Single-crossing property
at Instituto Politécnico do Porto, Portugal
Abstract:
Radio Link Quality Estimation (LQE) is a fundamental building block for Wireless Sensor Networks, namely for reliable deployment, resource management, and routing. Existing LQEs (e.g., PRR, ETX, Fourbit, and LQI) are based on a single link property, thus leading to inaccurate estimation. In this paper, we propose F-LQE, which estimates link quality on the basis of four link quality properties: packet delivery, asymmetry, stability, and channel quality. Each of these properties is defined in linguistic terms, the natural language of Fuzzy Logic. The overall quality of the link is specified as a fuzzy rule whose evaluation returns the membership of the link in the fuzzy subset of good links. Values of the membership function are smoothed using an EWMA filter to improve stability. An extensive experimental analysis shows that F-LQE outperforms existing estimators.
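As a rough illustration of the idea, the Python sketch below combines four link properties into a single fuzzy membership value and smooths it with an EWMA filter. The membership functions, thresholds, and the min() combination are illustrative assumptions, not the exact F-LQE definitions from the paper.

```python
# Hypothetical sketch of fuzzy link-quality scoring with EWMA smoothing.
# Thresholds and the fuzzy-AND choice below are illustrative assumptions.

def membership_high(value, low, high):
    """Piecewise-linear membership in the fuzzy set 'high': 0 below `low`,
    1 above `high`, linear in between."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def fuzzy_link_quality(prr, asymmetry, instability, snr):
    # Map each raw link property to a membership in "good links"
    # (placeholder thresholds, not the paper's calibrated values).
    m_delivery = membership_high(prr, 0.0, 1.0)
    m_symmetry = 1.0 - membership_high(asymmetry, 0.0, 0.5)
    m_stability = 1.0 - membership_high(instability, 0.0, 0.3)
    m_channel = membership_high(snr, 0.0, 30.0)
    # A common fuzzy-AND is the minimum; used here as a simple stand-in
    # for the paper's fuzzy rule.
    return min(m_delivery, m_symmetry, m_stability, m_channel)

def ewma(previous, sample, alpha=0.6):
    """Smooth successive membership values to improve stability."""
    return alpha * previous + (1.0 - alpha) * sample
```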
Abstract:
The paper introduces an approach to the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System-based algorithm is validated on benchmark problems available in the OR-Library. The obtained results were compared with the best available results and were found to be near-optimal, supporting conclusions about the algorithm's efficiency and effectiveness.
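For concreteness, the objective being minimized is the total weighted tardiness, the sum over jobs of w_j · max(0, C_j − d_j), where C_j is the completion time and d_j the due date. A minimal Python sketch of this computation for a candidate sequence (job fields assumed to be processing time, due date, and weight):

```python
# Total weighted tardiness of a job sequence on a single machine.
# Each job is assumed to be a (processing_time, due_date, weight) tuple.

def total_weighted_tardiness(sequence):
    time, total = 0, 0
    for processing_time, due_date, weight in sequence:
        time += processing_time                     # completion time C_j
        total += weight * max(0, time - due_date)   # w_j * tardiness T_j
    return total

# Example: the second job finishes 2 units late with weight 3.
jobs = [(4, 5, 1), (3, 5, 3)]
assert total_weighted_tardiness(jobs) == 6
```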
Abstract:
In the last ten years, teen noir movies and series — such as Donnie Darko (2001), Brick (2005), or Veronica Mars (2004-2007) — have become increasingly popular among audiences, both in the USA and in Europe, and have aroused the curiosity of critics. These teen noir adventures present darker themes and technical features that distinguish them from the many other productions aimed at young adults. Their narrative and aesthetic characteristics reinvent and subvert the tradition of the classic noir movies of the forties and fifties, thus generating a sense of novelty. In this article, I focus my attention on Veronica Mars, a famous teen noir series created by Rob Thomas, to examine: a) the teen noir themes; b) the new profile and role of the private investigator; c) the empowerment of girls/young women; d) razor-sharp dialogues; e) intertextual references to old-school noir movies. In order to do so, I resort to the research of specialists in the field of neo noir, such as Mark Conrad, Foster Hirsch, or Roz Kaveney. My main goal is to prove that a new (sub)genre is slowly emerging and revivifying teen cinema.
Residential property loans and performance during property price booms: evidence from European banks
Abstract:
Understanding the performance of banks is of the utmost relevance because of the impact of this sector on economic growth and financial stability. Of all the assets that make up a bank's portfolio, residential mortgage loans constitute one of the main components. Using the dynamic panel data method, we analyse the influence of residential mortgage loans on bank profitability and risk, using a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that banks with larger weights of residential mortgage loans show lower credit risk in good times. This result may explain why banks rush to lend on property during booms, given the positive effect such lending has on credit risk. The results further show that credit risk and profitability are lower during the upturn in the residential property price cycle. The results also reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of a bank's risk, between profitability and residential mortgage loan exposure. For banks with high credit risk, a large exposure to residential mortgage loans is associated with higher risk-adjusted profitability, through lower risk. For banks with moderate to low credit risk, the effect of higher residential mortgage loan exposure on risk-adjusted profitability is also positive or marginally positive.
Abstract:
Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15), over the period from 1995 to 2008. We find that banks with larger weights in residential mortgage loans display lower credit risk in good market conditions. This result may explain why banks rush to lend on property during booms, due to the positive effect it has on credit risk. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle. Furthermore, the results reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of a bank's risk, between profitability and residential mortgage exposure. For banks with higher credit risk, a large exposure to residential loans is associated with increased risk-adjusted profitability, through a reduction in risk. For banks with moderate to low credit risk, the impact of higher exposure on risk-adjusted profitability is also positive.
Abstract:
We report within this paper the development of a fiber-optic sensor for Hg(II) ions. Fluorescent carbon nanoparticles were synthesized by laser ablation and functionalized with PEG200 and N-acetyl-l-cysteine, rendering them anionic in nature. This characteristic facilitated their deposition by the layer-by-layer assembly method into thin alternating films together with a cationic polyelectrolyte, poly(ethyleneimine). Such films could be immobilized onto the tip of a glass optical fiber, allowing the construction of an optical fluorescence sensor. When immobilized on the fiber-optic tip, the resultant sensor was capable of selectively detecting sub-micromolar concentrations of Hg(II) with increased sensitivity compared to carbon dot solutions. The fluorescence of the carbon dots was quenched by up to 44% by Hg(II) ions, and interference from other metal ions was minimal.
Abstract:
We have developed a new single-drop microextraction (SDME) method for the preconcentration of organochlorine pesticides (OCPs) from complex matrices. It is based on the use of a silicone ring at the tip of the syringe. A 5 μL drop of n-hexane applied to an aqueous extract containing the OCPs was found to be adequate to preconcentrate the OCPs prior to analysis by GC in combination with tandem mass spectrometry. Fourteen OCPs were determined using this technique in combination with programmable temperature vaporization, which is shown to have many advantages over traditional split/splitless injection. The type of organic solvent, exposure time, agitation, and organic drop volume were optimized. Relative recoveries ranging from 59 to 117%, with repeatabilities (coefficient of variation) below 15%, were achieved. The limits of detection range from 0.002 to 0.150 μg kg−1. The method was applied to the preconcentration of OCPs in fresh strawberry, strawberry jam, and soil.
Abstract:
As lectures, but above all, as Erasmus....
Abstract:
The major food lipid components are usually analyzed by individual methodologies using different extraction procedures for each class. A simple and fast extraction procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid-based extraction methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), fatty acid profiling, and total fat estimation (GC-FID), all accomplished in under 40 minutes. The final methodology presents an adequate linearity range and sensitivity for tocopherol and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in conventionally equipped laboratories in the food quality control field.
Abstract:
In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, for both fixed and dynamic priority assignment schemes. We give emphasis to worst-case response time analysis in non-preemptive contexts, which is fundamental for communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves the worst-case response time of messages, overcoming the limitations of first-come-first-served (FCFS) PROFIBUS queue implementations.
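For context, fixed-priority worst-case response time analysis is typically an iterative fixed-point computation. The sketch below shows a simplified textbook form with a blocking term for lower-priority interference in non-preemptive contexts; it is an illustration only, not the paper's PROFIBUS-specific analysis.

```python
import math

# Textbook fixed-priority response-time recurrence:
#   R_i = C_i + B_i + sum over higher-priority j of ceil(R_i / T_j) * C_j
# `tasks` is a list of (C, T) pairs sorted by decreasing priority; the
# deadline is assumed equal to the period, a common simplification.

def response_time(i, tasks, blocking):
    C_i, T_i = tasks[i]
    R = C_i + blocking
    while True:
        interference = sum(math.ceil(R / T_j) * C_j
                           for C_j, T_j in tasks[:i])
        R_new = C_i + blocking + interference
        if R_new == R:
            return R          # fixed point reached: worst-case response time
        if R_new > T_i:
            return None       # unschedulable under this simplified model
        R = R_new
```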
Abstract:
Graphics processors were originally developed for rendering graphics but have recently evolved into an architecture for general-purpose computations. They are also expected to become important parts of embedded systems hardware, not just for graphics. However, this necessitates the development of appropriate timing analysis techniques, since techniques developed for CPU scheduling are not applicable: we are not interested in how long any given GPU thread takes to complete, but rather in how long it takes for all of them to complete. We therefore develop a simple method for finding an upper bound on the makespan of a group of GPU threads executing the same program and competing for the resources of a single streaming multiprocessor (whose architecture is based on NVIDIA Fermi, with some simplifying assumptions). We then build upon this method to formulate the derivation of the exact worst-case makespan (and the corresponding schedule) as an optimization problem. Addressing the issue of tractability, we also present a technique for efficiently computing a safe estimate of the worst-case makespan with minimal pessimism, which may be used when finding an exact value would take too long.
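To give a flavour of the problem (this is emphatically not the paper's method, which models resource contention far more precisely), a deliberately naive bound treats execution as rounds of at most `width` concurrent threads, each needing at most `wcet` cycles in isolation:

```python
import math

# A naive, illustrative upper bound on the makespan of n identical GPU
# threads: if the streaming multiprocessor runs at most `width` threads at
# a time and each thread needs at most `wcet` cycles in isolation, then
# execution completes within ceil(n / width) rounds. All names and the
# round-based execution model are assumptions made for illustration.

def naive_makespan_bound(n_threads, width, wcet):
    rounds = math.ceil(n_threads / width)
    return rounds * wcet
```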
Abstract:
There is no single definition of a long-memory process. Such a process is generally defined as a series with a slowly decaying correlogram or an infinite spectrum at zero frequency. It is also said that a series with this property is characterised by long-range dependence and non-periodic long cycles, or that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of the power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and valuation models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it aims to provide additional insight into the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it aims to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of alternatively using long-maturity treasury bonds (OTs) in the calculation of market returns, since their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors evaluate the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity, and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in these markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the full data series and to compute a "local" Hurst exponent H_t in rolling windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S), and the fractional differencing test (GPH).
In terms of a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands, and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that can be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies it refutes the random walk hypothesis with i.i.d. increments, which underlies the weak form of the EMH. In view of this, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in the stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (in which case H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (in which case H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is an appropriate reference for expressing risk in models that can be applied to data series following i.i.d. processes as well as processes with nonlinear dependence. These formulations thus include the EMH as a possible special case.
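As a companion to the methodology, the sketch below shows a textbook rescaled-range (R/S) estimate of the Hurst exponent H, obtained as the slope of log(R/S) against log(n) over several window sizes. It is a minimal illustration, not the dissertation's full procedure (which also includes DFA and the M-R/S and GPH tests).

```python
import numpy as np

# Textbook rescaled-range (R/S) estimation of the Hurst exponent H.
# H = 0.5 is consistent with i.i.d. increments (gBm, weak-form EMH);
# H > 0.5 indicates persistence, H < 0.5 anti-persistence.

def hurst_rs(returns, window_sizes=(16, 32, 64, 128, 256)):
    returns = np.asarray(returns, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(returns) - n + 1, n):
            window = returns[start:start + n]
            z = np.cumsum(window - window.mean())  # cumulative deviations
            r = z.max() - z.min()                  # range
            s = window.std(ddof=0)                 # standard deviation
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_values)))
    # The slope of log(R/S) against log(n) estimates H.
    return np.polyfit(log_n, log_rs, 1)[0]
```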
Abstract:
WiDom is a previously proposed prioritized medium access control protocol for wireless channels. We present a modification to this protocol that improves its reliability. This modification has similarities with cooperative relaying schemes, but, in our protocol, all nodes can relay a carrier wave. A preliminary evaluation shows that, under transmission errors, a significant reduction in the number of failed tournaments can be achieved.
Abstract:
Joining of components with structural adhesives is currently one of the most widespread techniques for advanced structures (e.g., aerospace or aeronautical). Adhesive bonding does not involve drilling operations, and it distributes the load over a larger area than mechanical joints. However, peak stresses tend to develop near the overlap edges because of differential straining of the adherends and load asymmetry. As a result, premature failures can be expected, especially for brittle adhesives. Moreover, bonded joints are very sensitive to the surface treatment of the material, service temperature, humidity, and ageing. To overcome these limitations, combining adhesive bonding with spot-welding is an option worth considering, adding advantages such as superior static strength and stiffness, higher peel and fatigue strength, and easier fabrication, since fixtures are not needed during adhesive curing. The experimental and numerical study presented here evaluates hybrid spot-welded/bonded single-lap joints in comparison with the purely spot-welded and purely bonded equivalents. A parametric study on the overlap length (LO) revealed strength advantages of up to 58% over spot-welded joints and 24% over bonded joints. The Finite Element Method (FEM) and Cohesive Zone Models (CZM) for damage growth were also tested in Abaqus® to evaluate this technique for strength prediction, showing accurate estimates for all kinds of joints.
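For readers unfamiliar with CZM, a triangular (bilinear) traction-separation law is the damage model most commonly used for such joints. The Python sketch below illustrates that law in general terms; the parameter names are hypothetical and not the paper's calibrated values.

```python
# Illustrative triangular (bilinear) cohesive traction-separation law of
# the kind commonly used in CZM damage-growth analyses. All parameters
# (k, t_max, delta_f) are hypothetical placeholders.

def triangular_traction(delta, k, t_max, delta_f):
    """Traction for opening `delta`: linear-elastic rise with stiffness `k`
    up to peak traction `t_max`, then linear softening to zero at `delta_f`."""
    delta_0 = t_max / k              # separation at damage onset
    if delta <= delta_0:
        return k * delta             # undamaged, linear-elastic branch
    if delta >= delta_f:
        return 0.0                   # fully debonded
    # Linear softening between damage onset and final failure.
    return t_max * (delta_f - delta) / (delta_f - delta_0)
```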
Abstract:
In this study, an experimental investigation into the shear strength behaviour of aluminium alloy single-lap adhesive joints was carried out in order to understand the effect of temperature on the strength of adhesively bonded joints. Single lap joints (SLJs) were fabricated and tested at room temperature (RT) and at high temperatures (100 °C, 125 °C, 150 °C, 175 °C, and 200 °C). Results showed that the failure loads of the single-lap joint test specimens vary with temperature, and this needs to be considered in any design procedure. It is shown that, although the tensile stress decreased with temperature, the lap-shear strength of the adhesive increased with increasing temperature up to the glass transition temperature of the adhesive (Tg) and decreased for tests above the Tg.