Abstract:
P-NET is a multi-master fieldbus standard based on a virtual token-passing scheme. In P-NET, each master is allowed to transmit only one message per token visit. In the worst case, the communication response time can be derived by considering that, in each token cycle, all stations use the token to transmit a message. In this paper, we define a more sophisticated P-NET model, which considers the actual token utilisation. We then analyse the possibility of implementing a local priority-based scheduling policy to improve the real-time behaviour of P-NET.
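For intuition only, the following sketch (not from the paper) computes the classical pessimistic bound under the assumption that every one of the n masters uses each token visit to transmit one message of bounded length; the parameter names and numbers are illustrative.

```python
# Illustrative sketch, not the paper's model: worst-case response time on a
# P-NET-like virtual token-passing bus, assuming every master spends its token
# visit transmitting one message of at most c_msg seconds, with a token-pass
# overhead of tau seconds per station.

def worst_case_token_cycle(n_masters: int, c_msg: float, tau: float) -> float:
    """Longest time for the token to come back to a given master."""
    return n_masters * (c_msg + tau)

def worst_case_response_time(n_masters: int, c_msg: float, tau: float,
                             queued_before: int) -> float:
    """A message queued behind `queued_before` local messages waits that many
    full token cycles (one local transmission per visit) plus its own sending;
    the initial wait for the token to first arrive is ignored in this toy bound."""
    return queued_before * worst_case_token_cycle(n_masters, c_msg, tau) + c_msg

if __name__ == "__main__":
    # Hypothetical numbers: 10 masters, 2 ms per message, 0.5 ms token overhead.
    print(worst_case_response_time(10, 0.002, 0.0005, queued_before=3))
```

The paper's contribution is precisely to tighten this kind of bound by modelling the actual token utilisation instead of assuming that all stations transmit in every cycle.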
Abstract:
This paper describes how MPEG-4 object-based video (OBV) can be used to allow selected objects to be inserted into the play-out stream to a specific user, based on a profile derived for that user. The application scenario described here is personalised product placement, and the paper considers the value of this application in the current and evolving commercial media distribution market, given the huge emphasis media distributors are currently placing on targeted advertising. This level of application of video content requires a sophisticated content description and metadata system (e.g., MPEG-7). The scenario considers the requirement for global libraries to provide the objects to be inserted into the streams. The paper then considers the commercial trading of objects between the libraries, video service providers, advertising agencies and other parties involved in the service. Consequently, a brokerage of video objects is proposed, based on negotiation and trading using intelligent agents representing the various parties. The proposed Media Brokerage Platform is a multi-agent system structured in two layers. In the top layer, there is a collection of coarse-grained agents representing the real-world players - the providers and deliverers of media content and the market regulator profiler - and, in the bottom layer, there is a set of finer-grained agents constituting the marketplace - the delegate agents and the market agent. For knowledge representation (domain, strategic and negotiation protocols) we propose a Semantic Web approach based on ontologies. The content of the media components should be represented in MPEG-7, and the metadata describing the objects to be traded should follow a specific ontology. The top-layer content providers and deliverers are modelled by intelligent autonomous agents that express their will to transact - buy or sell - media components by registering at a service registry. The market regulator profiler creates, according to the selected profile, a market agent, which, in turn, checks the service registry for potential trading partners for a given component and invites them to the marketplace. The subsequent negotiation and actual transaction are performed by delegate agents in accordance with their profiles and the predefined rules of the market.
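A minimal sketch of the registration and matchmaking step described above is given below; the class and method names are assumptions for illustration, not the platform's actual API.

```python
# Toy sketch (hypothetical names): providers/deliverers register their will to
# buy or sell a media component; a market agent, created by the market
# regulator profiler, queries the service registry and invites the matching
# parties to the marketplace, where delegate agents would then negotiate.

from dataclasses import dataclass

@dataclass
class Registration:
    agent_id: str
    role: str          # "sell" or "buy"
    component: str     # identifier of an MPEG-7-described video object

class ServiceRegistry:
    def __init__(self) -> None:
        self._entries: list[Registration] = []

    def register(self, entry: Registration) -> None:
        self._entries.append(entry)

    def find(self, component: str) -> list[Registration]:
        return [e for e in self._entries if e.component == component]

class MarketAgent:
    def __init__(self, registry: ServiceRegistry) -> None:
        self.registry = registry

    def invite_to_marketplace(self, component: str) -> list[str]:
        """Return the agents that registered interest in this component."""
        return [e.agent_id for e in self.registry.find(component)]

registry = ServiceRegistry()
registry.register(Registration("media-provider-1", "sell", "obj:beverage-can"))
registry.register(Registration("ad-agency-1", "buy", "obj:beverage-can"))
print(MarketAgent(registry).invite_to_marketplace("obj:beverage-can"))
```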
Abstract:
Doctorate in Human Kinetics (Motricidade Humana), specialty in Dance.
Abstract:
I (Pedagogical Practice) - This internship report presents a characterisation of the CRP, giving some context on its history, its operation and its pedagogical objectives. The students who took part in the internship are also characterised, highlighting their academic path and their musical influences and motivations. Regarding the educational practices developed, the report presents the pedagogical principles, following the Ponazapino portal, and the teaching methods applied during the school year, which took into account the integrated Teaching/Learning process. Lastly, the pedagogical objectives proposed for each internship student are presented. The report closes with a critical analysis of the teaching activity, highlighting the teaching/learning process, its application and its benefits for the integral development of the individual.
Abstract:
Internship report submitted to obtain the Master's degree in Civil Engineering.
Abstract:
Secure group communication is a paradigm that primarily designates one-to-many communication security. Previous work on secure group communication has predominantly considered the whole network as a single group managed by a central, powerful node capable of supporting heavy communication, computation and storage costs. However, a typical Wireless Sensor Network (WSN) may contain several groups, each maintained by a sensor node (the group controller) with constrained resources. Moreover, the previously proposed schemes require multicast routing support to deliver the rekeying messages, and multicast routing can incur heavy storage and communication overheads in a wireless sensor network. Due to these two major limitations, we propose a new secure group communication scheme with a lightweight rekeying process. Our proposal overcomes the two limitations mentioned above and can be applied to a homogeneous WSN with resource-constrained nodes, with no need for multicast routing support. The analysis and simulation results clearly demonstrate that our scheme outperforms the previous well-known solutions.
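As an illustration of the rekeying idea only (this is not the paper's scheme, and the toy key wrap below is not cryptographically sound), the sketch assumes the group controller already shares a pairwise key with each member, so a member's departure can be handled with one small unicast per remaining member and no multicast routing support.

```python
# Toy rekeying sketch: on a member leave, the group controller generates a
# fresh group key and sends it to each remaining member, wrapped with the
# pairwise key shared with that member (XOR wrap shown for illustration only;
# it is NOT a secure construction).

import os
import hashlib

def wrap(pairwise_key: bytes, new_group_key: bytes) -> bytes:
    stream = hashlib.sha256(pairwise_key).digest()          # 32-byte keystream
    return bytes(a ^ b for a, b in zip(new_group_key, stream))

def rekey_on_leave(members: dict[str, bytes], leaving: str) -> dict[str, bytes]:
    """Return one rekeying message per remaining member."""
    remaining = {m: k for m, k in members.items() if m != leaving}
    new_group_key = os.urandom(32)
    return {m: wrap(k, new_group_key) for m, k in remaining.items()}

members = {f"node{i}": os.urandom(32) for i in range(4)}    # pairwise keys
print(len(rekey_on_leave(members, "node2")), "rekeying unicasts")  # prints 3
```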
Abstract:
The purpose of this article is to analyse and evaluate the economic, energy and environmental impacts of the increasing penetration of renewable energies and electric vehicles in isolated systems, such as Terceira Island in the Azores and Madeira Island. Given that the islands are extremely dependent on imported fossil fuels - not only for energy production, but also for the transport sector - the aim is to analyse how that dependency can be reduced and to determine the resulting reduction in pollutant gas emissions. Different settings have been analysed - with and without the penetration of EVs. Terceira Island is an interesting case study, where EV charging during off-peak hours could allow an increase in geothermal power, currently limited by the valley of power demand. The share of renewable energy in the electric power mix could reach 74% in 2030 while, at the same time, pollutant gas emissions could be reduced by 45% and fossil fuel purchases by 44%. In Madeira, apart from wind, solar and small hydro power, there are few endogenous resources, and the island's emission factor cannot be reduced as much as in Terceira. Nevertheless, it is possible to reduce fossil fuel imports and emissions by 1.8% in 2030 when compared with a BAU scenario, with 14% of the LD fleet composed of EVs.
Abstract:
We present a 12*(1+|R|/(4m))-speed algorithm for scheduling constrained-deadline sporadic real-time tasks on a multiprocessor comprising m processors where a task may request one of |R| sequentially-reusable shared resources.
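For a quick numerical reading of the stated bound (the processor and resource counts below are arbitrary examples):

```python
# Evaluate the speed factor 12 * (1 + |R| / (4m)) from the abstract.
def speed_bound(m: int, num_resources: int) -> float:
    return 12 * (1 + num_resources / (4 * m))

print(speed_bound(m=8, num_resources=4))    # 13.5
print(speed_bound(m=16, num_resources=1))   # 12.1875; tends to 12 as m grows
```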
Abstract:
Master's degree in Accounting and Management of Financial Institutions.
Abstract:
There is no single definition of a long-memory process. Such a process is generally defined as a series whose correlogram decays slowly or whose spectrum is infinite at zero frequency. It is also said that a series with this property is characterised by long-range dependence and by non-periodic long cycles, that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among existing results calls for new studies and for the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may produce abnormal returns. The aim of this dissertation is twofold. On the one hand, it intends to contribute additional knowledge to the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it intends to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes do not have independent and identically distributed (i.i.d.) increments. The empirical study indicates that long-maturity Treasury bonds (OTs) can be used as an alternative in computing market returns, since their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of the States and measures how investors assess the respective economies based on the performance of their assets as a whole. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of the long-memory property in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are carried out using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
In terms of a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that may be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. In view of this, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process, described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate), and when they follow a process with statistical dependence, described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for expressing risk in models that can be applied to data series following i.i.d. processes as well as processes with non-linear dependence. These formulations thus encompass the EMH as a possible particular case.
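For readers unfamiliar with the technique, the following is a generic sketch of classical rescaled-range (R/S) estimation of the Hurst exponent H; it illustrates the method used in the empirical study but is not the dissertation's code.

```python
# Classical R/S estimation of the Hurst exponent H of a return series.
import numpy as np

def hurst_rs(returns: np.ndarray, min_window: int = 8) -> float:
    n = len(returns)
    windows = np.unique(np.floor(np.logspace(np.log10(min_window),
                                             np.log10(n // 2), 10)).astype(int))
    log_w, log_rs = [], []
    for w in windows:
        rs_values = []
        for start in range(0, n - w + 1, w):           # non-overlapping blocks
            chunk = returns[start:start + w]
            z = np.cumsum(chunk - chunk.mean())         # cumulative deviations
            r = z.max() - z.min()                       # range
            s = chunk.std(ddof=1)                       # standard deviation
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_w, log_rs, 1)             # H is the log-log slope
    return slope

rng = np.random.default_rng(0)
print(hurst_rs(rng.normal(size=2000)))   # ~0.5-0.6 for uncorrelated noise
                                         # (classical R/S has a small-sample bias)
```

In the dissertation's framing, an estimate near H = 0.5 is consistent with gBm and the weak-form EMH, while H above 0.5 signals persistence described by fBm.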
Abstract:
Foresight and scenario-building methods can be an interesting reference for the social sciences, especially as innovative methods for labour process analysis. A scenario - the central concept of prospective analysis - can be considered a rich and detailed portrait of a plausible future world. It can be a useful tool for policy-makers to grasp problems clearly and comprehensively, and to better pinpoint challenges as well as opportunities within an overall framework. Features of foresight methods are already being used in some labour policy-making experiences. Case studies developed in Portugal are presented, and conclusions are drawn in order to organise a set of principles for foresight analysis applied to the European project WORKS, which addresses work organisation restructuring in the knowledge society and work design methods for new management structures of virtual organisations.
Abstract:
The aim of this paper is to evaluate the influence of the crushing process used to obtain recycled concrete aggregates on the performance of concrete made with those aggregates. Two crushing methods were considered: primary crushing, using a jaw crusher, and primary plus secondary crushing (PSC), using a jaw crusher followed by a hammer mill. Besides natural aggregates (NA), these two processes were also used to crush three types of concrete made in the laboratory (L20, L45 and L65) and three others from the precast industry (P20, P45 and P65). The coarse natural aggregates were totally replaced by coarse recycled concrete aggregates. The recycled aggregate concrete mixes were compared with reference concrete mixes made using only NA, and the following properties related to mechanical and durability performance were tested: compressive strength; splitting tensile strength; modulus of elasticity; carbonation resistance; chloride penetration resistance; water absorption by capillarity; water absorption by immersion; and shrinkage. The results show that the PSC process leads to better performance, especially in the durability properties. © 2014 RILEM
Abstract:
The IEEE 802.15.4 Medium Access Control (MAC) protocol is an enabling technology for time-sensitive wireless sensor networks thanks to its Guaranteed Time Slot (GTS) mechanism in the beacon-enabled mode. However, the protocol only supports explicit GTS allocation, i.e. a node allocates a number of time slots in each superframe for its exclusive use. The limitation of explicit allocation is that GTS resources may quickly disappear, since a maximum of seven GTSs can be allocated in each superframe, preventing other nodes from benefiting from guaranteed service. Moreover, the GTSs may be only partially used, resulting in wasted bandwidth. To overcome these limitations, this paper proposes i-GAME, an implicit GTS Allocation Mechanism for beacon-enabled IEEE 802.15.4 networks. The allocation is based on implicit GTS allocation requests, taking into account the traffic specifications and the delay requirements of the flows. The i-GAME approach enables the use of a GTS by multiple nodes, while all their (delay, bandwidth) requirements are still satisfied. For that purpose, we propose an admission control algorithm that decides whether or not to accept a new GTS allocation request, based not only on the remaining time slots, but also on the traffic specifications of the flows, their delay requirements and the available bandwidth resources. We show that our proposal improves bandwidth utilization compared to the explicit allocation used in the IEEE 802.15.4 standard. We also present some practical considerations for the implementation of i-GAME, ensuring backward compatibility with the IEEE 802.15.4 standard with only minor add-ons.
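A hedged sketch of an implicit, flow-based admission test in the spirit of i-GAME follows; the (burst, rate, delay bound) parametrisation and the simplified delay check are assumptions for illustration and do not reproduce the paper's equations.

```python
# Simplified admission control for a shared GTS: admit a new flow only if the
# aggregate rate fits the guaranteed bandwidth and every flow's burst can be
# drained within its delay bound (one superframe of access latency assumed).

from dataclasses import dataclass

@dataclass
class Flow:
    burst_bits: float      # b: maximum burst size
    rate_bps: float        # r: average arrival rate
    delay_bound_s: float   # D: delay requirement

def admit(new: Flow, admitted: list[Flow],
          gts_capacity_bps: float, superframe_s: float) -> bool:
    flows = admitted + [new]
    total_rate = sum(f.rate_bps for f in flows)
    if total_rate > gts_capacity_bps:
        return False                                   # bandwidth exhausted
    for f in flows:
        residual = gts_capacity_bps - (total_rate - f.rate_bps)
        if superframe_s + f.burst_bits / residual > f.delay_bound_s:
            return False                               # delay bound violated
    return True

# Hypothetical numbers: 20 kbit/s of shared GTS capacity, ~123 ms superframe.
print(admit(Flow(2000, 5000, 0.5), [], gts_capacity_bps=20_000,
            superframe_s=0.123))
```

The point of the check mirrors the abstract: admission depends on the flows' traffic specifications, delay requirements and available bandwidth, not merely on whether one of the seven explicit GTS slots is free.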
Abstract:
With the current complexity of communication protocols, implementing their layers entirely in the kernel of the operating system is too cumbersome, and it prevents the use of capabilities that are only available to user-space processes. However, building protocols as user-space processes must not impair the responsiveness of the communication. Therefore, in this paper we present a layer of a communication protocol which, due to its complexity, was implemented as a user-space process. The lower layers of the protocol are, for responsiveness reasons, implemented in the kernel. This protocol was developed to support large-scale power-line communication (PLC) with timing requirements.
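A minimal sketch of this split is shown below; the device path and framing are hypothetical assumptions, not the actual implementation, and serve only to show a user-space upper layer sitting on top of kernel-resident lower layers.

```python
# Hypothetical user-space upper layer: the kernel-resident lower layers are
# assumed to expose received PDUs through a character device; this process
# reads them, runs the complex protocol logic, and writes replies back.

import os

DEVICE = "/dev/plc_lower"   # hypothetical device created by the kernel layers

def handle_pdu(pdu: bytes) -> bytes:
    # Placeholder for the processing that justifies living in user space
    # (rich libraries, configuration files, dynamic memory, and so on).
    return b"ACK" + pdu[:4]

def serve_upper_layer() -> None:
    fd = os.open(DEVICE, os.O_RDWR)
    try:
        while True:
            pdu = os.read(fd, 2048)      # one lower-layer PDU per read (assumed)
            if not pdu:
                break
            reply = handle_pdu(pdu)
            if reply:
                os.write(fd, reply)
    finally:
        os.close(fd)

if __name__ == "__main__":
    serve_upper_layer()
```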