960 results for Homogenization Schemes


Relevance:

10.00%

Publisher:

Abstract:

The Peniche section has revealed moderately to well preserved calcareous nannofossil assemblages across the Pliensbachian/Toarcian boundary. This good record has allowed a refined biostratigraphic scheme to be proposed. The stage boundary, as defined by ammonites, falls within the NJ5b C. impontus (NW Europe; BOWN & COOPER, 1998) or the NJT5b L. sigillatus (Mediterranean Tethys; MATTIOLI & ERBA, 1999) nannofossil subzones. Since a mixing of N- and S-Tethyan taxa is observed in the Lusitanian Basin, both biozonation schemes can be applied. Some nannofossil events (mainly first occurrences) are observed earlier in Portugal than in other Tethyan settings; it is still unclear whether these events are true first occurrences. A diversification phase occurred across the Pliensbachian/Toarcian boundary. This phase is well recorded at Peniche, where assemblages change from murolith-dominated in the Pliensbachian to placolith-rich in the Toarcian. Quantification of nannofossils per gram of rock shows that absolute abundances peak across the Pliensbachian/Toarcian boundary; indeed, Peniche exhibits very high nannofossil abundances relative to correlative levels in other Tethyan settings. The pelagic carbonate fraction (produced by nannofossils) is significant in the marly hemi-couplets at Peniche, and in some levels nannofossils account for more than 50% of the total carbonate fraction.

Relevance:

10.00%

Publisher:

Abstract:

Hard real-time multiprocessor scheduling has, in recent years, seen the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system’s processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based “task-splitting” scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thereby also overcoming many sources of pessimism in the existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that unrealistic assumptions in the original theory tend to undermine the practical potential of such scheduling schemes, we identify and model in the new analysis all overheads incurred by the algorithms under consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
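To make the role of schedulability-driven task assignment concrete, the Python sketch below implements a simple utilization-based first-fit assignment. It is a generic illustration with hypothetical task utilizations, not the actual S-EKG or NPS-F procedure (which additionally splits some tasks across time slots on adjacent processors).

def first_fit_assign(utilizations, num_processors, cap=1.0):
    """Assign task utilizations to processors first-fit; return mapping or None."""
    load = [0.0] * num_processors
    assignment = {}
    for task_id, u in enumerate(utilizations):
        for p in range(num_processors):
            if load[p] + u <= cap:      # simplistic utilization-based schedulability test
                load[p] += u
                assignment[task_id] = p
                break
        else:
            return None                 # task does not fit anywhere: would need splitting
    return assignment

if __name__ == "__main__":
    tasks = [0.6, 0.5, 0.4, 0.3, 0.7]   # hypothetical task utilizations
    print(first_fit_assign(tasks, num_processors=3))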

Relevance:

10.00%

Publisher:

Abstract:

Final project submitted for the degree of Master in Mechanical Engineering.

Relevance:

10.00%

Publisher:

Abstract:

Several popular Ansätze of lepton mass matrices that contain texture zeros are confronted with current neutrino observational data. We perform a systematic χ² analysis in a wide class of schemes, considering arbitrary Hermitian charged-lepton mass matrices and symmetric mass matrices for Majorana neutrinos or Hermitian mass matrices for Dirac neutrinos. Our study reveals that several patterns are still consistent with all the observations at the 68.27% confidence level, while others are disfavored or excluded by the experimental data. Among the patterns considered are the well-known Frampton-Glashow-Marfatia two-zero textures, hybrid textures, and parallel structures.
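As a rough illustration of the kind of χ² comparison involved, the Python sketch below scores a hypothetical ansatz prediction against observed oscillation parameters. The central values, uncertainties and predictions are illustrative placeholders, not the data or fits reported here.

import numpy as np

# Hypothetical observed oscillation observables (central value, 1-sigma error)
# and predictions from a given texture-zero ansatz; values are illustrative only.
observed = {"sin2_th12": (0.304, 0.013),
            "sin2_th23": (0.573, 0.020),
            "sin2_th13": (0.0222, 0.0007)}
predicted = {"sin2_th12": 0.310, "sin2_th23": 0.545, "sin2_th13": 0.0225}

def chi_square(pred, obs):
    """Standard chi^2: sum over observables of ((prediction - mean) / sigma)^2."""
    return sum(((pred[k] - mu) / sigma) ** 2 for k, (mu, sigma) in obs.items())

print(f"chi^2 = {chi_square(predicted, observed):.2f}")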

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Accounting and Finance, under the supervision of Doutor José Campos Amorim.

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Master in Electrical and Computer Engineering.

Relevance:

10.00%

Publisher:

Abstract:

Demand response is regarded as an essential resource for fully achieving the operating benefits of smart grids, namely in the context of competitive markets and of the increasing use of renewable-based energy sources. Some advantages of Demand Response (DR) programs and of smart grids can only be achieved through the implementation of Real Time Pricing (RTP). The integration of the expected increasing amounts of distributed energy resources, as well as of new players, requires new approaches to the changing operation of power systems. The methodology proposed in this paper aims at minimizing the operation costs in a distribution network operated by a virtual power player that manages the available energy resources, focusing on hour-ahead re-scheduling. When wind power generation is lower than expected from the day-ahead forecast, demand response is used to minimize the impact of this change in wind availability. In this way, consumers actively participate in regulation-up and spinning reserve ancillary services through demand response programs. Real time pricing is also applied. The proposed model is especially useful when the actual and day-ahead wind forecasts differ significantly. Its application is illustrated in this paper using a scenario with realistic resource conditions in a 33-bus distribution network with 32 consumers and 66 distributed generators.
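As a minimal sketch of the kind of hour-ahead re-scheduling decision involved, the linear program below covers a wind shortfall at minimum cost using extra distributed generation and demand response. The costs, limits and shortfall are hypothetical, and the formulation is far simpler than the paper's model.

from scipy.optimize import linprog

# Cover a wind shortfall at minimum cost using extra distributed generation (DG)
# and demand response (DR) curtailment. All costs and limits are hypothetical.
wind_shortfall_mw = 5.0          # deficit versus the day-ahead forecast
cost = [40.0, 60.0]              # EUR/MWh for [extra DG, DR curtailment]
A_eq = [[1.0, 1.0]]              # DG + DR must cover the shortfall exactly
b_eq = [wind_shortfall_mw]
bounds = [(0.0, 3.0),            # DG redispatch limit (MW)
          (0.0, 4.0)]            # DR availability limit (MW)

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
dg, dr = res.x
print(f"extra DG = {dg:.1f} MW, DR = {dr:.1f} MW, cost = {res.fun:.0f} EUR")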

Relevance:

10.00%

Publisher:

Abstract:

Rehabilitation is becoming increasingly common in the construction sector in Portugal. The introduction of newer construction materials, together with the technical know-how to integrate different materials towards desired engineering goals, is an important step in the development of the sector. The wood industry is also increasingly adopting composite technologies, with the introduction of so-called “highly engineered wood products” and the use of modification treatments. This work assesses the viability of using stainless steel and glass fibre reinforced polymer (GFRP) as reinforcements in wood beams. The thesis focuses specifically on the flexural behaviour of unmodified and modified Portuguese pine wood beams. Two types of modification were used: 1,3-dimethylol-4,5-dihydroxyethyleneurea (DMDHEU) resin and amide wax. The behaviour of the material was analysed with a nonlinear model that simulates the behaviour of the reinforced wood beams under flexural loading. Small-scale beams (1:15) were tested in flexural bending and the experimental results were compared with the analytical model. The experiments confirm the viability of the reinforcing schemes and the working procedures, and the experimental results showed fair agreement with the nonlinear model. A strength increase of between 15% and 80% was achieved. Stiffness increased by 40% to 50% in beams reinforced with steel, but no significant increase was achieved with the glass fibre reinforcement.
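The stiffness gain from bonding a stiffer strip to a wood beam can be estimated with a simple composite-section calculation, as in the Python sketch below. The cross-section dimensions, strip thickness and moduli are hypothetical and are not taken from the thesis.

def composite_EI(b, h, E_wood, t_r, E_r):
    """Bending stiffness EI of a rectangular wood beam (width b, depth h) with a
    full-width strip of thickness t_r and modulus E_r bonded to its tension face."""
    y_r = t_r / 2.0                      # strip centroid, measured from the strip's bottom face
    y_w = t_r + h / 2.0                  # wood centroid
    EA_r, EA_w = E_r * b * t_r, E_wood * b * h
    y_na = (EA_r * y_r + EA_w * y_w) / (EA_r + EA_w)     # elastic neutral axis
    return (E_wood * (b * h ** 3 / 12 + b * h * (y_w - y_na) ** 2)
            + E_r * (b * t_r ** 3 / 12 + b * t_r * (y_r - y_na) ** 2))

b, h = 0.020, 0.030                      # hypothetical small-scale cross-section (m)
E_wood, E_steel = 12e9, 200e9            # typical moduli for pine and stainless steel (Pa)
EI_plain = E_wood * b * h ** 3 / 12
EI_reinf = composite_EI(b, h, E_wood, 0.001, E_steel)    # 1 mm steel strip
print(f"bending stiffness increase: {100 * (EI_reinf / EI_plain - 1):.0f}%")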

Relevance:

10.00%

Publisher:

Abstract:

An individual experiences double coverage when he benefits from more than one health insurance plan at the same time. This paper examines the impact of such supplementary insurance on the demand for health care services. Its novelty is that, within the context of count data modelling and without imposing restrictive parametric assumptions, the analysis is carried out for different points of the conditional distribution, not only for its mean location. Results indicate that moral hazard is present across the whole outcome distribution for both public and private second layers of health insurance coverage, but with greater magnitude in the latter group. By looking at different points of the distribution, we find that double coverage effects are smaller at high levels of usage. We use data for Portugal, taking advantage of particular features of the public and private protection schemes on top of the statutory National Health Service. By exploring the most recent Portuguese Health Survey, we were able to evaluate their impacts on the consumption of doctor visits.
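One common way to look beyond the conditional mean of a count outcome is quantile regression on jittered counts; the Python sketch below illustrates the idea on simulated data. It is not the estimator or the survey data used in the paper.

import numpy as np
import statsmodels.api as sm

# Simulated data: a binary double-coverage indicator, age, and a count of visits.
rng = np.random.default_rng(0)
n = 2000
double_cov = rng.integers(0, 2, n)                    # 1 = has supplementary coverage
age = rng.uniform(20, 80, n)
visits = rng.poisson(np.exp(0.3 + 0.25 * double_cov + 0.01 * age))

y_jit = visits + rng.uniform(size=n)                  # jitter to smooth the discrete counts
X = sm.add_constant(np.column_stack([double_cov, age]))
for q in (0.25, 0.5, 0.75, 0.9):
    fit = sm.QuantReg(y_jit, X).fit(q=q)              # effect at different quantiles
    print(f"q={q}: double-coverage coefficient = {fit.params[1]:.3f}")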

Relevance:

10.00%

Publisher:

Abstract:

This article seeks to systematize the forms or regimes of public employment in European Union and OECD countries. The methodology comprises the analysis of the traditional career-based system and position-based system of public employment. Brief reflections are also developed on the changes undergone by these two regimes, which have migrated towards a third model, commonly referred to as the hybrid model of public employment, more flexible and better suited to the circumstances of the twenty-first century.

Relevance:

10.00%

Publisher:

Abstract:

Biomass is one of the renewable energy sources with the greatest potential in Portugal, with the currently installed biomass pellet production capacity exceeding 1 million tonnes/year. However, most of this production is destined for export or for use in biomass thermal power plants, whose growth has been significant in recent years, with installed capacity expected to reach approximately 250 MW in 2020. The Portuguese market for pellet boilers is quite diversified. The survey we carried out showed that about 90% of the boilers on the Portuguese market have a rated output below 60 kW, and that most have a fixed grate (81%), an electric ignition system (92%) and top feeding of the solid biofuel (94%). The objective of this work was the development of a model for simulating a biomass pellet boiler which, besides allowing the design and operation of this type of equipment to be optimized, would make it possible to assess technological innovations in this area. For this purpose we used BiomassGasificationFoam, a recently published code written for use with OpenFOAM, a freely available computational tool, which allows the simulation of biomass pyrolysis, gasification and combustion processes. This code, initially developed to describe the gasification process in the thermogravimetric analysis of biomass, was adapted by us to consider the gas-phase combustion reactions of the gases released during biomass pyrolysis (using the reactingFoam solver for this purpose) and to allow ignition of the biomass, which was achieved by adapting the ignition code of XiFoam. The biomass ignition scheme proved inadequate, since combustion stopped whenever the ignition was deactivated, regardless of how long it had been active. As alternatives, two other schemes were used for biomass combustion: a hot air stream and a heating resistance. Both schemes worked, but it was never possible to make the combustion self-sustaining. Analysis of the results showed that the extent of the pyrolysis and gasification reactions, both of which are endothermic, is very small, so the amount of gases released is likewise very small and insufficient to release the energy required for complete combustion of the biomass in a sustainable way. To try to overcome this difficulty, several alternatives were tested, including different biomass compositions, kinetics, heats of reaction, heat transfer parameters, feed air velocities, numerical schemes for solving the system of differential equations, and different parameters of the numerical schemes used. All these attempts proved fruitless. This study led to the conclusion that the BiomassGasificationFoam solver, which was developed to describe the biomass gasification process in an inert medium, with the biomass heated by heat supplied through the reactor walls, is apparently not suitable for describing the biomass combustion process, in which combustion must be self-sustaining and gas-phase combustion reactions are important. A more thorough study is therefore needed to adapt this code to the simulation of the combustion of porous solids in a fixed bed.
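For context, TGA-based pyrolysis models of the kind BiomassGasificationFoam builds on typically describe devolatilization with single-step Arrhenius kinetics; the Python sketch below integrates such a rate law. The kinetic parameters and temperature are purely illustrative and are not the values used in this work.

import numpy as np

# Single-step devolatilization kinetics: dm/dt = -A * exp(-Ea / (R * T)) * m.
R = 8.314          # J/(mol K)
A = 1.0e6          # pre-exponential factor, 1/s (illustrative)
Ea = 1.0e5         # activation energy, J/mol (illustrative)

def pyrolyse(m0, T, dt, steps):
    """Explicit-Euler integration of the remaining solid mass at constant temperature T."""
    m = m0
    history = []
    for _ in range(steps):
        k = A * np.exp(-Ea / (R * T))
        m += -k * m * dt
        history.append(m)
    return np.array(history)

mass = pyrolyse(m0=1.0, T=700.0, dt=0.01, steps=1000)    # 10 s at 700 K
print(f"fraction devolatilized after 10 s: {1.0 - mass[-1]:.2f}")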

Relevance:

10.00%

Publisher:

Abstract:

This article presents a work-in-progress version of a Dublin Core Application Profile (DCAP) developed to serve the Social and Solidarity Economy (SSE). Studies revealed that this community is interested both in internal interoperability between its Web platforms, in order to build a global SSE e-marketplace, and in external interoperability between its Web platforms and external ones. The Dublin Core Application Profile for Social and Solidarity Economy (DCAP-SSE) serves this purpose. SSE organisations are immersed in the market economy, but they have specificities that this economy does not take into account. The DCAP-SSE integrates terms from well-known metadata schemas, Resource Description Framework (RDF) vocabularies or ontologies, in order to enhance interoperability and take advantage of the benefits of the Linked Open Data ecosystem. It also integrates terms from the new essglobal RDF vocabulary, created to respond to SSE-specific needs, and five new Vocabulary Encoding Schemes to be used with DCAP-SSE properties. The DCAP development was based on a method for the development of application profiles (Me4MAP). We believe that this article also has educational value, since it argues that it is important to base DCAP developments on a method, and it shows the main results of applying such a method.
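As a small illustration of how an application profile mixes Dublin Core terms with a domain vocabulary, the rdflib sketch below describes a fictitious SSE initiative. The essglobal namespace URI, class and property names used here are placeholders, not the actual DCAP-SSE or essglobal terms.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

ESS = Namespace("http://example.org/essglobal/")    # hypothetical namespace URI
g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("essglobal", ESS)

initiative = URIRef("http://example.org/sse/initiative/42")
g.add((initiative, RDF.type, ESS.SSEInitiative))                 # placeholder class
g.add((initiative, DCTERMS.title, Literal("Community food co-op")))
g.add((initiative, DCTERMS.subject, Literal("solidarity economy")))
g.add((initiative, ESS.economicActivity, Literal("retail")))     # placeholder property

print(g.serialize(format="turtle"))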

Relevance:

10.00%

Publisher:

Abstract:

Ergonomic interventions such as increased scheduled breaks or job rotation have been proposed to reduce upper limb muscle fatigue in repetitive low-load work. This review was performed to summarize and analyze the studies investigating the effect of job rotation and work-rest schemes, as well as work pace, cycle time and duty cycle, on upper limb muscle fatigue. The effects of these work organization factors on subjective fatigue or discomfort were also analyzed. The review was based on relevant articles published in PubMed, Scopus and Web of Science; the studies included were performed in humans and assessed muscle fatigue in the upper limbs. Fourteen articles were included in the systematic review. Few studies were performed in a real work environment, and the most common method used to assess muscle fatigue was surface electromyography (EMG). No consistent results were found regarding the effects of job rotation on muscle activity and subjective measurements of fatigue. Rest breaks had some positive effects, particularly on perceived discomfort. An increase in work pace results in a higher muscular load on specific muscles. The duration of the experiments and the characteristics of the participants appear to be the factors that most influenced the results. Future research should focus on improving experimental protocols and instrumentation so that the outcomes adequately represent actual working conditions. Relevance to industry: Introducing more physical workload variation into low-load repetitive work is considered an effective ergonomic intervention against muscle fatigue and musculoskeletal disorders in industry. The results will be useful for identifying the need for future research, which will eventually lead to the adoption of best industrial work practices according to workers' capabilities.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, vehicular cloud computing (VCC) has emerged as a new technology used in a wide range of multimedia-based healthcare applications. In VCC, vehicles act as intelligent machines that collect and transfer healthcare data to local or global sites for storage and computation, since vehicles have comparatively limited storage and computation power for handling multimedia files. However, due to dynamic changes in topology and the lack of centralized monitoring points, this information can be altered or misused. Such security breaches can result in disastrous consequences such as loss of life or financial fraud. To address these issues, a learning automata-assisted distributive intrusion detection system based on clustering is designed. Although the proposed scheme can be applied in a number of settings, a multimedia-based healthcare application is taken here for illustration. In the proposed scheme, learning automata (LA) are assumed to be stationed on the vehicles; they take clustering decisions intelligently and select one of the members of the group as a cluster-head. The cluster-heads then assist in the efficient storage and dissemination of information through a cloud-based infrastructure. To secure the proposed scheme against malicious activities, standard cryptographic techniques are used, and the automaton learns from the environment and takes adaptive decisions to identify any malicious activity in the network. A reward or penalty is given by the stochastic environment in which the automaton performs its actions, and the automaton updates its action probability vector after receiving this reinforcement signal. The proposed scheme was evaluated using extensive simulations on ns-2 with SUMO. The results obtained indicate that the proposed scheme yields a 10% improvement in the detection rate of malicious nodes when compared with existing schemes.
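The core of such a learning automaton is the reinforcement-driven update of its action probability vector. The Python sketch below uses the standard linear reward-penalty (L_RP) update with illustrative learning rates and a toy environment; it is not necessarily the exact update rule used in the paper.

import numpy as np

def lrp_update(p, action, reward, a=0.1, b=0.05):
    """Linear reward-penalty update of probabilities p after taking `action`."""
    p = p.copy()
    r = len(p)
    if reward:                         # favourable response: reinforce the chosen action
        p[action] += a * (1.0 - p[action])
        for j in range(r):
            if j != action:
                p[j] *= (1.0 - a)
    else:                              # unfavourable response: penalize the chosen action
        p[action] *= (1.0 - b)
        for j in range(r):
            if j != action:
                p[j] = b / (r - 1) + (1.0 - b) * p[j]
    return p / p.sum()                 # guard against floating-point drift

p = np.full(3, 1.0 / 3.0)              # e.g. 3 candidate cluster-heads
rng = np.random.default_rng(1)
for _ in range(200):
    action = rng.choice(3, p=p)
    reward = rng.random() < (0.9 if action == 0 else 0.4)   # action 0 is "best"
    p = lrp_update(p, action, reward)
print(np.round(p, 3))                  # probability mass concentrates on action 0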

Relevance:

10.00%

Publisher:

Abstract:

In this manuscript we tackle the problem of semi-distributed user selection with distributed linear precoding for sum-rate maximization in multiuser multicell systems. A set of adjacent base stations (BSs) form a cluster in order to perform coordinated transmission to cell-edge users, and coordination is carried out through a central processing unit (CU). However, the message exchange between BSs and the CU is limited to scheduling control signaling, and no user data or channel state information (CSI) exchange is allowed. In the considered multicell coordinated approach, each BS has its own set of cell-edge users and transmits only to one intended user, while interference to non-intended users at other BSs is suppressed by signal steering (precoding). We use two distributed linear precoding schemes, Distributed Zero Forcing (DZF) and Distributed Virtual Signal-to-Interference-plus-Noise Ratio (DVSINR). Considering multiple users per cell and the backhaul limitations, the BSs rely on local CSI to solve the user selection problem. First we investigate how the signal-to-noise ratio (SNR) regime and the number of antennas at the BSs impact the effective channel gain (the magnitude of the channels after precoding) and its relationship with multiuser diversity. Considering that user selection must be based on the type of precoding implemented, we develop metrics of compatibility (estimates of the effective channel gains) that can be computed from local CSI at each BS and reported to the CU for scheduling decisions. Based on such metrics, we design user selection algorithms that can find a set of users that potentially maximizes the sum rate. Numerical results show the effectiveness of the proposed metrics and algorithms for different configurations of users and antennas at the base stations.
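To make the notion of effective channel gain after precoding concrete, the Python sketch below computes zero-forcing precoders for a random channel and reports the per-user gains and residual interference. It is a generic single-transmitter ZF example with random Rayleigh channels, not the DZF/DVSINR designs of the paper.

import numpy as np

rng = np.random.default_rng(0)
num_users, num_antennas = 3, 4
# Rows of H are the users' channel vectors (i.i.d. Rayleigh fading).
H = (rng.standard_normal((num_users, num_antennas))
     + 1j * rng.standard_normal((num_users, num_antennas))) / np.sqrt(2)

W = np.linalg.pinv(H)                          # zero forcing: H @ W is diagonal
W /= np.linalg.norm(W, axis=0, keepdims=True)  # unit-power precoding vectors

effective_gain = np.abs(np.diag(H @ W))        # magnitude of the diagonal of H @ W
interference = np.abs(H @ W - np.diag(np.diag(H @ W))).max()
print("effective channel gains:", np.round(effective_gain, 3))
print("residual inter-user interference:", f"{interference:.2e}")   # ~0 by design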