977 results for inter-stage line ratio
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Management and Systems
Abstract:
TOD (Transit-Oriented Development) is typically defined as a high-density, mixed-use area (residential and commercial) within easy walking distance of a high-capacity public transport station (typically within an 800 m buffer area). TOD is viewed as a set of strategies to increase the use of public transport, promote walking, contain urban sprawl, and create more liveable places. It is believed that combining these strategies will promote sustainable growth. This work is an exploratory study searching for evidence of TOD characteristics in the station areas of the Azambuja train line, setting out further methodologies to evaluate the success of TOD areas.
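The 800 m walking-distance buffer used above to delimit a station area can be made concrete with a minimal sketch. The coordinates below are hypothetical, not data from the Azambuja line; the check flags whether a location lies inside the buffer using a great-circle distance:

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in metres."""
        r = 6371000.0  # mean Earth radius
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    station = (39.070, -8.868)  # hypothetical station coordinates
    parcel = (39.075, -8.861)   # hypothetical location to classify
    print(haversine_m(*station, *parcel) <= 800.0)  # inside the TOD buffer?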
Abstract:
This training support guide aims to help teachers (1) use forums to disseminate information and/or collaborate on ideas, and (2) configure and use forums in Moodle.
Abstract:
This training support guide aims to help teachers (1) configure and use the Moodle questionnaire activity and (2) export the results to an Excel document.
Abstract:
This training support guide aims to help teachers (1) configure and use the Moodle quiz activity and (2) create question banks.
Abstract:
Real-time scheduling usually assumes worst-case values for the parameters of task (or message stream) sets, in order to provide safe schedulability tests for hard real-time systems. However, worst-case conditions introduce a level of pessimism that is often inadequate for a certain class of (soft) real-time systems. In this paper we provide an approach for computing the stochastic response time of tasks whose inter-arrival times are described by discrete probabilistic distribution functions, instead of by minimum inter-arrival time (MIT) values.
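To make the distributional view concrete, here is a minimal sketch (not the paper's analysis): once inter-arrival times are a discrete PMF rather than a single MIT value, the PMF of the k-th arrival instant is simply the k-fold convolution of the inter-arrival PMF. The distribution used below is invented:

    # A PMF is a dict mapping value -> probability.

    def convolve(pmf_a, pmf_b):
        """Distribution of the sum of two independent discrete variables."""
        out = {}
        for va, pa in pmf_a.items():
            for vb, pb in pmf_b.items():
                out[va + vb] = out.get(va + vb, 0.0) + pa * pb
        return out

    def kth_arrival_time(interarrival_pmf, k):
        """PMF of the k-th arrival instant: k-fold convolution."""
        dist = {0: 1.0}
        for _ in range(k):
            dist = convolve(dist, interarrival_pmf)
        return dist

    # Inter-arrival time: 8 with probability 0.7, 12 with probability 0.3
    # (a worst-case MIT analysis would pessimistically assume 8 every time).
    pmf = {8: 0.7, 12: 0.3}
    print(sorted(kth_arrival_time(pmf, 3).items()))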
Abstract:
With advances in computer science and information technology, computing systems are becoming increasingly complex, with a growing number of heterogeneous components. They are thus becoming more difficult to monitor, manage, and maintain, a process well known to be labor-intensive and error-prone. In addition, traditional approaches to system management struggle to keep up with rapidly changing environments. There is a need for automatic and efficient approaches to monitor and manage complex computing systems. In this paper, we propose an innovative framework for scheduling system management that combines the Autonomic Computing (AC) paradigm, Multi-Agent Systems (MAS), and Nature-Inspired Optimization Techniques (NIT). Additionally, we consider the resolution of realistic problems: the scheduling of a Cutting and Treatment Stainless Steel Sheet Line is evaluated. Results show that the proposed approach has advantages when compared with other scheduling systems.
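As an illustration of the NIT component only (not the paper's AC/MAS framework), the sketch below applies one nature-inspired technique, simulated annealing, to an invented single-line sequencing instance, minimising total tardiness:

    import math
    import random

    # Invented jobs: name -> (processing time, due date).
    jobs = {"J1": (4, 9), "J2": (2, 5), "J3": (6, 14), "J4": (3, 7)}

    def total_tardiness(seq):
        t, tard = 0, 0
        for j in seq:
            p, d = jobs[j]
            t += p
            tard += max(0, t - d)
        return tard

    def anneal(seq, temp=10.0, cooling=0.95, steps=500):
        cur, best = list(seq), list(seq)
        for _ in range(steps):
            i, k = random.sample(range(len(cur)), 2)
            cand = list(cur)
            cand[i], cand[k] = cand[k], cand[i]  # swap two jobs
            delta = total_tardiness(cand) - total_tardiness(cur)
            # Accept improvements always, worsenings with decaying probability.
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                cur = cand
            if total_tardiness(cur) < total_tardiness(best):
                best = list(cur)
            temp *= cooling
        return best

    print(anneal(list(jobs)))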
Abstract:
The goal of this paper is to discuss the benefits and challenges of building an inter-continental network of remote laboratories supported and used by both European and Latin American Institutions of Higher Education. Since remote experimentation, understood as the ability to carry out real-world experiments through a simple Web browser, is already a proven solution for the educational community as a supplement to on-site practical lab work (and in some cases, namely for distance learning courses, a replacement for that work), the purpose is not to discuss its technical, pedagogical, or economic strengths, but rather to raise, and try to answer, some questions about the underlying benefits and challenges of establishing a peer-to-peer network of remote labs. Ultimately, we regard such a network as a constructive mechanism to help students gain the working and social skills often valued by multinational/global companies, while also providing awareness of local cultural aspects.
Abstract:
Dependability is a critical factor in computer systems, requiring high-quality validation and verification procedures during development. At the same time, digital devices are getting smaller, and access to their internal signals and registers is increasingly complex, requiring innovative debugging methodologies. To address this issue, most recent microprocessors include an on-chip debug (OCD) infrastructure to facilitate common debugging operations. This paper proposes an enhanced OCD infrastructure aimed at supporting the verification of fault-tolerant mechanisms through fault injection campaigns. This upgraded on-chip debug and fault injection (OCD-FI) infrastructure provides an efficient fault injection mechanism with improved capabilities and dynamic behavior. Preliminary results show that this solution provides flexibility in fault triggering and allows high-speed real-time fault injection in memory elements.
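The skeleton of such a fault injection campaign can be sketched abstractly. This is illustrative only: the memory model and workload below are hypothetical stand-ins, not the OCD-FI interface:

    def inject_bit_flip(memory, address, bit):
        """Flip one bit of a memory word, as a debug-port write would."""
        memory[address] ^= (1 << bit)

    def campaign(run_workload, memory, faults):
        """Inject each fault, run the workload, record masked/unmasked outcomes."""
        golden = run_workload(dict(memory))  # fault-free reference run
        results = []
        for address, bit in faults:
            faulty_mem = dict(memory)
            inject_bit_flip(faulty_mem, address, bit)
            outcome = run_workload(faulty_mem)
            results.append(((address, bit), outcome == golden))
        return results

    # Toy workload: checksum over three memory words.
    mem = {0: 0x0F, 1: 0xA0, 2: 0x33}
    print(campaign(lambda m: sum(m.values()), mem, [(0, 3), (2, 0)]))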
Abstract:
Aim: To optimise a set of exposure factors, at the lowest effective dose (E), to delineate spinal curvature with the modified Cobb method on a full-spine examination using computed radiography (CR) for a 5-year-old paediatric anthropomorphic phantom. Methods: Images were acquired by varying a set of parameters: position (antero-posterior (AP), postero-anterior (PA) and lateral), kilovoltage peak (kVp) (66-90), source-to-image distance (SID) (150-200 cm), broad focus, and the use of a grid (grid in/out), to analyse the impact on E and image quality (IQ). IQ was analysed using two approaches: objective (contrast-to-noise ratio, CNR) and perceptual, using 5 observers. Monte Carlo modelling was used for dose estimation. Cohen's Kappa coefficient was used to calculate inter-observer variability. The angle was measured using Cobb's method on lateral projections under different imaging conditions. Results: PA gave the lowest effective dose (0.013 mSv) compared to AP (0.048 mSv) and lateral (0.025 mSv). The exposure parameters that allowed the lowest dose were 200 cm SID, 90 kVp, broad focus and grid out for paediatrics using an Agfa CR system. Thirty-seven images were assessed for IQ and thirty-two were classified as adequate. Cobb angle measurements varied between 16°±2.9° and 19.9°±0.9°. Conclusion: Cobb angle measurements can be performed at the lowest dose with a low contrast-to-noise ratio. The variation in measurements was ±2.9°, which is within the range of acceptable clinical error and has no impact on clinical diagnosis. Further work is recommended to increase the sample size and to develop a more robust perceptual IQ assessment protocol for observers.
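CNR admits several definitions; a minimal sketch of one common form (mean signal difference over background noise), with invented region-of-interest values:

    import numpy as np

    def cnr(roi_signal, roi_background):
        """CNR = |mean(signal) - mean(background)| / std(background)."""
        return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std()

    signal = np.array([210.0, 205.0, 198.0, 215.0])      # invented pixel values
    background = np.array([120.0, 126.0, 118.0, 124.0])
    print(round(cnr(signal, background), 2))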
Abstract:
Eucalyptus globulus sapwood and heartwood showed no differences in lignin content (23.0% vs. 23.7%) or composition: syringyl lignin (17.9% vs. 18.0%) and guaiacyl lignin (4.8% vs. 5.2%). The delignification kinetics of S- and G-units in heartwood and sapwood were investigated by Py-GC–MS/FID at 130, 150 and 170 °C and modeled as double first-order reactions. Reactivity differences between S- and G-units were small during the main pulping phase, and the higher reactivity of S- over G-units was better expressed in the later pulping stage. The residual lignin composition in pulps differed from that of wood or of samples from the initial delignification stages, with more G- and H-units. The S/G ratio ranged from 3 to 4.5 while pulp residual lignin was above 10%, then decreased rapidly to less than 1. The S/H ratio was initially around 20 (until 15% residual lignin), decreasing to 4 when residual lignin was about 3%.
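The double first-order model referred to above has the form L(t) = L1*exp(-k1*t) + L2*exp(-k2*t), one pool of fast-reacting and one of slow-reacting lignin. A minimal fitting sketch with invented data points (not the paper's measurements):

    import numpy as np
    from scipy.optimize import curve_fit

    def double_first_order(t, L1, k1, L2, k2):
        return L1 * np.exp(-k1 * t) + L2 * np.exp(-k2 * t)

    t = np.array([0.0, 20.0, 40.0, 60.0, 90.0, 120.0])   # time, min (invented)
    lignin = np.array([23.5, 14.0, 8.6, 5.4, 2.9, 1.7])  # residual lignin, %

    params, _ = curve_fit(double_first_order, t, lignin,
                          p0=(15.0, 0.05, 8.0, 0.005))
    print(params)  # fitted L1, k1, L2, k2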
Abstract:
This paper models an n-stage stacked Blumlein generator for bipolar pulses under various load conditions. The calculation of the voltage amplitudes in the time domain, at the load and between stages, is described for an n-stage generator. To this end, the reflection and transmission coefficients are mathematically modeled at the points where impedance discontinuities occur (i.e., at the junctions between two transmission lines). The mathematical model developed is assessed by comparing simulation results to experimental data from a two-stage solid-state Blumlein prototype.
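The junction coefficients involved are the standard transmission-line results, Gamma = (Z2 - Z1)/(Z2 + Z1) for reflection and T = 1 + Gamma for transmission; a minimal sketch with illustrative impedances:

    def reflection(z_from, z_to):
        """Voltage reflection coefficient at a z_from -> z_to junction."""
        return (z_to - z_from) / (z_to + z_from)

    def transmission(z_from, z_to):
        """Voltage transmission coefficient: T = 1 + Gamma."""
        return 1.0 + reflection(z_from, z_to)

    z1, z2 = 50.0, 100.0  # ohms, illustrative only
    print(reflection(z1, z2), transmission(z1, z2))  # 0.333..., 1.333...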
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
To boost logic density and reduce per-unit power consumption, SRAM-based FPGA manufacturers have adopted nanometric technologies. However, these technologies are highly vulnerable to radiation-induced faults, which affect values stored in memory cells, and to manufacturing imperfections. Fault-tolerant implementations, based on Triple Modular Redundancy (TMR) infrastructures, help to maintain the correct operation of the circuit. However, TMR alone is not sufficient to guarantee safe operation. Other issues, such as module placement, the effects of multi-bit upsets (MBU), and fault accumulation, must also be addressed. When a fault occurs, the correct operation of the affected module must be restored and/or the current state of the circuit coherently re-established. This paper presents a solution that enables the autonomous restoration of the functional definition of the affected module, avoiding fault accumulation and re-establishing the correct circuit state in real time, while keeping the circuit in normal operation.
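The TMR principle mentioned above reduces to a 2-of-3 majority vote on module outputs; a minimal bitwise sketch (illustrative, not the paper's restoration mechanism):

    def tmr_vote(a, b, c):
        """Bitwise 2-of-3 majority over three module outputs."""
        return (a & b) | (a & c) | (b & c)

    good = 0b1011
    faulty = good ^ 0b0100                    # one module hit by a single-bit upset
    print(bin(tmr_vote(good, good, faulty)))  # still 0b1011: the fault is masked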
Abstract:
Dynamically reconfigurable systems have benefited from a new class of FPGAs recently introduced into the market, which allow partial and dynamic reconfiguration at run-time, enabling multiple independent functions from different applications to share the same device, swapping resources as needed. When the sequence of tasks to be performed is not predictable, resource allocation decisions have to be made on-line, fragmenting the FPGA logic space. A rearrangement may be necessary to get enough contiguous space to efficiently implement incoming functions, to avoid spreading their components and, as a result, degrading their performance. This paper presents a novel active replication mechanism for configurable logic blocks (CLBs), able to implement on-line rearrangements, defragmenting the available FPGA resources without disturbing those functions that are currently running.
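One way to picture the rearrangement problem is a toy one-dimensional model of the CLB space; the compaction sketch below is illustrative only and not the paper's active replication mechanism:

    def compact(allocations, total_columns):
        """Slide each function's block toward column 0, preserving order,
        so the remaining free columns form one contiguous region."""
        placed, cursor = {}, 0
        for name, (start, width) in sorted(allocations.items(),
                                           key=lambda kv: kv[1][0]):
            placed[name] = (cursor, width)
            cursor += width
        return placed, total_columns - cursor

    # Invented layout: function -> (start column, width), on a 16-column strip.
    allocs = {"F1": (0, 3), "F2": (5, 2), "F3": (9, 4)}
    print(compact(allocs, 16))  # frees one contiguous block of 7 columns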