960 results for "set based design"


Relevance: 40.00%

Abstract:

This paper presents the design and implementation of a measurement-based QoS and resource management framework, CNQF (Converged Networks' QoS Management Framework). CNQF is designed to provide unified, scalable QoS control and resource management through a policy-based network management paradigm. It achieves this via distributed functional entities that are deployed to co-ordinate the resources of the transport network through centralized policy-driven decisions supported by a measurement-based control architecture. We present the CNQF architecture, the implementation of the prototype, and the validation of its built-in QoS control mechanisms using real traffic flows on a Linux-based experimental testbed.
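To make the control model concrete, here is a minimal Python sketch of the kind of measurement-based, policy-driven decision loop the abstract describes; the Policy fields, thresholds and action strings are illustrative assumptions, not CNQF's actual interfaces.

```python
# Illustrative sketch of a measurement-based, policy-driven QoS control loop.
# Field names, thresholds and actions are hypothetical, not CNQF's real API.

from dataclasses import dataclass

@dataclass
class Policy:
    flow_class: str          # e.g. "voice", "video", "best-effort"
    max_loss_pct: float      # loss threshold that triggers a re-allocation
    min_bandwidth_kbps: int  # bandwidth floor enforced by the resource manager

def control_step(measurements: dict, policies: list[Policy]) -> list[str]:
    """One pass of the centralized decision point: compare per-class
    measurements against policy and emit enforcement actions."""
    actions = []
    for p in policies:
        m = measurements.get(p.flow_class)
        if m is None:
            continue
        if m["loss_pct"] > p.max_loss_pct:
            actions.append(f"reprioritize {p.flow_class}")
        if m["bandwidth_kbps"] < p.min_bandwidth_kbps:
            actions.append(f"reserve {p.min_bandwidth_kbps} kbps for {p.flow_class}")
    return actions

# Example: a voice class breaching its loss target triggers two actions.
policies = [Policy("voice", max_loss_pct=1.0, min_bandwidth_kbps=64)]
print(control_step({"voice": {"loss_pct": 2.5, "bandwidth_kbps": 48}}, policies))
```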

Relevance: 40.00%

Abstract:

The School of Mechanical and Aerospace Engineering at Queen's University Belfast started BEng and MEng degree programmes in Product Design and Development (PDD) in 2004. Intended from the outset to be significantly different from the existing programmes within the School, the PDD degrees used the syllabus and standards defined by the CDIO Initiative as the basis for an integrated curriculum. Students are taught in the context of conceiving, designing, implementing and operating a product. Fundamental to this approach is a core sequence of Design-Build-Test (DBT) experiences, which facilitates the development of a range of professional skills as well as the immediate application of technical knowledge gained in strategically aligned supporting modules.
The key objective of the degree programmes is to better prepare students for professional practice. PDD graduates were surveyed using a questionnaire developed by the CDIO founders and interviewed to examine the efficacy of these degree programmes, particularly with respect to this key objective. Graduate employment rates, self-assessment of graduate attributes and examples of work produced by MEng graduates provided positive evidence that their capabilities met the requirements of the profession. However, the 24% questionnaire response rate from the 96 graduates to date did not allow statistically significant conclusions to be drawn, particularly for BEng graduates, who were under-represented in the response group. While not providing proof of efficacy, the investigation did provide a substantial amount of useful data for consideration as part of a continuous improvement process.

Relevance: 40.00%

Abstract:

Education has a powerful and long-term effect on people's lives and should therefore be based on evidence of what works best. This assertion warrants a definition of what constitutes good research evidence. Two research designs that are often thought to come from diametrically opposed fields, single-subject research designs and randomised controlled trials, are described, and common features, such as the use of probabilistic assumptions and the aim of discovering causal relations, are delineated. Differences between the two designs are also highlighted and used as the basis for setting out how they might better complement one another. Recommendations for future action are made accordingly.

Relevance: 40.00%

Abstract:

Lovastatin biosynthesis depends on the relative concentrations of dissolved oxygen and the carbon and nitrogen sources. Elucidating the underlying relationship would facilitate the derivation of a controller for improving lovastatin yield in bioprocesses. To achieve this goal, batch submerged cultivation experiments of lovastatin production by Aspergillus flavipes BICC 5174, using both lactose and glucose as carbon sources, were performed in a 7-liter bioreactor, and the data were used to determine how the relative concentrations of lactose, glucose, glutamine and oxygen affected lovastatin yield. A model was developed based on these results, and its predictions were validated against an independent set of batch data obtained from a 15-liter bioreactor using five statistical measures, including the Willmott index of agreement. A nonlinear controller was designed on the premise that dissolved oxygen and lactose concentrations can be measured online, using the lactose feed rate and airflow rate as process inputs. Simulation experiments demonstrated that a practical implementation of the nonlinear controller would give satisfactory results. This is the first model that correlates lovastatin biosynthesis with the carbon-to-nitrogen ratio and has a structure suitable for implementing a lovastatin production control strategy.
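Of the five validation statistics, the Willmott index of agreement has a compact closed form, d = 1 - sum((P_i - O_i)^2) / sum((|P_i - O_mean| + |O_i - O_mean|)^2), where d = 1 indicates perfect agreement between observations and predictions. A short Python sketch, with titre values invented purely for illustration:

```python
# Willmott index of agreement, one of the five validation statistics the
# abstract mentions; d = 1 means perfect model-data agreement.
import numpy as np

def willmott_index(observed, predicted):
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    o_mean = o.mean()
    num = np.sum((p - o) ** 2)  # squared prediction error
    den = np.sum((np.abs(p - o_mean) + np.abs(o - o_mean)) ** 2)
    return 1.0 - num / den

# Hypothetical lovastatin titres (mg/L): measured vs. model-predicted.
print(willmott_index([10, 25, 60, 110], [12, 22, 65, 104]))
```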

Relevance: 40.00%

Abstract:

Following the UK Medical Research Council's (MRC) guidelines for the development and evaluation of complex interventions, this study aimed to design, develop and optimise an educational intervention about young men and unintended teenage pregnancy, based around an interactive film. The process involved identification of the relevant evidence base, development of a theoretical understanding of the phenomenon of unintended teenage pregnancy in relation to young men, and exploratory mixed-methods research. The result was an evidence-based, theory-informed, user-endorsed intervention designed to meet the much-neglected pregnancy education needs of teenage men and intended to increase both boys' and girls' intentions to avoid an unplanned pregnancy during adolescence. In prioritising the development phase, this paper addresses a gap in the literature on the processes of research-informed intervention design. It illustrates the application of the MRC guidelines in practice while offering a critique of, and additional guidance to programme developers on, the MRC-prescribed processes of developing interventions. Key lessons learned were: 1) know and engage the target population, and engage gatekeepers in addressing contextual complexities; 2) know the targeted behaviours and model a process of change; and 3) look beyond development to evaluation and implementation.

Relevance: 40.00%

Abstract:

The paper presents IPPro, a high-performance, scalable soft-core processor targeted at image processing applications. IPPro is a scalar 16-bit RISC processor built around the Xilinx DSP48E1 primitive on the Zynq field-programmable gate array; it operates at 526 MHz, giving 526 MIPS of performance per core. Each IPPro core uses one DSP48, one block RAM and 330 Kintex-7 slice registers, making the processor as compact as possible whilst maintaining flexibility and programmability. A key aspect of the approach is reducing application design time and implementation effort by using multiple IPPro processors in SIMD mode, which allows different levels of parallelism and mapping to be exploited for different applications within the supported instruction set. In this context, a Traffic Sign Recognition (TSR) algorithm has been prototyped on a Zedboard, with the colour and morphology operations accelerated using multiple IPPros. Simulation and experimental results demonstrate that the platform achieves speedups of 15 and 33 times for colour filtering and morphology operations respectively, with reduced design effort and time.
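As a point of reference for the operations being accelerated, below is a plain NumPy sketch of a 3x3 binary erosion, the kind of morphology step offloaded to the IPPro cores in the TSR pipeline; it is a functional reference only, not the processor's SIMD implementation.

```python
# Functional reference for a 3x3 binary erosion; the IPPro SIMD cores would
# execute an equivalent operation in parallel across image tiles.
import numpy as np

def erode3x3(img: np.ndarray) -> np.ndarray:
    """Binary erosion: an output pixel is 1 only if the entire 3x3
    neighbourhood around it is 1 (borders are left as background)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = img[y - 1:y + 2, x - 1:x + 2].min()
    return out

img = np.zeros((6, 6), dtype=np.uint8)
img[1:5, 1:5] = 1      # a 4x4 block of foreground
print(erode3x3(img))   # erodes to the 2x2 interior
```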

Relevance: 40.00%

Abstract:

Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special-purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for complex algorithms such as the Low-Density Parity-Check (LDPC) decoders used in modern communication systems. High-performance computing currently offers a wide set of acceleration options, ranging from multicore CPUs to graphics processing units (GPUs) and FPGAs, and the ideal architecture can vary with the simulation requirements. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations and thereby significantly reduces architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, to map the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code has been used to target these three platforms. We show that, depending on the design parameters to be explored and on the dimension and phase of the design, the GPU or the FPGA may be the more convenient choice, providing different acceleration factors; for example, although simulations typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
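The exploration loop itself is conceptually simple, as the sketch below shows: every (bitwidth, representation, SNR) point dispatches the same kernel to whichever backend is in use. The simulate() stub and its toy return value are placeholders for the paper's LDPC decoder simulation, not its actual code.

```python
# Sketch of the multi-parametric design-space sweep the flow accelerates.
# Each point would launch the same OpenCL kernel on a CPU, GPU, or FPGA
# backend; simulate() is an illustrative stand-in, not the real simulator.
import itertools

def simulate(bitwidth: int, representation: str, snr_db: float) -> float:
    """Toy stand-in for one LDPC decoding run; a real flow would launch
    the shared OpenCL kernel and measure the bit error rate."""
    return 10 ** -(0.5 * bitwidth + snr_db)  # placeholder trend, not real data

bitwidths = [4, 5, 6, 8]            # fixed-point word lengths to explore
representations = ["fixed", "float"]
snrs_db = [1.0, 1.5, 2.0]

for bw, rep, snr in itertools.product(bitwidths, representations, snrs_db):
    ber = simulate(bw, rep, snr)    # same kernel source for every target
    print(f"bitwidth={bw:2d} repr={rep:5s} snr={snr} -> BER={ber:.2e}")
```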

Relevance: 40.00%

Abstract:

In this paper, we propose a system-level design approach to voltage over-scaling (VOS) that achieves error resiliency by applying unequal error protection to different computation elements, while incurring only minor quality degradation. Depending on user specifications and the severity of process variations and channel noise, the degree of VOS in each block of the system is adaptively tuned to ensure minimum system power while providing "just the right" amount of quality and robustness. This is achieved by taking block-level interactions into consideration and ensuring that, under any change of operating conditions, only the "less-crucial" computations, which contribute less to block and system output quality, are affected. The proposed approach applies unequal error protection to the various blocks of a system (logic and memory) and spans multiple layers of the design hierarchy (algorithm, architecture and circuit). When applied to a multimedia subsystem, the design methodology shows large power benefits (up to a 69% reduction in power consumption) at reasonable image quality while tolerating errors introduced by VOS, process variations and channel noise.
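A minimal sketch of the adaptive tuning idea, assuming a made-up quality-versus-voltage model: each block's supply is over-scaled only as far as its quality floor allows, so crucial blocks retain near-nominal voltage while less-crucial ones absorb the VOS error. Block names, floors and the roll-off model are illustrative, not the paper's characterization.

```python
# Sketch of adaptive per-block VOS tuning under unequal error protection.
# The quality model and block names are hypothetical stand-ins.

BLOCKS = {
    # block name: minimum acceptable output quality (0..1)
    "dct_high_freq": 0.80,   # less-crucial: tolerates more VOS error
    "dct_low_freq":  0.98,   # crucial: must stay near-nominal
}

def quality_at(voltage: float) -> float:
    """Hypothetical quality-vs-voltage model: quality rolls off linearly
    as the supply is over-scaled below the nominal 1.0 V."""
    return min(1.0, voltage)

def tune(blocks: dict, step: float = 0.05) -> dict:
    settings = {}
    for name, q_min in blocks.items():
        v = 1.0                                        # start at nominal voltage
        while v - step > 0 and quality_at(v - step) >= q_min:
            v -= step                                  # over-scale while quality holds
        settings[name] = round(v, 2)
    return settings

print(tune(BLOCKS))  # the crucial block ends up at a higher voltage
```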

Relevance: 40.00%

Abstract:

This paper reports the tracking system used to perform a scaled vehicle-barrier crash test. The scaled crash test was performed as part of a wider project aimed at designing a new safety barrier made of natural building materials, and served as a proof of concept for the new mass-based safety barriers. The study comprised two parts: the scaling technique and a series of scaled crash tests. The scaling method was used to 1) set the scaled-test impact velocity so that the energy dissipation and the momentum transfer from the car to the barrier are reproduced, and 2) predict the acceleration, velocity and displacement values occurring in the full-scale impact from the results obtained in a scaled test. To achieve this, the vehicle and barrier displacements had to be recorded together with the vehicle accelerations and angular velocities. These quantities were measured during the tests using acceleration sensors and a tracking system composed of a high-speed camera and a set of targets, from which the vehicle's linear and angular velocities were measured. Software was developed to extract the target velocities from the videos, and these were then compared with the velocities obtained by integrating the accelerations provided by the sensors, to check the reliability of the method.
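The cross-check in that last step can be sketched in a few lines of Python: integrate the accelerometer signal to a velocity history (trapezoidal rule) and compare it with the camera-tracked velocities. All numbers below are invented for illustration.

```python
# Cross-check between sensor-derived and camera-tracked velocities.
# Sample times, accelerations and the impact speed are made-up values.
import numpy as np

t = np.linspace(0.0, 0.1, 11)  # s: 11 samples spanning the impact
a = np.array([0, -5, -40, -90, -120, -110, -70, -30, -10, -2, 0], float)  # m/s^2

v0 = 8.33                      # m/s: assumed impact speed
dv = np.cumsum((a[1:] + a[:-1]) / 2.0 * np.diff(t))  # trapezoidal integration
v_from_accel = v0 + np.concatenate(([0.0], dv))

# Stand-in for the per-frame velocities extracted by the target tracker:
rng = np.random.default_rng(0)
v_from_tracking = v_from_accel + rng.normal(0.0, 0.05, t.size)

rms = np.sqrt(np.mean((v_from_accel - v_from_tracking) ** 2))
print(f"RMS discrepancy between methods: {rms:.3f} m/s")
```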

Relevance: 40.00%

Abstract:

Continuous research endeavors in hard turning (HT), on both machine tools and cutting tools, have made previously daunting limits readily attainable, presenting an opportunity for a systematic investigation into the limits currently attainable on a CNC turret lathe. Accordingly, this study contributes to the existing literature by providing the latest experimental results on hard turning of AISI 4340 steel (69 HRC) using a CBN cutting tool. An orthogonal array was developed from a set of judiciously chosen cutting parameters, and longitudinal turning trials were carried out in accordance with a full-factorial Taguchi matrix. A mirror-finished, optical-quality machined surface (an average surface roughness of 45 nm) was indeed achieved by this conventional cutting method. Furthermore, signal-to-noise (S/N) ratio analysis, analysis of variance (ANOVA) and multiple regression analysis were carried out on the experimental datasets to quantify the influence of each machining variable on the machined surface roughness and to optimize the machining parameters. One key finding was that at a very low feed rate (about 0.02 mm/rev), the feed rate alone can be the most significant parameter (99.16% contribution) influencing the machined surface roughness (Ra). However, a low feed rate was also shown to cause high tool wear, so the selection of hard-turning parameters must be governed by a trade-off between cost and quality considerations.
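For reference, the smaller-the-better S/N ratio used in such Taguchi analyses of surface roughness is S/N = -10 log10((1/n) * sum(y_i^2)). A short sketch, with roughness repeats invented solely to show the calculation:

```python
# Smaller-the-better Taguchi S/N ratio for surface roughness.
# The Ra values below are hypothetical repeats of one trial.
import math

def sn_smaller_is_better(values):
    """S/N = -10 * log10(mean of squared responses); larger S/N is better."""
    return -10.0 * math.log10(sum(y * y for y in values) / len(values))

ra_run = [0.047, 0.045, 0.049]  # um, hypothetical repeats
print(f"S/N = {sn_smaller_is_better(ra_run):.2f} dB")
```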

Relevance: 40.00%

Abstract:

Best concrete research paper by a student. Research has shown that the cost of managing structures puts a high strain on the infrastructure budget, with estimates that over 50% of the European construction budget is dedicated to repair and maintenance. If reinforced concrete structures are not suitably designed and adequately maintained, their service life is compromised and the full economic value of the investment is not realised. The issue is more prevalent in coastal structures as a result of combinations of aggressive actions, such as those caused by chlorides, sulphates and cyclic freezing and thawing.
It is common practice nowadays to ensure the durability of reinforced concrete structures by specifying a concrete mix and a nominal cover at the design stage to cater for the exposure environment. In theory, this should produce the performance required to achieve a specified service life. Although the European standard EN 206-1 specifies variations in the exposure environment, it does not take into account the macro- and micro-climates surrounding structures, which have a significant influence on their performance and service life. Therefore, in order to construct structures which will perform satisfactorily in different exposure environments, two aspects need to be developed: a performance-based specification to supplement EN 206-1, outlining the expected performance of the structure in a given environment; and a simple yet transferable procedure for assessing the performance of structures in service, termed KPI Theory. This will allow asset managers not only to design structures for the intended service life, but also to take informed maintenance decisions should the performance in service fall short of what was specified. This paper discusses these aspects further.

Relevance: 40.00%

Abstract:

This paper presents a surrogate-model-based optimization of a doubly-fed induction generator (DFIG) machine winding design for maximizing power yield. Based on site-specific wind profile data and the machine's previous operational performance, the DFIG's stator and rotor windings are optimized for rewinding purposes, so that maximum efficiency coincides with the actual operating conditions. Particle swarm optimization (PSO)-based surrogate optimization techniques are used in conjunction with the finite element method (FEM) to optimize the machine design using the limited information available on the site-specific wind profile and generator operating conditions. A response surface method is developed within the surrogate model to formulate the design objectives and constraints. The machine tests and efficiency calculations follow IEEE Standard 112-B. Numerical and experimental results validate the effectiveness of the proposed techniques.
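As an illustration of the optimization layer, here is a minimal PSO loop over a stand-in quadratic response surface; in the paper, each objective evaluation would instead query the surrogate fitted to FEM results. The bounds, coefficients and fictitious optimum (stator turns = 24, rotor turns = 36) are assumptions for the sketch, not the machine's actual winding data.

```python
# Minimal particle swarm optimization over a stand-in response surface.
import numpy as np

rng = np.random.default_rng(1)

def surrogate_loss(x):
    """Stand-in response surface to minimize; a real run would evaluate
    the fitted efficiency model instead of this quadratic."""
    return (x[0] - 24) ** 2 + 0.5 * (x[1] - 36) ** 2

lo, hi = np.array([10.0, 20.0]), np.array([40.0, 60.0])   # design bounds
n, iters, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5                # swarm settings

x = rng.uniform(lo, hi, (n, 2))   # particle positions (turn counts)
v = np.zeros_like(x)              # particle velocities
pbest, pbest_f = x.copy(), np.apply_along_axis(surrogate_loss, 1, x)
gbest = pbest[pbest_f.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)    # keep particles inside the design bounds
    f = np.apply_along_axis(surrogate_loss, 1, x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("best winding design (turns):", np.round(gbest, 1))
```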