973 results for multi-source harvesting


Relevance:

30.00%

Publisher:

Abstract:

An EMI filter design procedure for power converters is proposed. Based on a given noise spectrum, information about the converter noise source impedance, and design constraints, the design space of the input filter is defined. The design is based on component databases and detailed models of the filter components, including high-frequency parasitics, losses, weight, volume, etc. The design space is mapped onto a performance space in which different filter implementations are evaluated and compared. A multi-objective optimization approach is used to obtain optimal designs with respect to a given performance function.
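The mapping of a design space onto a performance space, followed by multi-objective selection, can be sketched with a toy Pareto-front computation. The candidate filters and the two objectives used here (boxed volume and attenuation shortfall, lower is better for both) are invented for illustration and are not from the paper.

```python
# Toy multi-objective selection: keep only Pareto-optimal filter designs.
# All component names and objective values below are hypothetical.

def pareto_front(designs):
    """Return designs not dominated in (volume, shortfall); lower is better in both."""
    front = []
    for d in designs:
        dominated = any(
            o["volume"] <= d["volume"] and o["shortfall"] <= d["shortfall"]
            and (o["volume"] < d["volume"] or o["shortfall"] < d["shortfall"])
            for o in designs
        )
        if not dominated:
            front.append(d)
    return front

candidates = [
    {"name": "LC-a", "volume": 120.0, "shortfall": 0.0},
    {"name": "LC-b", "volume": 80.0,  "shortfall": 3.0},
    {"name": "LC-c", "volume": 95.0,  "shortfall": 5.0},  # dominated by LC-b
    {"name": "LC-d", "volume": 60.0,  "shortfall": 9.0},
]
front = pareto_front(candidates)
print([d["name"] for d in front])
```

A scalar performance function, as in the paper, would then pick one design from this front according to the designer's weighting of the objectives.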

Relevance:

30.00%

Publisher:

Abstract:

Learning Objects facilitate reuse, leading to cost and time savings as well as to enhanced quality of educational resources. However, teachers find it difficult to create or to find high quality Learning Objects, and the ones they find often need to be customized. Teachers can overcome this problem by using suitable authoring systems that enable them to create high quality Learning Objects with little effort. This paper presents an open source online e-Learning authoring tool called ViSH Editor together with four novel interactive Learning Objects that can be created with it: Flashcards, Virtual Tours, Enriched Videos and Interactive Presentations. All these Learning Objects are created as web applications that can be accessed via mobile devices. In addition, they can be exported to SCORM, including their metadata in IEEE LOM format. All of them are described in the paper, including an example of each. This approach for creating Learning Objects was validated through two evaluations: a survey among authors and a formal quality evaluation of 209 Learning Objects created with the tool. The results show that ViSH Editor makes it easier for educators to create high quality Learning Objects.
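The idea of exporting metadata in IEEE LOM format can be pictured with a minimal sketch. The `lom_general` helper and the small element subset below are illustrative assumptions, not ViSH Editor's actual implementation or output.

```python
# Minimal sketch of wrapping Learning Object metadata in IEEE LOM-style XML.
# Only a fragment of the LOM "general" category is modelled; values are invented.
import xml.etree.ElementTree as ET

def lom_general(title, language, description):
    """Build a tiny LOM-like record with title, language and description."""
    lom = ET.Element("lom")
    general = ET.SubElement(lom, "general")
    t = ET.SubElement(ET.SubElement(general, "title"), "string")
    t.set("language", language)
    t.text = title
    ET.SubElement(general, "language").text = language
    d = ET.SubElement(ET.SubElement(general, "description"), "string")
    d.set("language", language)
    d.text = description
    return lom

record = lom_general("Photosynthesis flashcards", "en",
                     "A flashcard Learning Object on photosynthesis.")
xml_text = ET.tostring(record, encoding="unicode")
print(xml_text)
```

In a real SCORM package this record would sit alongside the content in the package manifest rather than being printed.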

Relevance:

30.00%

Publisher:

Abstract:

Traditional Text-To-Speech (TTS) systems have been developed using specially designed, non-expressive scripted recordings. In order to develop a new generation of expressive TTS systems in the Simple4All project, real recordings from the media should be used for training new voices with a whole new range of speaking styles. However, to process this more spontaneous material, the new systems must be able to deal with imperfect data (multi-speaker recordings, background and foreground music and noise), filtering out low-quality audio segments and creating mono-speaker clusters. In this paper we compare several architectures for combining speaker diarization with music and noise detection that improve the precision and overall quality of the segmentation.
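One simple way such a combination can work, sketched here purely as an illustration (it is not the paper's architecture), is to discard any diarized segment that overlaps a region flagged by a music/noise detector:

```python
# Combine diarization output with music/noise detection: keep only
# single-speaker segments that no detector flagged. Segments are
# (start, end, speaker) tuples in seconds; all data are invented.

def clean_segments(diarized, noisy_regions):
    """Drop diarized segments overlapping any detected music/noise region."""
    def overlaps(seg, region):
        return seg[0] < region[1] and region[0] < seg[1]
    return [seg for seg in diarized
            if not any(overlaps(seg, r) for r in noisy_regions)]

diarized = [(0.0, 4.0, "spk1"), (4.0, 9.0, "spk2"), (9.0, 12.0, "spk1")]
noisy = [(3.5, 5.0)]          # e.g. background music detected here
kept = clean_segments(diarized, noisy)
print(kept)   # both segments touching 3.5-5.0 s are discarded
```

The surviving segments could then be grouped per speaker into the mono-speaker clusters the abstract mentions.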

Relevance:

30.00%

Publisher:

Abstract:

The amount of genomic and proteomic data that is entered each day into databases and the experimental literature is outstripping the ability of experimental scientists to keep pace. While generic databases derived from automated curation efforts are useful, most biological scientists tend to focus on a class or family of molecules and their biological impact. Consequently, there is a need for molecular class-specific or other specialized databases. Such databases collect and organize data around a single topic or class of molecules. If curated well, such systems are extremely useful as they allow experimental scientists to obtain a large portion of the available data most relevant to their needs from a single source. We are involved in the development of two such databases with substantial pharmacological relevance. These are the GPCRDB and NucleaRDB information systems, which collect and disseminate data related to G protein-coupled receptors and intra-nuclear hormone receptors, respectively. The GPCRDB was a pilot project aimed at building a generic molecular class-specific database capable of dealing with highly heterogeneous data. A first version of the GPCRDB project has been completed and it is routinely used by thousands of scientists. The NucleaRDB was started recently as an application of the same concept, as a step towards generalizing this technology. The GPCRDB is available via the WWW at http://www.gpcr.org/7tm/ and the NucleaRDB at http://www.receptors.org/NR/.

Relevance:

30.00%

Publisher:

Abstract:

Due to the growing amount of data to be processed and the increasing need for high-performance computing, significant changes are taking place in computer architecture design. As a result, there has been a shift from the sequential to the parallel paradigm, with hundreds or thousands of processing cores on a single chip. In this context, power management becomes increasingly important, especially in embedded systems, which are usually battery-powered. According to Moore's Law, processor performance doubles every 18 months, but battery capacity doubles only every 10 years. This situation creates a huge gap, which can be mitigated by the use of heterogeneous multi-core architectures. A fundamental open challenge for these architectures is the integration of embedded software development, scheduling and power-management hardware. The overall goal of this doctoral work is to investigate techniques for optimizing the performance/energy-consumption trade-off in single-ISA heterogeneous multi-core architectures implemented on FPGAs. To this end, we sought solutions that achieve the best possible performance at an optimal energy consumption. This was done by combining data mining for the analysis of thread-based software with traditional power-management techniques, such as dynamic way-shutdown, and a new heterogeneity-aware scheduling policy. The main contributions are the combination of power-management techniques at several levels (hardware, scheduling and compilation), and a scheduling policy integrated with a multi-core architecture that is heterogeneous with respect to L1 cache size.
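A minimal sketch of what a heterogeneity-aware scheduling policy can look like, assuming (as in the thesis) single-ISA cores that differ in L1 cache size. The thread profiles, core parameters and the greedy pairing rule below are invented for illustration; they are not the thesis's actual policy.

```python
# Heterogeneity-aware placement sketch: map the most cache-sensitive threads
# (highest profiled miss rate) onto the cores with the largest L1 caches.
# Profiles and core parameters are hypothetical.

def schedule(threads, cores):
    """Greedily pair the most cache-sensitive threads with the largest-L1 cores."""
    by_sensitivity = sorted(threads, key=lambda t: t["miss_rate"], reverse=True)
    by_cache = sorted(cores, key=lambda c: c["l1_kb"], reverse=True)
    return {t["name"]: c["id"] for t, c in zip(by_sensitivity, by_cache)}

threads = [{"name": "fft",  "miss_rate": 0.12},
           {"name": "crc",  "miss_rate": 0.01},
           {"name": "sort", "miss_rate": 0.07}]
cores = [{"id": "core0", "l1_kb": 8},
         {"id": "core1", "l1_kb": 32},
         {"id": "core2", "l1_kb": 16}]
placement = schedule(threads, cores)
print(placement)
```

A real scheduler would refresh the profiles at run time and re-evaluate the placement, rather than deciding once.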

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

This article presents part of the findings of a multi-method study into employee perceptions of fairness in relation to the organisational career management (OCM) practices of a large financial retailer. It focuses on exploring how employees construct fairness judgements of their career experiences and the role played by the organisational context and, in particular, OCM practices in forming these judgements. It concludes that individuals can, and do, separate the source and content of (in)justice when it comes to evaluating these experiences. The relative roles of the employer, line manager and career development opportunities in influencing employee fairness evaluations are discussed. Conceptual links with organisational justice theory are proposed, and it is argued that the academic and practitioner populations are provided with empirical evidence for a new theoretical framework for evaluating employee perceptions of, and reactions to, OCM practices.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a generic strategic framework of alternative international marketing strategies and market segmentation based on intra- and inter-cultural behavioural homogeneity. Consumer involvement (CI) is proposed as a pivotal construct to capture behavioural homogeneity, for the identification of market segments. Results from a five-country study demonstrate how the strategic framework can be valuable in managerial decision-making. First, there is evidence for the cultural invariance of the measurement of CI, allowing a true comparison of inter- and intra-cultural behavioural homogeneity. Second, CI influences purchase behaviour, and its evaluation provides a rich source of information for responsive market segmentation. Finally, a decomposition of behavioural variance suggests that national-cultural environment and nationally transcendent variables explain differences in behaviour. The Behavioural Homogeneity Evaluation Framework therefore suggests appropriate international marketing strategies, providing practical guidance for implementing involvement-contingent strategies. © 2007 Academy of International Business. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The focus of this research was defined by a poorly characterised filtration train employed to clarify culture broth containing monoclonal antibodies secreted by GS-NSO cells: the filtration train blinded unpredictably, and the ability of the positively charged filters to adsorb DNA from process material was unknown. To direct the development of an assay to quantify the ability of depth filters to adsorb DNA, the molecular weight of DNA from a large-scale, fed-batch, mammalian cell culture vessel was evaluated as process material passed through the initial stages of the purification scheme. High molecular weight DNA was substantially cleared from the broth after passage through a disc stack centrifuge, and the remaining low molecular weight DNA was largely unaffected by passage through a series of depth filters and a sterilising grade membrane. Removal of high molecular weight DNA was shown to be coupled with clarification of the process stream. The DNA from cell culture supernatant showed a pattern of internucleosomal cleavage of chromatin when fractionated by electrophoresis, but the presence of both necrotic and apoptotic cells throughout the fermentation meant that the origin of the fragmented DNA could not be unequivocally determined. An intercalating fluorochrome, PicoGreen, was selected for development of a suitable DNA assay because of its ability to respond to low molecular weight DNA. It was assessed for its ability to determine the concentration of DNA in clarified mammalian cell culture broths containing pertinent monoclonal antibodies. Fluorescent signal suppression was ameliorated by sample dilution or by performing the assay above the pI of secreted IgG. The source of fluorescence in clarified culture broth was validated by incubation with RNase A and DNase I. At least 89.0% of the fluorescence was attributable to nucleic acid, and pre-digestion with RNase A was shown to be a requirement for successful quantification of DNA in such samples.
Application of the fluorescence-based assay resulted in characterisation of the physical parameters governing adsorption of DNA by various positively charged depth filters and membranes in test solutions, and of the DNA adsorption profile of the manufacturing-scale filtration train. Buffers that reduced or neutralised the depth filter or membrane charge, and those that impeded hydrophobic interactions, were shown to affect their operational capacity, demonstrating that DNA was adsorbed by a combination of electrostatic and hydrophobic interactions. Production-scale centrifugation of harvest broth containing therapeutic protein reduced the total DNA in the process stream from 79.8 μg ml-1 to 9.3 μg ml-1, whereas filtration changed the DNA concentration only marginally: from 6.3 μg ml-1 pre-filtration to 6.0 μg ml-1 post-filtration. Hence the filtration train was shown to be ineffective in DNA removal. Historically, blinding of the depth filters had been unpredictable, with data such as numbers of viable cells, non-viable cells, product titre, or process shape (batch, fed-batch, or draw and fill) failing to inform on the durability of depth filters in the harvest step. To investigate this, key fouling contaminants were identified by challenging depth filters with the same mass of one of the following: viable healthy cells, cells that had died by the process of apoptosis, and cells that had died through the process of necrosis. The pressure increase across a Cuno Zeta Plus 10SP depth filter was 2.8 and 16.5 times more sensitive to debris from apoptotic and necrotic cells respectively, when compared to viable cells. The condition of DNA released into the culture broth was assessed. Necrotic cells released predominantly high molecular weight DNA, in contrast to apoptotic cells, which released chiefly low molecular weight DNA.
The blinding of the filters was found to be largely unaffected by variations in the particle size distribution of material in, and viscosity of, the solutions with which they were challenged. The exceptional response of the depth filters to necrotic cells may explain the previously noted unpredictable filter blinding, whereby a number of necrotic cells have a more significant impact on the life of a depth filter than a similar number of viable or apoptotic cells. In a final set of experiments, the pressure drop caused by non-viable necrotic culture broths that had been treated with DNase I or benzonase was found to be smaller than that of untreated broths: the abilities of the enzyme-treated cultures to foul the depth filter were reduced by 70.4% and 75.4% respectively, indicating the importance of DNA in the blinding of the depth filter studied.
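The clearance figures reported above can be restated as percentage removals, a small calculation that makes the contrast between the centrifuge and the filtration train explicit:

```python
# Percentage DNA removal from the concentrations quoted in the abstract
# (79.8 -> 9.3 ug/ml across the centrifuge; 6.3 -> 6.0 ug/ml across filtration).

def percent_removal(before, after):
    """Percentage of material removed between two concentration measurements."""
    return 100.0 * (before - after) / before

centrifuge = percent_removal(79.8, 9.3)   # disc stack centrifuge
filtration = percent_removal(6.3, 6.0)    # depth filters + membrane
print(f"centrifuge: {centrifuge:.1f}% DNA removed")
print(f"filtration: {filtration:.1f}% DNA removed")
```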

Relevance:

30.00%

Publisher:

Abstract:

Digestate from the anaerobic digestion conversion process is widely used as a farm land fertiliser. This study proposes an alternative use as a source of energy. Dried digestate was pyrolysed and the resulting oil was blended with waste cooking oil and butanol (10, 20 and 30 vol.%). The physical and chemical properties of the pyrolysis oil blends were measured and compared with pure fossil diesel and waste cooking oil. The blends were tested in a multi-cylinder indirect injection compression ignition engine. Engine combustion, exhaust gas emissions and performance parameters were measured and compared with pure fossil diesel operation. The ASTM copper corrosion values for the 20% and 30% pyrolysis blends were 2c, compared to 1b for fossil diesel. The kinematic viscosities of the blends at 40 °C were 5–7 times higher than that of fossil diesel. Digestate pyrolysis oil blends produced lower in-cylinder peak pressures than fossil diesel and waste cooking oil operation. The maximum heat release rates of the blends were approximately 8% higher than with fossil diesel. The ignition delay periods of the blends were longer; pyrolysis oil blends started to combust late and, once combustion started, burnt quicker than fossil diesel. The total burning durations of the 20% and 30% blends were decreased by 12% and 3% compared to fossil diesel. At full engine load, the brake thermal efficiencies of the blends were decreased by about 3–7% when compared to fossil diesel. The pyrolysis blends gave lower smoke levels; at full engine load, the smoke level of the 20% blend was 44% lower than with fossil diesel. In comparison to fossil diesel and at full load, the brake specific fuel consumptions (wt.) of the 30% and 20% blends were approximately 32% and 15% higher.
At full engine load, the CO emissions of the 20% and 30% blends were decreased by 39% and 66% with respect to fossil diesel. The blends' CO2 emissions were similar to those of fossil diesel; at full engine load, the 30% blend produced approximately 5% higher CO2 emissions than fossil diesel. The study concludes that, on the basis of short-term engine experiments, blends of up to 30% pyrolysis oil from digestate of arable crops can be used in a compression ignition engine.

Relevance:

30.00%

Publisher:

Abstract:

VSC converters are becoming more prevalent for HVDC applications. Two circuits are commercially available at present: a traditional six-switch PWM inverter implemented using series-connected IGBTs - ABB's HVDC Light® - and a modular multi-level converter (MMC) - Siemens' HVDC-PLUS. This paper presents an alternative MMC topology, which utilises a novel current injection technique and exhibits several desirable characteristics.

Relevance:

30.00%

Publisher:

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT scan or DEXA scan for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a ‘visual aspect’ to BIA, volunteers were scanned in 3D using an inexpensive scattered-light gadget (Xbox Kinect controller) and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion; these could be adapted to further improve the accuracy of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition.
In particular, bio-impedance could be used to predict limb volumes, which would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
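The idea behind a three-stage filtering scheme for heart-rate extraction can be sketched in pure Python. The synthetic signal, the moving-average filters and the peak threshold below are illustrative assumptions, not the thesis's actual filter design.

```python
# Three-stage sketch: (1) detrend a drifting bio-impedance-like signal,
# (2) smooth it, (3) count cardiac peaks to estimate heart rate.
import math

fs = 100                                   # sample rate, Hz (assumed)
duration_s = 10
n = fs * duration_s
hr_hz = 1.2                                # 72 bpm cardiac component
signal = [0.5 * t / fs                     # slow baseline drift
          + 0.1 * math.sin(2 * math.pi * hr_hz * t / fs)
          for t in range(n)]

def moving_average(x, w):
    """Trailing moving average with a shrinking window at the start."""
    return [sum(x[max(0, i - w): i + 1]) / (i - max(0, i - w) + 1)
            for i in range(len(x))]

detrended = [s - m for s, m in zip(signal, moving_average(signal, fs))]  # stage 1
smoothed = moving_average(detrended, 5)                                  # stage 2
peaks = [i for i in range(1, n - 1)                                      # stage 3
         if smoothed[i - 1] < smoothed[i] > smoothed[i + 1]
         and smoothed[i] > 0.05]
bpm = 60.0 * len(peaks) / duration_s
print(f"estimated heart rate: {bpm:.0f} bpm")
```

On real recordings the detrending and smoothing stages would be replaced by properly designed digital filters, but the pipeline shape is the same.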

Relevance:

30.00%

Publisher:

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014

Relevance:

30.00%

Publisher:

Abstract:

Although maximum power point tracking (MPPT) is crucial in the design of a wind power generation system, control strategies are also needed for conditions that require a power reduction, called de-loading in this paper. A coordinated control scheme for a proposed current source converter (CSC) based DC wind energy conversion system is presented in this paper. This scheme combines coordinated control of the pitch angle, a DC load-dumping chopper and the DC/DC converter to quickly achieve wind farm de-loading. MATLAB/Simulink simulations and experiments, both at the same power level, are used to validate the purpose and effectiveness of the control scheme. © 2013 IEEE.
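The coordination idea can be caricatured with a toy dispatch function. This is an assumption about how such a scheme might split the duty, not the paper's actual controller: reduce capture via pitch first, then dump any remaining surplus into the chopper resistor.

```python
# Toy coordinated de-loading dispatch: pitch absorbs the surplus up to its
# authority; the DC dumping chopper takes the remainder. All figures invented.

def de_load(available_kw, setpoint_kw, max_pitch_reduction_kw):
    """Split the required power reduction between pitch control and the chopper."""
    surplus = max(0.0, available_kw - setpoint_kw)
    pitch_kw = min(surplus, max_pitch_reduction_kw)
    chopper_kw = surplus - pitch_kw
    return {"pitch_kw": pitch_kw, "chopper_kw": chopper_kw}

split = de_load(available_kw=2000.0, setpoint_kw=1500.0,
                max_pitch_reduction_kw=300.0)
print(split)
```

In practice the chopper acts fastest and the pitch actuator catches up, so the real split is dynamic rather than static as here.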

Relevance:

30.00%

Publisher:

Abstract:

Removal of dissolved salts and toxic chemicals from water, especially at levels of a few parts per million (ppm), is one of the most difficult water treatment problems. There are several methods used for water purification. The choice of method depends mainly on the level of feed water salinity, the source of energy and the type of contaminants present. Distillation is an age-old method that can remove all types of dissolved impurities from contaminated water. In multiple effect distillation (MED), the latent heat of steam is recycled several times to produce many units of distilled water from one unit of primary steam input. This is already used in large-capacity plants for treating sea water. But the challenge lies in designing a system for small-scale operations that can treat a few cubic metres of water per day, especially suitable for rural communities where the available water is brackish. A small-scale MED unit with an extendable number of effects has been designed and analysed for optimum yield in terms of total distillate produced. © 2010 Elsevier B.V.
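Why reusing latent heat multiplies the yield can be shown with a simple geometric-series model of the gain output ratio (GOR). The 0.85 per-effect efficiency is an assumed illustrative figure, not from the paper.

```python
# Back-of-envelope MED yield model: each effect recovers a fraction of the
# previous effect's latent heat, so the GOR is a geometric series in the
# number of effects. The per-effect efficiency is a hypothetical figure.

def distillate_per_day(steam_kg_per_day, effects, per_effect_eff=0.85):
    """Total distillate if each effect recovers per_effect_eff of the previous
    effect's latent heat (simple geometric-series model of the GOR)."""
    gor = sum(per_effect_eff ** i for i in range(effects))
    return steam_kg_per_day * gor

one_effect = distillate_per_day(1000, 1)    # plain single-effect distillation
four_effects = distillate_per_day(1000, 4)  # four-effect MED
print(f"1 effect:  {one_effect:.0f} kg/day of distillate")
print(f"4 effects: {four_effects:.0f} kg/day of distillate")
```

Under this model, adding effects to the extendable unit raises yield with diminishing returns, which is why the design is analysed for optimum total distillate rather than maximum effect count.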