922 results for "high power drives for trolleybus systems"


Relevance:

40.00%

Publisher:

Abstract:

It has recently been found that a number of systems displaying crackling noise also show remarkable behavior regarding the temporal occurrence of successive events versus their size: a scaling law for the probability distributions of waiting times as a function of a minimum size is fulfilled, signaling the existence in those systems of self-similarity in time and size. This property is also present in some non-crackling systems. Here, the uncommon character of the scaling law is illustrated with simple marked renewal processes, built by definition with no correlations. Whereas processes with a finite mean waiting time do not fulfill a scaling law in general and tend towards a Poisson process in the limit of very large sizes, processes without a finite mean tend to another class of distributions, characterized by double power-law waiting-time densities. This is somewhat reminiscent of the generalized central limit theorem. A model with short-range correlations is not able to escape the attraction of those limit distributions. A discussion of open problems in the modeling of these properties is provided.
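A minimal simulation can make the contrast concrete. The sketch below (hypothetical parameters, not from the paper) builds a marked renewal process with uniform waiting times and independent power-law sizes; thresholding on a minimum size thins the process, and as the threshold grows the waiting-time distribution drifts towards the exponential of a Poisson process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Marked renewal process (hypothetical parameters): i.i.d. waiting times with
# finite mean, and i.i.d. power-law sizes attached -- no correlations by design.
n = 200_000
waits = 2.0 * rng.random(n)                    # uniform inter-event times, mean 1
sizes = (1.0 - rng.random(n)) ** (-1.0 / 1.5)  # Pareto marks, P(S > s) ~ s**-1.5

def waiting_times_above(s_min):
    """Waiting times between consecutive events whose size is >= s_min."""
    t = np.cumsum(waits)                       # event times of the full process
    return np.diff(t[sizes >= s_min])

# Thinning by an i.i.d. mark keeps the process renewal; as s_min grows, the
# coefficient of variation of the waiting times drifts towards 1, the
# exponential (Poisson) limit the abstract describes for finite-mean processes.
for s_min in (1.0, 3.0, 10.0):
    w = waiting_times_above(s_min)
    print(f"s_min={s_min:5.1f}  mean={w.mean():7.2f}  cv={w.std() / w.mean():.2f}")
```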


The report presents a grammar capable of analyzing the process of production of electricity in modular elements for different power-supply systems, defined using semantic and formal categories. In this way it becomes possible to identify similarities and differences in the process of production of electricity, and then measure and compare “apples” with “apples” and “oranges” with “oranges”. For instance, when comparing the various unit operations of the process of production of electricity from nuclear energy with the analogous unit operations of the process of production from fossil energy, we see that the various phases of the process are the same. The only difference relates to the characteristics of the process associated with the generation of heat, which are completely different in the two systems. As a matter of fact, the performance of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. By adopting this approach, it becomes possible to compare the performance of the two power-supply systems by comparing their relative biophysical requirements for the phases that nuclear energy power plants and fossil energy power plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. This report presents the evaluation of the biophysical requirements for the two power-supply systems: nuclear energy and fossil energy. In particular, the report focuses on the following requirements: (i) electricity; (ii) fossil fuels; (iii) labor; and (iv) materials.


Electromagnetic radiation at terahertz frequencies (from 0.1 THz to 10 THz) lies in the frequency band between the optical band and the radio band. The interest of the scientific community in this band has grown due to its large potential for developing innovative imaging systems. Terahertz waves can generate extremely short pulses that achieve good spatial resolution and good penetration, and allow microscopic structures to be identified using spectral analysis. The work carried out during the period of the grant has been based on the development of systems working in the aforementioned frequency band. The main system is a total-power radiometer working at 0.1 THz to perform security imaging. The development of this system has also been useful for gaining knowledge of the behavior of the component subsystems in this frequency band. Moreover, a vectorial network analyzer has been used to characterize materials and perform active raster imaging. A materials-measurement system has been designed and used to measure material properties such as permittivity, losses, and water concentration. Finally, the design of a terahertz time-domain spectroscopy (THz-TDS) system has been started. This system will make it possible to perform tomographic measurements with very high penetration resolution while allowing the spectral characterization of the sample material. The application range of this kind of system is very wide: from the identification of cancerous skin tissue to the characterization of the thickness of a painted car surface.


This Technical Report presents a tentative protocol for assessing the viability of power-supply systems. The viability of power-supply systems can be assessed by looking at the production factors (e.g. paid labor, power capacity, fossil fuels) – needed for the system to operate and maintain itself – in relation to the internal constraints set by the energetic metabolism of societies. In fact, by using this protocol it becomes possible to link assessments of technical coefficients performed at the level of the power-supply systems with assessments of benchmark values performed at the societal level across the relevant sectors. In particular, the example provided here for France for the year 2009 shows that nuclear energy is not viable in terms of labor requirements (both direct and indirect inputs) or in terms of requirements of power capacity, especially when reprocessing operations are included.


This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending end and the sequence model of the network. Since calculating the series impedance of underground cables is not as simple as for overhead lines, the paper proposes a methodology to estimate the zero-sequence impedance of underground cables starting from previous single-phase faults that occurred in the system, in which an electric arc appeared at the fault location. For this reason, the signal is first pretreated to eliminate voltage peaks, so that the analysis can be carried out on a signal as close to a sine wave as possible.
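As a hedged illustration of the kind of single-phase fault location this work builds on, the sketch below applies the standard loop-impedance (residual-compensation) method with hypothetical per-km sequence impedances; the paper's actual contribution, estimating the cable zero-sequence impedance from past arc faults, is not reproduced here:

```python
import numpy as np

# Hypothetical per-km sequence impedances of an overhead line section.
z1 = 0.12 + 0.40j            # positive-sequence impedance, ohm/km (assumed)
z0 = 0.35 + 1.20j            # zero-sequence impedance, ohm/km (assumed)
k0 = (z0 - z1) / (3 * z1)    # residual compensation factor

def fault_distance(Va, Ia, I0):
    """Loop-impedance estimate of the distance (km) to a phase-a-to-ground
    fault from sending-end phasors, neglecting fault resistance and load."""
    z_loop = Va / (Ia + 3 * k0 * I0)
    return (z_loop / z1).real

# Synthetic check: place a bolted fault at 7.5 km and recover the distance.
d_true = 7.5
If = 800.0 * np.exp(-1j * np.deg2rad(60))    # assumed fault-current phasor, A
I0 = If / 3                                  # zero-sequence current of an SLG fault
Va = d_true * z1 * (If + 3 * k0 * I0)        # consistent sending-end voltage
print(round(fault_distance(Va, If, I0), 3))  # → 7.5
```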


Throughout the history of Electrical Engineering education, vectorial and phasorial diagrams have been used as a fundamental learning tool. At present, computational power has replaced them with long data lists, the result of solving equation systems by means of numerical methods. Diagrams have thus been pushed into the academic background and, although explained in theory, they are not used in a practical way within specific examples. This may work against students' understanding of the complex behavior of electrical power systems. This article proposes a modification of the classical Perrine-Baum diagram construction to allow both a more practical representation and a better understanding of the behavior of a high-voltage electric line under different levels of load. This modification allows, at the same time, forecasting of the line's loading capacity and the obsolescence of this behavior. In addition, we evaluate the impact of this tool on the learning process, showing comparative undergraduate results over three academic years.
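The quantities such a diagram plots can be computed directly. The following sketch (assumed line constants, not the article's data) evaluates sending-end voltage phasors of a nominal-pi line model at several load levels, the points a Perrine-Baum style construction traces:

```python
import numpy as np

# Assumed data for a medium-length high-voltage line (nominal-pi model).
Z = 10 + 40j                     # total series impedance, ohm
Y = 3e-4j                        # total shunt admittance, S
A = 1 + Z * Y / 2                # ABCD constants of the nominal pi
B = Z

Vr = 220e3 / np.sqrt(3)          # receiving-end phase voltage, V
for P_mw in (0, 50, 100, 150):   # receiving-end load at unity power factor
    Ir = P_mw * 1e6 / (3 * Vr)   # per-phase current, in phase with Vr
    Vs = A * Vr + B * Ir         # sending-end phasor: one point of the diagram
    reg = (abs(Vs) / abs(A) - Vr) / Vr * 100
    print(f"P={P_mw:3d} MW  |Vs|={abs(Vs) / 1e3:6.1f} kV  regulation={reg:5.2f}%")
```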


The analysis of multiantenna capacity in the high-SNR regime has hitherto focused on the high-SNR slope (or maximum multiplexing gain), which quantifies the multiplicative capacity increase as a function of the number of antennas. This traditional characterization is unable to assess the impact of prominent channel features since, for a majority of channels, the slope equals the minimum of the number of transmit and receive antennas. Furthermore, a characterization based solely on the slope captures only the scaling and has no notion of the power required for a certain capacity. This paper advocates a more refined characterization whereby, as a function of the SNR in dB, the high-SNR capacity is expanded as an affine function where the impact of channel features such as antenna correlation, unfaded components, etc., resides in the zero-order term or power offset. The power offset, for which we find insightful closed-form expressions, is shown to play a chief role at SNR levels of practical interest.
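The affine expansion can be checked numerically. The sketch below (a generic Monte Carlo, not the paper's closed-form analysis) estimates the high-SNR slope and power offset of a 2x2 i.i.d. Rayleigh channel from two high-SNR capacity evaluations, using C ≈ S∞ (SNR|dB / 3.01 − L∞):

```python
import numpy as np

rng = np.random.default_rng(1)

def ergodic_capacity(snr_db, nt=2, nr=2, trials=4000):
    """Monte Carlo ergodic capacity (bits/s/Hz) of an i.i.d. Rayleigh MIMO
    channel with equal power per transmit antenna."""
    snr = 10 ** (snr_db / 10)
    c = 0.0
    for _ in range(trials):
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        c += np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * (H @ H.conj().T)).real)
    return c / trials

# Affine high-SNR expansion C ≈ S_inf * (SNR|dB / 3.01 - L_inf): estimate the
# slope S_inf and the power offset L_inf from two high-SNR points.
three_db = 10 * np.log10(2)                      # ≈ 3.01 dB
c30, c40 = ergodic_capacity(30), ergodic_capacity(40)
S_inf = (c40 - c30) / (10 / three_db)            # bits/s/Hz per 3 dB
L_inf = 30 / three_db - c30 / S_inf              # power offset in 3-dB units
print(f"slope ≈ {S_inf:.2f}, power offset ≈ {L_inf:.2f} (3-dB units)")
```

For a 2x2 channel the slope should come out close to min(nt, nr) = 2, while the power offset absorbs the channel-dependent constant the abstract highlights.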


The MAGIC collaboration has searched for high-energy gamma-ray emission from some of the most promising pulsar candidates above an energy threshold of 50 GeV, an energy not previously reachable by other ground-based instruments. Neither pulsed nor steady gamma-ray emission has been observed at energies of 100 GeV from the classical radio pulsars PSR J0205+6449 and PSR J2229+6114 (and their nebulae 3C58 and Boomerang, respectively) or from the millisecond pulsar PSR J0218+4232. Here, we present the flux upper limits for these sources and discuss their implications in the context of current model predictions.


No-tillage systems, associated with black oat as the preceding cover crop, have been increasingly adopted. This has motivated anticipating maize nitrogen fertilization, transferring it from side-dressing at the stage when plants have five to six expanded leaves to the time when the preceding cover crop is eliminated or to maize sowing. This study was conducted to evaluate the effects of the soil tillage system and the timing of N fertilization on maize grain yield and the agronomic efficiency of N applied to a soil with high organic matter content. A three-year field experiment was conducted in Lages, state of Santa Catarina, from 1999 onwards. Treatments were set up in a split-plot arrangement. Two soil tillage systems were tested in the main plots: conventional tillage (CT) and no-tillage (NT). Six N management systems were assessed in the split-plots: S1 - control, without N application; S2 - all N (100 kg ha-1) applied at oat desiccation; S3 - all N applied at maize sowing; S4 - all N side-dressed when maize had five expanded leaves (V5 growth stage); S5 - 1/3 of the N rate applied at maize sowing and 2/3 at V5; S6 - 2/3 of the N rate applied at maize sowing and 1/3 at V5. Maize response to the timing and splitting of N was not affected by the soil tillage system. Grain yield ranged from 6.0 to 11.8 t ha-1. Anticipating N application (S2 and S3) decreased grain yield in two of three years. In the experiment's rainiest early spring season (2000/2001), S4 gave a yield advantage of 2.2 t ha-1 over S2 and S3. Application of the total N rate before or at sowing decreased the number of kernels produced per ear in 2000/2001 and 2001/2002 and the number of ears produced per area in 2001/2002, resulting in reduced grain yield. The agronomic efficiency of applied N (kg grain increase per kg of N applied) ranged from 13.9 to 38.8 and was always higher in S4 than in the S2 and S3 systems. Short-term N immobilization did not reduce grain yield when no N was applied before or at maize sowing in a soil with high organic matter content, regardless of the soil tillage system.
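The agronomic-efficiency figure is a simple ratio. The sketch below shows the computation with illustrative yields (not the experiment's plot data) at the study's 100 kg ha-1 N rate:

```python
# Agronomic efficiency as defined in the study: kg of grain increase per kg
# of N applied. Yields below are illustrative, not the experiment's data.

def agronomic_efficiency(yield_n_t_ha, yield_control_t_ha, n_rate_kg_ha=100):
    """(fertilized yield - control yield), t/ha converted to kg, per kg N."""
    return (yield_n_t_ha - yield_control_t_ha) * 1000 / n_rate_kg_ha

# E.g. a side-dressed treatment at 9.8 t/ha against a 6.0 t/ha control:
print(round(agronomic_efficiency(9.8, 6.0), 1))  # → 38.0
```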


The aim of this brief is to present an original design methodology that permits implementing latch-up-free smart power circuits on a very simple, cost-effective technology. The basic concept used for this purpose is letting the wells of the MOS transistors most susceptible to initiating latch-up float.


The aim of the present study was to establish and compare the durations of the seminiferous epithelium cycles of the common shrew Sorex araneus, which is characterized by a high metabolic rate and multiple paternity, and the greater white-toothed shrew Crocidura russula, which is characterized by a low metabolic rate and a monogamous mating system. Twelve S. araneus males and fifteen C. russula males were injected intraperitoneally with 5-bromodeoxyuridine, and the testes were collected. For cycle length determinations, we applied the classical method of estimation and, as a new method, linear regression. With regard to variance, and even with a relatively small sample size, the new method appears to be more precise. In addition, the regression method allows information to be inferred for every animal tested, enabling comparisons of different factors with cycle lengths. Our results show that increased testis size not only leads to increased sperm production but also reduces the duration of spermatogenesis. The calculated cycle lengths were 8.35 days for S. araneus and 12.12 days for C. russula. The data obtained in the present study provide the basis for future investigations into the effects of metabolic rate and mating systems on the speed of spermatogenesis.
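The regression method reduces to fitting labelled-cell progress against time. The sketch below uses hypothetical staging data (chosen only to land near the reported S. araneus value) to show how a cycle length is obtained from the fitted slope:

```python
import numpy as np

# Hypothetical data: after a BrdU pulse, the most advanced labelled cells are
# staged at several times post-injection; progress through the cycle (% of one
# cycle) is regressed on time, and cycle length = 100 / slope.
t_days = np.array([0.5, 2.0, 4.0, 6.0])       # times after BrdU injection, days
percent = np.array([6.0, 24.0, 47.9, 71.9])   # % of one cycle traversed (assumed)

slope, intercept = np.polyfit(t_days, percent, 1)  # least-squares line
cycle_length = 100.0 / slope                       # days per full cycle
print(f"estimated cycle length: {cycle_length:.2f} days")
```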


High availability goals and ways to reach them, from a GNU/Linux point of view.


Summary This dissertation explores how stakeholder dialogue influences corporate processes, and speculates about the potential of this phenomenon - particularly with actors, like non-governmental organizations (NGOs) and other representatives of civil society, which have received growing attention against a backdrop of increasing globalisation and which have often been cast in an adversarial light by firms - as a source of learning and a spark for innovation in the firm. The study is set within the context of the introduction of genetically-modified organisms (GMOs) in Europe. Its significance lies in the fact that scientific developments and new technologies are being generated at an unprecedented rate in an era where civil society is becoming more informed, more reflexive, and more active in facilitating or blocking such new developments, which could have the potential to trigger widespread changes in economies, attitudes, and lifestyles, and address global problems like poverty, hunger, climate change, and environmental degradation. In the 1990s, companies using biotechnology to develop and offer novel products began to experience increasing pressure from civil society to disclose information about the risks associated with the use of biotechnology and GMOs in particular. Although no harmful effects for humans or the environment have been factually demonstrated even to date (2008), this technology remains highly contested and its introduction in Europe catalysed major companies to invest significant financial and human resources in stakeholder dialogue. A relatively new phenomenon at the time, with little theoretical backing, dialogue was seen to reflect a move towards greater engagement with stakeholders, commonly defined as those "individuals or groups with which business interacts who have a 'stake', or vested interest in the firm" (Carroll, 1993:22) with whom firms are seen to be inextricably embedded (Andriof & Waddock, 2002).
Regarding the organisation of this dissertation, Chapter 1 (Introduction) describes the context of the study and elaborates its significance for academics and business practitioners as an empirical work embedded in a sector at the heart of the debate on corporate social responsibility (CSR). Chapter 2 (Literature Review) traces the roots and evolution of CSR, drawing on Stakeholder Theory, Institutional Theory, Resource Dependence Theory, and Organisational Learning to establish what has already been developed in the literature regarding the stakeholder concept, motivations for engagement with stakeholders, the corporate response to external constituencies, and outcomes for the firm in terms of organisational learning and change. I used this review of the literature to guide my inquiry and to develop the key constructs through which I viewed the empirical data that was gathered. In this respect, concepts related to how the firm views itself (as a victim, follower, leader), how stakeholders are viewed (as a source of pressure and/or threat; as an asset: current and future), corporate responses (in the form of buffering, bridging, boundary redefinition), and types of organisational learning (single-loop, double-loop, triple-loop) and change (first order, second order, third order) were particularly important in building the key constructs of the conceptual model that emerged from the analysis of the data. Chapter 3 (Methodology) describes the methodology that was used to conduct the study, affirms the appropriateness of the case study method in addressing the research question, and describes the procedures for collecting and analysing the data. Data collection took place in two phases - extending from August 1999 to October 2000, and from May to December 2001 - which functioned as 'snapshots' in time of the three companies under study.
The data was systematically analysed and coded using ATLAS/ti, a qualitative data analysis tool, which enabled me to sort, organise, and reduce the data into a manageable form. Chapter 4 (Data Analysis) contains the three cases that were developed (anonymised as Pioneer, Helvetica, and Viking). Each case is presented in its entirety (constituting a 'within case' analysis), followed by a 'cross-case' analysis, backed up by extensive verbatim evidence. Chapter 5 presents the research findings, outlines the study's limitations, describes managerial implications, and offers suggestions for where more research could elaborate the conceptual model developed through this study, as well as suggestions for additional research in areas where managerial implications were outlined. References and Appendices are included at the end. This dissertation results in the construction and description of a conceptual model, grounded in the empirical data and tied to existing literature, which portrays a set of elements and relationships deemed important for understanding the impact of stakeholder engagement for firms in terms of organisational learning and change. This model suggests that corporate perceptions about the nature of stakeholders influence the perceived value of stakeholder contributions. When stakeholders are primarily viewed as a source of pressure or threat, firms tend to adopt a reactive/defensive posture in an effort to manage stakeholders and protect the firm from sources of outside pressure - behaviour consistent with Resource Dependence Theory, which suggests that firms try to get control over external threats by focussing on the relevant stakeholders on whom they depend for critical resources, and try to reverse the control potentially exerted by external constituencies by trying to influence and manipulate these valuable stakeholders.
In situations where stakeholders are viewed as a current strategic asset, firms tend to adopt a proactive/offensive posture in an effort to tap stakeholder contributions and connect the organisation to its environment - behaviour consistent with Institutional Theory, which suggests that firms try to ensure the continuing license to operate by internalising external expectations. In instances where stakeholders are viewed as a source of future value, firms tend to adopt an interactive/innovative posture in an effort to reduce or widen the embedded system and bring stakeholders into systems of innovation and feedback - behaviour consistent with the literature on Organisational Learning, which suggests that firms can learn how to optimize their performance as they develop systems and structures that are more adaptable and responsive to change. The conceptual model moreover suggests that the perceived value of stakeholder contribution drives corporate aims for engagement, which can be usefully categorised as dialogue intentions spanning a continuum running from low-level to high-level to very-high-level. This study suggests that activities aimed at disarming critical stakeholders ('manipulation'), providing guidance and correcting misinformation ('education'), being transparent about corporate activities and policies ('information'), alleviating stakeholder concerns ('placation'), and accessing stakeholder opinion ('consultation') represent low-level dialogue intentions and are experienced by stakeholders as asymmetrical, persuasive, compliance-gaining activities that are not in line with 'true' dialogue. This study also finds evidence that activities aimed at redistributing power ('partnership'), involving stakeholders in internal corporate processes ('participation'), and demonstrating corporate responsibility ('stewardship') reflect high-level dialogue intentions.
This study additionally finds evidence that building and sustaining high-quality, trusted relationships which can meaningfully influence organisational policies inclines a firm towards the type of interactive, proactive processes that underpin the development of sustainable corporate strategies. Dialogue intentions are related to the type of corporate response: low-level intentions can lead to buffering strategies; high-level intentions can underpin bridging strategies; very high-level intentions can incline a firm towards boundary redefinition. The nature of corporate response (which encapsulates a firm's posture towards stakeholders, demonstrated by the level of dialogue intention and the firm's strategy for dealing with stakeholders) favours the type of learning and change experienced by the organisation. This study indicates that buffering strategies, where the firm attempts to protect itself against external influences and carry out its existing strategy, typically lead to single-loop learning, whereby the firm learns how to perform better within its existing paradigm and, at most, improves the performance of the established system - an outcome associated with first-order change. Bridging responses, where the firm adapts organisational activities to meet external expectations, typically lead a firm to acquire new behavioural capacities characteristic of double-loop learning, whereby insights and understanding are uncovered that are fundamentally different from existing knowledge and where stakeholders are brought into problem-solving conversations that enable them to influence corporate decision-making to address shortcomings in the system - an outcome associated with second-order change.
Boundary redefinition suggests that the firm engages in triple-loop learning, where the firm changes relations with stakeholders in profound ways, considers problems from a whole-system perspective, examines the deep structures that sustain the system, and produces innovation to address chronic problems and develop new opportunities - an outcome associated with third-order change. This study supports earlier theoretical and empirical studies (e.g. Weick's (1979, 1985) work on self-enactment; Maitlis & Lawrence's (2007), Maitlis' (2005), and Weick et al.'s (2005) work on sensegiving and sensemaking in organisations; Brickson's (2005, 2007) and Scott & Lane's (2000) work on organisational identity orientation), which indicate that corporate self-perception is a key underlying factor driving the dynamics of organisational learning and change. Such theorizing has important implications for managerial practice; namely, that a company which perceives itself as a 'victim' may be highly inclined to view stakeholders as a source of negative influence, and would therefore be potentially unable to benefit from the positive influence of engagement. Such a self-perception can blind the firm from seeing stakeholders in a more positive, contributing light, which suggests that such firms may not be inclined to embrace external sources of innovation and learning, as they are focussed on protecting the firm against disturbing environmental influences (through buffering), and remain more likely to perform better within an existing paradigm (single-loop learning). By contrast, a company that perceives itself as a 'leader' may be highly inclined to view stakeholders as a source of positive influence.
On the downside, such a firm might have difficulty distinguishing when stakeholder contributions are less pertinent, as it is deliberately more open to elements in its operating environment (including stakeholders) as potential sources of learning and change, being oriented towards creating space for fundamental change (through boundary redefinition), opening issues to entirely new ways of thinking, and addressing issues from a whole-system perspective. A significant implication of this study is that potentially only those companies that see themselves as leaders are ultimately able to tap the innovation potential of stakeholder dialogue.


In this paper, we investigate the average and outage performance of spatial multiplexing multiple-input multiple-output (MIMO) systems with channel state information at both sides of the link. Such systems result, for example, from exploiting the channel eigenmodes in multiantenna systems. Due to the complexity of obtaining the exact expression for the average bit error rate (BER) and the outage probability, we derive approximations in the high signal-to-noise ratio (SNR) regime assuming an uncorrelated Rayleigh flat-fading channel. More exactly, capitalizing on previous work by Wang and Giannakis, the average BER and outage probability versus SNR curves of spatial multiplexing MIMO systems are characterized in terms of two key parameters: the array gain and the diversity gain. Finally, these results are applied to analyze the performance of a variety of linear MIMO transceiver designs available in the literature.
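The two parameters can be read off a simulated BER curve. The sketch below (a generic BPSK Monte Carlo with maximal-ratio combining, not the paper's transceiver designs) estimates the diversity gain as the log-log slope of the BER-versus-SNR curve, which should approach the number of combining branches:

```python
import numpy as np

rng = np.random.default_rng(2)

def ber_mrc(snr_db, nr=2, nbits=600_000):
    """Monte Carlo BER of BPSK with nr-branch maximal-ratio combining over
    i.i.d. Rayleigh fading (a generic stand-in for one spatial substream)."""
    snr = 10 ** (snr_db / 10)
    h = (rng.standard_normal((nbits, nr)) + 1j * rng.standard_normal((nbits, nr))) / np.sqrt(2)
    g = np.sum(np.abs(h) ** 2, axis=1)                     # post-combining gain
    n = rng.standard_normal(nbits) / np.sqrt(2 * snr * g)  # effective noise on a '+1' symbol
    return np.mean(1.0 + n < 0)                            # error when noise flips the sign

# At high SNR, BER ≈ (Ga * SNR)^(-Gd); the diversity gain Gd is the magnitude
# of the log-log slope, expected to approach nr = 2 here.
b5, b15 = ber_mrc(5), ber_mrc(15)
Gd = (np.log10(b5) - np.log10(b15)) / 1.0    # SNR points one decade apart
print(f"estimated diversity gain ≈ {Gd:.2f}")
```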


This paper presents a Bayesian approach to the design of transmit prefiltering matrices in closed-loop schemes robust to channel estimation errors. The algorithms are derived for a multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) system. Two different optimization criteria are analyzed: the minimization of the mean square error and the minimization of the bit error rate. In both cases, the transmitter design is based on the singular value decomposition (SVD) of the conditional mean of the channel response, given the channel estimate. The performance of the proposed algorithms is analyzed, and their relationship with existing algorithms is indicated. As with other previously proposed solutions, the minimum bit error rate algorithm converges to the open-loop transmission scheme for very poor CSI estimates.
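A rough sketch of the SVD-based design idea, under an assumed Gaussian CSI-error model rather than the paper's OFDM setup: the transmitter forms the conditional mean of the channel given the estimate (an MMSE shrinkage under this model) and precodes along its dominant right singular vectors:

```python
import numpy as np

rng = np.random.default_rng(3)

nt, nr = 4, 4
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
sigma_e2 = 0.1                                    # CSI error variance (assumed)
E = np.sqrt(sigma_e2 / 2) * (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt)))
H_hat = H + E                                     # imperfect channel estimate

# Under the assumed jointly Gaussian model, the conditional mean of the
# channel given the estimate is an MMSE shrinkage of the estimate.
H_cm = H_hat / (1 + sigma_e2)

U, s, Vh = np.linalg.svd(H_cm)
F = Vh.conj().T[:, :2]                            # precoder: two strongest right singular vectors
H_eff = H @ F                                     # effective channel seen by 2 data streams

captured = np.linalg.norm(H_eff, "fro") ** 2 / np.linalg.norm(H, "fro") ** 2
print(f"fraction of channel energy captured by 2 of 4 modes: {captured:.2f}")
```

Because the precoder is computed from the estimate but applied to the true channel, the captured-energy fraction degrades gracefully as the CSI error grows, which is the robustness trade-off the paper formalizes.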