60 results for Technologies of power
Abstract:
Background and Purpose: Clinical research into the treatment of acute stroke is complicated, costly, and has often been unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume. Methods: Determination of the relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo trial patients from the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect in terms of a measure such as the modified Rankin score (mRS) is found. The sample size to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III. Results: The odds ratios for mRS correspond roughly to the square root of odds ratios for lesion volume, implying that for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated from a phase II trial of 126 patients comparing the same 2 treatment arms. Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
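The one-fourth figure follows from standard sample-size arithmetic: the n needed to detect a log odds ratio scales as 1/(log OR)^2, so squaring the odds ratio (doubling its logarithm) divides the required sample size by four. A minimal sketch of that reasoning, with a hypothetical mRS odds ratio that is not taken from the paper:

```python
from math import log

# Sample size for a two-arm comparison scales as 1 / (log OR)^2. If the
# lesion-volume odds ratio is the square of the mRS odds ratio (the inverse
# of the square-root relation reported above), log OR doubles and n quarters.
or_mrs = 1.4                  # hypothetical treatment effect on the mRS
or_lesion = or_mrs ** 2       # implied effect on lesion volume

ratio = (log(or_mrs) / log(or_lesion)) ** 2
print(f"n_lesion / n_mRS = {ratio:.2f}")   # 0.25: about one fourth
```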
Abstract:
The reduction of indigo (dispersed in water) to leuco-indigo (dissolved in water) is an important industrial process and is investigated here for the case of glucose as an environmentally benign reducing agent. In order to quantitatively follow the formation of leuco-indigo, two approaches based on (i) rotating disk voltammetry and (ii) sonovoltammetry are developed. Leuco-indigo, once formed in alkaline solution, is readily monitored at a glassy carbon electrode in the mass transport limit employing hydrodynamic voltammetry. The presence of power ultrasound further improves the leuco-indigo determination due to additional agitation and homogenization effects. While inactive at room temperature, glucose readily reduces indigo in alkaline media at 65 °C. In the presence of excess glucose, a process limited by surface dissolution kinetics is proposed, following the rate law $\mathrm{d}n_{\text{leuco-indigo}}/\mathrm{d}t = k \, c_{\mathrm{OH}^-} \, S_{\text{indigo}}$, where $n_{\text{leuco-indigo}}$ is the amount of leuco-indigo formed, $k = 4.1 \times 10^{-9}\ \mathrm{m\,s^{-1}}$ (at 65 °C, assuming spherical particles of 1 μm diameter) is the heterogeneous dissolution rate constant, $c_{\mathrm{OH}^-}$ is the concentration of hydroxide, and $S_{\text{indigo}}$ is the reactive surface area. The activation energy for this process in aqueous 0.2 M NaOH is $E_A = 64\ \mathrm{kJ\,mol^{-1}}$, consistent with a considerable temperature effect. The redox mediator 1,8-dihydroxyanthraquinone is shown to significantly enhance the reaction rate by catalysing the electron transfer between glucose and solid indigo particles. (c) 2006 Elsevier Ltd. All rights reserved.
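Read literally, the rate law implies a shrinking-particle model with a constant radial dissolution rate. The sketch below works that out for a single 1 μm particle; k and the hydroxide concentration come from the abstract, while the molar mass and density of indigo are approximate literature values supplied only for illustration.

```python
# Sketch under the stated assumptions: dn/dt = k * c_OH * S for a shrinking
# spherical indigo particle. With n = (4/3) * pi * r^3 * rho / M and
# S = 4 * pi * r^2, the factors of pi and r^2 cancel and the radius shrinks
# at the constant rate k * c_OH * M / rho.

k    = 4.1e-9    # m s^-1, dissolution rate constant at 65 C (from the abstract)
c_oh = 200.0     # mol m^-3, i.e. 0.2 M NaOH (from the abstract)
M    = 0.262     # kg mol^-1, molar mass of indigo (approximate)
rho  = 1500.0    # kg m^-3, density of indigo (approximate)
r0   = 0.5e-6    # m, initial radius for a 1 um diameter particle

shrink_rate = k * c_oh * M / rho          # m s^-1, constant radial rate
print(f"complete dissolution after ~{r0 / shrink_rate / 60:.0f} min")
```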
Abstract:
Purpose – The purpose of this paper is to investigate the concepts of intelligent buildings (IBs), and the opportunities offered by the application of computer-aided facilities management (CAFM) systems. Design/methodology/approach – In this paper definitions of IBs are investigated, particularly definitions embracing open standards for effective operational change, using a questionnaire survey. The survey further investigated the extension of CAFM to IB concepts and the opportunities that such integrated systems will provide to facilities management (FM) professionals. Findings – The results showed variation in the understanding of the concept of IBs and the application of CAFM. The survey showed that 46 per cent of respondents use a CAFM system, with a majority agreeing on the potential of CAFM in the delivery of effective facilities. Research limitations/implications – The questionnaire survey results are limited to the views of the respondents within the context of FM in the UK. Practical implications – Following any one of the many definitions of an IB does not necessarily lead to equipment technologies that conform to an open standard. Such open standards, together with vendors' documentation of their systems, are the key to integrating CAFM with other building management systems (BMS) and to further harnessing the application of CAFM for IBs. Originality/value – The paper gives experience-based suggestions for both the demand and supply sides of service procurement, to realise the feasible benefits and avoid currently hindering obstacles, and provides insight into current and future tools for the mobile aspects of FM. The findings are relevant for service providers and operators as well.
Abstract:
The technologies of metagenomics and metabolomics are broadening our knowledge of the roles the human gut microbiota play in health and disease. For many years now, probiotics and prebiotics have been included in foods for their health benefits; however, we have only recently begun to understand their modes of action. This review highlights recent advances in deciphering the mechanisms of probiosis and prebiosis, and describes how this knowledge could be applied to selecting and enhancing functional foods targeting different populations. A special focus will be given to the addition of prebiotics and probiotics to functional foods for infants and seniors.
Abstract:
A multi-layered architecture of self-organizing neural networks is being developed as part of an intelligent alarm processor to analyse a stream of power grid fault messages and provide a suggested diagnosis of the fault location. Feedback concerning the accuracy of the diagnosis is provided by an object-oriented grid simulator which acts as an external supervisor to the learning system. The utilization of artificial neural networks within this environment should result in a powerful generic alarm processor which will not require extensive training by a human expert to produce accurate results.
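To make the setup concrete, here is a deliberately small sketch of the kind of learning loop described: a self-organizing map clusters alarm-message vectors, and an external supervisor (standing in for the object-oriented grid simulator) attaches fault-location labels to winning units. The message encoding, map size, and labels are all hypothetical, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoding: each alarm burst is a binary vector over 16 indications.
N_FEATURES, GRID = 16, 3                        # 3x3 map is an assumption
coords = np.array([(i, j) for i in range(GRID) for j in range(GRID)], float)
weights = rng.random((GRID * GRID, N_FEATURES))

def bmu(x):
    """Index of the best-matching unit for input vector x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

def train(messages, epochs=20, lr=0.5, sigma=1.0):
    global weights
    for _ in range(epochs):
        for x in messages:
            dist = np.linalg.norm(coords - coords[bmu(x)], axis=1)
            h = np.exp(-(dist / sigma) ** 2)               # neighbourhood weight
            weights += lr * h[:, None] * (x - weights)     # pull units toward x
        lr, sigma = lr * 0.9, sigma * 0.9                  # decay schedules

# Stand-in for the external supervisor: the grid simulator would score the
# suggested diagnosis and feed back the true fault location.
unit_label = {}
def supervise(x, fault_location):
    unit_label[bmu(x)] = fault_location

messages = rng.integers(0, 2, (40, N_FEATURES)).astype(float)
train(messages)
supervise(messages[0], "busbar B2")                        # hypothetical label
print("diagnosis for message 0:", unit_label[bmu(messages[0])])
```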
Abstract:
The authors describe a learning classifier system (LCS) which employs genetic algorithms (GA) for adaptive online diagnosis of power transmission network faults. The system monitors switchgear indications produced by a transmission network, reporting fault diagnoses on any patterns indicative of faulted components. The system evaluates the accuracy of diagnoses via a fault simulator developed by National Grid Co. and adapts to reflect the current network topology by use of genetic algorithms.
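As a rough illustration of the LCS machinery (not the authors' implementation), the sketch below evolves ternary-condition rules mapping switchgear indication patterns to suspected faulted components. The bit width, actions, and training cases are invented placeholders, and the fault simulator is reduced to a fixed list of (pattern, component) pairs.

```python
import random
random.seed(1)

# LCS-style rules: conditions are ternary strings over switchgear indications
# ('0', '1', '#' = wildcard); actions name a suspected faulted component.
BITS, ACTIONS = 8, ["line L1", "line L2", "busbar B1"]

def random_rule():
    cond = "".join(random.choice("01#") for _ in range(BITS))
    return {"cond": cond, "action": random.choice(ACTIONS), "fitness": 0.0}

def matches(cond, pattern):
    return all(c in ("#", p) for c, p in zip(cond, pattern))

def score(rule, cases):
    # A fault simulator would supply (pattern, true_component) cases.
    hits = sum(matches(rule["cond"], p) and rule["action"] == t for p, t in cases)
    return hits / len(cases)

def evolve(pop, cases, generations=30):
    for _ in range(generations):
        for r in pop:
            r["fitness"] = score(r, cases)
        pop.sort(key=lambda r: r["fitness"], reverse=True)
        parents = pop[: len(pop) // 2]                   # truncation selection
        children = []
        for mom, dad in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, BITS)
            cond = mom["cond"][:cut] + dad["cond"][cut:]  # one-point crossover
            children.append({"cond": cond, "action": mom["action"], "fitness": 0.0})
        pop[len(pop) - len(children):] = children         # replace weakest rules
    return pop

cases = [("00110010", "line L1"), ("00110011", "line L1"), ("11000000", "busbar B1")]
best = evolve([random_rule() for _ in range(20)], cases)[0]
print(best["cond"], "->", best["action"])
```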
Abstract:
This paper addresses the effects of synchronisation errors (time delay, carrier phase, and carrier frequency) on the performance of linear decorrelating detectors (LDDs). A major effect is that all LDDs require a certain degree of power control in the presence of synchronisation errors. The multi-shot sliding window algorithm (SLWA) and hard decision method (HDM) are analysed and their power control requirements are examined. Also, a more efficient one-shot detection scheme, called "hard-decision based coupling cancellation", is proposed and analysed. These schemes are then compared with the isolation bit insertion (IBI) approach in terms of power control requirements.
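For orientation, the baseline decorrelating detector in the perfectly synchronised, one-shot case looks like the sketch below; the signatures, amplitudes, and noise level are illustrative assumptions, and the paper's subject is precisely how timing, phase, and frequency errors perturb this picture.

```python
import numpy as np

rng = np.random.default_rng(7)

# One-shot decorrelating detection for a synchronous CDMA uplink with perfect
# synchronisation. Signatures, amplitudes, and noise level are assumptions.
chips, users = 16, 4
S = rng.choice([-1.0, 1.0], size=(chips, users)) / np.sqrt(chips)  # signatures
A = np.diag([1.0, 0.5, 2.0, 1.0])      # received amplitudes: power imbalance
b = rng.choice([-1.0, 1.0], size=users)                  # transmitted bits

r = S @ A @ b + 0.1 * rng.standard_normal(chips)         # received chip vector

# The decorrelator (S^T S)^{-1} S^T nulls multiple-access interference exactly,
# regardless of the power imbalance in A, but only while S is known perfectly.
b_hat = np.sign(np.linalg.pinv(S) @ r)
print("bit errors:", int(np.sum(b_hat != b)))
```

With perfect synchronisation the decorrelator is insensitive to the power imbalance in A; synchronisation errors break that property, which is why power control requirements reappear.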
Abstract:
The precision of quasioptical null-balanced bridge instruments for transmission and reflection coefficient measurements at millimeter and submillimeter wavelengths is analyzed. A Jones matrix analysis is used to describe the amount of power reaching the detector as a function of grid angle orientation, sample transmittance/reflectance and phase delay. An analysis is performed of the errors involved in determining the complex transmission and reflection coefficient after taking into account the quantization error in the grid angle and micrometer readings, the transmission or reflection coefficient of the sample, the noise equivalent power of the detector, the source power and the post-detection bandwidth. For a system fitted with a rotating grid with resolution of 0.017 rad and a micrometer quantization error of 1 μm, a 1 mW source, and a detector with a noise equivalent power 5×10−9 W Hz−1/2, the maximum errors at an amplitude transmission or reflection coefficient of 0.5 are below ±0.025.
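To illustrate the Jones-matrix bookkeeping involved (a simplified single-pass version, not the full null-balanced bridge), the sketch below computes detected power as a function of grid angle for an ideal rotating wire-grid polarizer followed by a sample of complex transmittance; the transmittance and phase values are placeholders.

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its pass axis at theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    P = np.array([[1.0, 0.0], [0.0, 0.0]])      # passes one component only
    return R.T @ P @ R

t_sample = 0.5 * np.exp(1j * 0.3)   # amplitude transmittance with phase delay
E_in = np.array([1.0, 0.0])         # unit-power, x-polarised source beam

for deg in (0, 15, 30, 45, 60, 75, 90):
    E_out = t_sample * (polarizer(np.radians(deg)) @ E_in)
    power = np.vdot(E_out, E_out).real          # |E|^2 at the detector
    print(f"grid at {deg:2d} deg: detected power = {power:.4f}")
```

Propagating the 0.017 rad grid-angle quantisation through this Malus-law dependence is the kind of step the paper's error analysis performs, alongside the micrometer, detector noise, source power, and bandwidth terms.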
Abstract:
The preparation of Community Strategies (CS) has been required of LSPs and Local Authorities in England since the passing of the Local Government Act 2000. This paper examines the process and content of two Community Strategies in southern England as part of an ongoing project to understand their impact and explore ways that CSs may be carried through in a meaningful and effective manner. The paper concludes that the two CSs studied illustrate the challenge faced by LSPs in producing Strategies that are meaningful, inclusive and which follow the spirit of the government CS guidance. LAs and LSPs also face the difficult challenge of seeing through an implicitly required transition from a traditional representative democratic structure and process to a more fluid participatory model. Thus we detect that at least two forms of conflict may arise: firstly with elected councillors threatened by a loss of power, and secondly between communities and the LAs, who are encouraged to problematise local policy and service delivery in the context of limited resource availability.
Abstract:
The authors discuss an implementation of an object-oriented (OO) fault simulator and its use within an adaptive fault diagnostic system. The simulator models the flow of faults around a power network, reporting the switchgear indications and protection messages that would be expected in a real fault scenario. The simulator has been used to train an adaptive fault diagnostic system; results and implications are discussed.
Abstract:
Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick and efficient, yet simple instruments of preliminary exploration of a dataset to understand its structure and to provide insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from a homogeneous Poisson distribution where the ratios of neighboring Poisson probabilities multiplied by the value of the larger neighbor count are constant. This property extends to the zero-truncated Poisson distribution which is of fundamental importance in capture–recapture studies. In practice however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution which leads to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
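The defining property is easy to state computationally: for a Poisson(λ) sample with observed frequencies f_x, the ratio r(x) = (x + 1) f_{x+1} / f_x is approximately constant at λ, and this survives zero truncation. A minimal sketch on simulated data (not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated capture counts from a homogeneous Poisson model, zero-truncated
# because never-captured units are unobserved in capture-recapture data.
counts = rng.poisson(2.0, size=1000)
counts = counts[counts > 0]
x_max = int(counts.max())
f = np.bincount(counts, minlength=x_max + 1)   # f[x] = units captured x times

# Ratio plot: r(x) = (x + 1) * f[x+1] / f[x] is roughly flat at lambda.
for x in range(1, x_max):
    if f[x] > 0 and f[x + 1] > 0:
        print(f"r({x}) = {(x + 1) * f[x + 1] / f[x]:.2f}")
```

Under a Gamma mixture of Poissons the same ratios increase roughly linearly in x, the "structured heterogeneity" pattern described above, so departures from flatness are directly visible in the plot.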
Abstract:
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To counteract the use of high-level, top-down modeling efforts and to increase result accuracy, a focus upon device details and data routes was taken. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was proportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
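The accounting style described lends itself to a simple bottom-up sum over the delivery chain. The sketch below shows that structure only; every figure in it (per-GB energies, hop count, carbon intensity) is an invented placeholder rather than a value from the study.

```python
# Schematic bottom-up ESD energy/CO2e sum. All figures are placeholders.
download_gb = 3.5                # size of the software download (assumption)
server_kwh_per_gb = 0.01         # data-center share, after utilisation scaling
network_kwh_per_hop_gb = 0.002   # per networking device traversed, per GB
client_kwh = 0.02                # user's device: browsing plus download time
hops = 12                        # data hops between online store and user
kg_co2e_per_kwh = 0.23           # grid carbon intensity (UK-style placeholder)

kwh = download_gb * (server_kwh_per_gb + hops * network_kwh_per_hop_gb) + client_kwh
print(f"{kwh:.3f} kWh -> {kwh * kg_co2e_per_kwh * 1000:.0f} g CO2e per download")
```

The physical-distribution side of the comparison would be a parallel sum over manufacturing, packaging, and transport of the media, with the 83% saving emerging from the difference.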
Abstract:
Almost all the electricity currently produced in the UK is generated as part of a centralised power system designed around large fossil fuel or nuclear power stations. This power system is robust and reliable, but the efficiency of power generation is low, resulting in large quantities of waste heat. The principal aim of this paper is to investigate an alternative concept: energy production by small-scale generators in close proximity to the energy users, integrated into microgrids. Microgrids (de-centralised electricity generation combined with on-site production of heat) bear the promise of substantial environmental benefits, brought about by a higher energy efficiency and by facilitating the integration of renewable sources such as photovoltaic arrays or wind turbines. By virtue of a good match between generation and load, microgrids have a low impact on the electricity network, despite a potentially significant level of generation by intermittent energy sources. The paper discusses the technical and economic issues associated with this novel concept, giving an overview of the generator technologies, the current regulatory framework in the UK, and the barriers that have to be overcome if microgrids are to make a major contribution to the UK energy supply. The focus of this study is a microgrid of domestic users powered by small Combined Heat and Power generators and photovoltaics. Focusing on the energy balance between generation and load, it is found that the optimum combination of generators in the microgrid (consisting of around 1.4 kWp of PV array per household and 45% household ownership of micro-CHP generators) will maintain energy balance on a yearly basis if supplemented by energy storage of 2.7 kWh per household. We find that there is no fundamental technological reason why microgrids cannot contribute an appreciable part of the UK energy demand. Indeed, an estimate of cost indicates that the microgrids considered in this study would supply electricity at a cost comparable with the present electricity supply if the current support mechanisms for photovoltaics were maintained. Combining photovoltaics, micro-CHP and a small battery requirement gives a microgrid that is independent of the national electricity network. In the short term, this has particular benefits for remote communities, but more wide-ranging possibilities open up in the medium to long term. Microgrids could meet the need to replace the current generation of nuclear and coal-fired power stations, greatly reducing the demand on the transmission and distribution network.
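As a feel for the quoted mix (1.4 kWp of PV, 45% micro-CHP ownership, 2.7 kWh of storage per household), here is a back-of-envelope annual balance; the yield, output, and demand figures are typical-UK assumptions for illustration, not numbers from the paper.

```python
# Back-of-envelope annual energy balance per household for the quoted mix.
pv_kwp = 1.4             # PV array size per household (from the abstract)
pv_yield = 900.0         # kWh per kWp per year (UK-typical assumption)
chp_ownership = 0.45     # fraction of households with micro-CHP (abstract)
chp_kwh = 4000.0         # annual electrical output per unit (assumption)
demand_kwh = 3300.0      # annual household electricity demand (assumption)

generation = pv_kwp * pv_yield + chp_ownership * chp_kwh
print(f"generation {generation:.0f} kWh/yr vs demand {demand_kwh:.0f} kWh/yr")
# The 2.7 kWh per-household store bridges the daily generation/load mismatch
# that this annual-balance arithmetic cannot see.
```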
Abstract:
The Pax Americana and the grand strategy of hegemony (or "Primacy") that underpins it may be becoming unsustainable. Particularly in the wake of exhausting wars, the Global Financial Crisis, and the shift of wealth from West to East, it may no longer be possible or prudent for the United States to act as the unipolar sheriff or guardian of a world order. But how viable are the alternatives, and what difficulties will these alternatives entail in their design and execution? This monograph offers a sympathetic but critical analysis of the alternative U.S. National Security Strategies of "retrenchment" that critics of American diplomacy offer. In these strategies, the United States would anticipate the coming of a more multipolar world and organize its behavior around the dual principles of "concert" and "balance," seeking a collaborative relationship with other great powers, while being prepared to counterbalance any hostile aggressor that threatens world order. The proponents of such strategies argue that by scaling back its global military presence and its commitments, the United States can trade prestige for security, shift burdens, and attain a freer hand. To support this theory, they often look to the 19th-century Concert of Europe as a model of a successful security regime and to general theories about the natural balancing behavior of states. This monograph examines this precedent and measures its usefulness for contemporary statecraft to identify how great power concerts are sustained and how they break down. The project also applies competing theories to how states might behave if world politics are in transition: Will they balance, bandwagon, or hedge? This demonstrates the multiple possible futures that could shape and be shaped by a new strategy. A new strategy based on an acceptance of multipolarity and the limits of power is prudent, and there is scope for such a shift. The convergence of several trends (including transnational problems needing collaborative efforts, the military advantages of defenders, the reluctance of states to engage in unbridled competition, and hegemony fatigue among the American people) means that an opportunity exists internationally and at home for a shift to a new strategy. But a Concert-Balance strategy will still need to deal with several potential dilemmas. These include the difficulty of reconciling competitive balancing with cooperative concerts, the limits of balancing without a forward-reaching onshore military capability, possible unanticipated consequences such as a rise in regional power competition or the emergence of blocs (such as a Chinese East Asia or an Iranian Gulf), and the challenge of sustaining domestic political support for a strategy that voluntarily abdicates world leadership. These difficulties can be mitigated, but meeting them will take pragmatic and gradual implementation, elegant theorizing, and care to avoid swapping one ironclad, doctrinaire grand strategy for another.
Abstract:
This Themed Section aims to increase understanding of how the idea of climate change, and the policies and actions that spring from it, travel beyond their origins in the natural sciences to meet different political arenas in the developing world. It takes a discursive approach whereby climate change is not just a set of physical processes but also a series of messages, narratives and policy prescriptions. The articles are mostly case study-based and focus on sub-Saharan Africa and Small Island Developing States (SIDS). They are organised around three interlinked themes. The first theme concerns the processes of rapid technicalisation and professionalisation of the climate change 'industry', which have substantially narrowed the boundaries of what can be viewed as a legitimate social response to the problem of global warming. The second theme deals with the ideological effect of the climate change industry, which is 'depoliticisation': in this case, the deflection of attention away from the underlying political conditions of vulnerability and exploitation towards the nature of the physical hazard itself. The third theme concerns the institutional effect of an insufficiently socialised idea of climate change, which is the maintenance of existing relations of power or their reconfiguration in favour of the already powerful. Overall, the articles suggest that greater scrutiny of the discursive and political dimensions of mitigation and adaptation activities is required. In particular, greater attention should be directed towards the policy consequences that flow from how governments and donors frame and render climate change issues.