871 results for Power of attorney


Relevance:

90.00%

Abstract:

A novel but simple time-of-flight neutron scattering geometry which allows structural anisotropy to be probed directly, simultaneously and thus unambiguously in polymeric and other materials is described. A particular advantage of the simultaneous data collection, when coupled to the large area of the beam, is that it enables thin films (< 10 μm, < 10 mg) to be studied with relative ease. The utility of the technique is illustrated by studies both on deformed poly(styrene) glasses and on thin films of electrically conducting polymers. In the latter case, the power of isotopic substitution is illustrated to great effect. The development of these procedures for use in other areas of materials science is briefly discussed.

Relevance:

90.00%

Abstract:

In biological mass spectrometry (MS), two ionization techniques are predominantly employed for the analysis of larger biomolecules, such as polypeptides. These are nano-electrospray ionization [1, 2] (nanoESI) and matrix-assisted laser desorption/ionization [3, 4] (MALDI). Both techniques are considered to be “soft”, allowing the desorption and ionization of intact molecular analyte species and thus their successful mass-spectrometric analysis. One of the main differences between these two ionization techniques lies in their ability to produce multiply charged ions. MALDI typically generates singly charged peptide ions, whereas nanoESI easily provides multiply charged ions, even for peptides as low as 1000 Da in mass. The production of highly charged ions is desirable as this allows the use of mass analyzers, such as ion traps (including orbitraps) and hybrid quadrupole instruments, which typically offer only a limited m/z range (< 2000–4000). It also enables more informative fragmentation spectra using techniques such as collision-induced dissociation (CID) and electron capture/transfer dissociation (ECD/ETD) in combination with tandem MS (MS/MS). [5, 6] Thus, there is a clear advantage to using ESI in research areas where peptide sequencing or, more generally, the structural elucidation of biomolecules by MS/MS is required. Nonetheless, MALDI, with its higher tolerance to contaminants and additives, ease of operation, potential for high-speed and automated sample preparation and analysis, as well as its MS imaging capabilities, is an ionization technique that can cover bioanalytical areas for which ESI is less suitable. [7, 8] If these strengths could be combined with the analytical power of multiply charged ions, new instrumental configurations and large-scale proteomic analyses based on MALDI MS(/MS) would become feasible.
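The m/z benefit of multiple charging can be made concrete with a small calculation. The sketch below is my own illustration (the peptide mass is an arbitrary example; the [M + nH]^n+ convention is standard, not taken from this abstract):

```python
# Hypothetical illustration: m/z values of one peptide at increasing charge
# states, showing why multiply charged ions fit into the limited m/z range
# (< 2000-4000) of ion traps and hybrid quadrupole instruments.

PROTON_MASS = 1.00728  # Da, mass of a proton (H+)

def mz(neutral_mass_da, charge):
    """m/z of the [M + nH]^n+ ion for a peptide of the given neutral mass."""
    return (neutral_mass_da + charge * PROTON_MASS) / charge

peptide_mass = 4500.0  # Da, an arbitrary example peptide
for z in (1, 2, 3):
    print(z, round(mz(peptide_mass, z), 3))
```

A singly charged ion of this peptide (MALDI-typical) sits near m/z 4501, outside a 2000 m/z window, while the 3+ ion (ESI-typical) falls near m/z 1501, well inside it.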

Relevance:

90.00%

Abstract:

Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective care, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CP) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and as a result can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics, capturing knowledge from the syntactic, semantic and pragmatic levels through to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of semantically rich representations of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing stakeholders, which we treat as social agents, together with their goals and patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur.
To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CP generated from SAM together with norms enriches the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.

Relevance:

90.00%

Abstract:

The current state of the art and direction of research in computer vision aimed at automating the analysis of CCTV images is presented. This includes low-level identification of objects within the field of view of cameras, following those objects over time and between cameras, and the interpretation of those objects' appearance and movements with respect to models of behaviour (from which intentions may be inferred). The potential ethical problems (and some potential opportunities) such developments may pose if and when deployed in the real world are presented, and suggestions are made as to the new regulations that will be needed if such systems are not to further enhance the power of the surveillers against the surveilled.

Relevance:

90.00%

Abstract:

We study a two-way relay network (TWRN), where distributed space-time codes are constructed across multiple relay terminals in an amplify-and-forward mode. Each relay transmits a scaled linear combination of its received symbols and their conjugates, with the scaling factor chosen based on automatic gain control. We consider equal power allocation (EPA) across the relays, as well as the optimal power allocation (OPA) strategy given access to instantaneous channel state information (CSI). For EPA, we derive an upper bound on the pairwise error probability (PEP), from which we prove that full diversity is achieved in TWRNs. This result is in contrast to one-way relay networks, in which a maximum diversity order of only unity can be obtained. When instantaneous CSI is available at the relays, we show that the OPA which minimizes the conditional PEP of the worse link can be cast as a generalized linear fractional program, which can be solved efficiently using the Dinkelbach-type procedure. We also prove that, if the sum-power of the relay terminals is constrained, then the OPA will activate at most two relays.
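The Dinkelbach-type procedure mentioned above can be sketched on a toy scalar problem. This is my own one-dimensional example, not the paper's relay power-allocation subproblem; it only shows the iterate-on-the-ratio mechanics:

```python
# Minimal sketch of the Dinkelbach procedure for a linear fractional program:
# maximize (a*x + b) / (c*x + d) over x in [lo, hi], assuming c*x + d > 0.
# Each iteration solves the parametric problem max_x (a*x+b) - lam*(c*x+d);
# since that objective is linear in x, its maximizer lies at an endpoint.

def dinkelbach(a, b, c, d, lo, hi, tol=1e-10, max_iter=100):
    x = lo
    lam = (a * x + b) / (c * x + d)      # initial ratio at a feasible point
    for _ in range(max_iter):
        candidates = (lo, hi)            # vertices of the feasible interval
        x = max(candidates, key=lambda t: (a * t + b) - lam * (c * t + d))
        gap = (a * x + b) - lam * (c * x + d)
        if gap < tol:                    # zero gap <=> lam is the optimal ratio
            break
        lam = (a * x + b) / (c * x + d)  # Dinkelbach update
    return x, lam

# Toy instance: maximize (2x + 3) / (x + 2) on [0, 4]; optimum at x = 4.
x_opt, ratio = dinkelbach(a=2.0, b=3.0, c=1.0, d=2.0, lo=0.0, hi=4.0)
print(x_opt, ratio)  # 4.0, 11/6
```

The same update-and-resolve loop applies when the subproblem is a full linear program, as in the generalized linear fractional program the abstract describes.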

Relevance:

90.00%

Abstract:

Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts in volatility.
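The two quantile-computation routes the abstract contrasts can be sketched in a few lines. This is my own simplified illustration (made-up data, zero-mean returns assumed), not the paper's models:

```python
# Two ways to turn a volatility forecast into a quantile (Value-at-Risk)
# forecast: (i) a distributional assumption for returns, here N(0, sigma^2);
# (ii) the empirical distribution of past standardized returns r_t / sigma_t.

from statistics import NormalDist

def var_normal(sigma_forecast, alpha=0.01):
    """alpha-quantile of returns under a N(0, sigma^2) assumption."""
    return sigma_forecast * NormalDist().inv_cdf(alpha)

def var_empirical(past_returns, past_sigmas, sigma_forecast, alpha=0.01):
    """alpha-quantile scaling an empirical quantile of standardized returns."""
    z = sorted(r / s for r, s in zip(past_returns, past_sigmas))
    k = max(0, min(len(z) - 1, int(alpha * len(z))))  # crude order statistic
    return sigma_forecast * z[k]

# Made-up sample of past daily returns and fitted volatilities.
returns = [-0.021, 0.004, -0.013, 0.008, 0.015, -0.030, 0.002, 0.011, -0.006, 0.009]
sigmas  = [0.010, 0.009, 0.011, 0.008, 0.012, 0.014, 0.009, 0.010, 0.008, 0.011]

print(var_normal(0.01))                                   # normal 1% quantile
print(var_empirical(returns, sigmas, 0.01, alpha=0.10))   # empirical 10% quantile
```

When standardized returns are heavier-tailed than normal, the empirical route gives more extreme (more conservative) quantiles than the normal assumption.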

Relevance:

90.00%

Abstract:

Climate sensitivity is defined as the change in global mean equilibrium temperature after a doubling of atmospheric CO2 concentration and provides a simple measure of global warming. An early estimate of climate sensitivity, 1.5–4.5°C, has changed little subsequently, including in the latest assessment by the Intergovernmental Panel on Climate Change. The persistence of such large uncertainties in this simple measure casts doubt on our understanding of the mechanisms of climate change and our ability to predict the response of the climate system to future perturbations. This has motivated continued attempts to constrain the range with climate data, alone or in conjunction with models. The majority of studies use data from the instrumental period (post-1850), but recent work has made use of information about the large climate changes experienced in the geological past. In this review, we first outline approaches that estimate climate sensitivity using instrumental climate observations and then summarize attempts to use the record of climate change on geological timescales. We examine the limitations of these studies and suggest ways in which the power of the palaeoclimate record could be better used to reduce uncertainties in our predictions of climate sensitivity.

Relevance:

90.00%

Abstract:

This paper employs an extensive Monte Carlo study to test the size and power of the BDS and close return methods of testing for departures from independent and identical distribution. It is found that the finite-sample properties of the BDS test are far superior and that the close return method cannot be recommended as a model diagnostic. Neither test can be reliably used for very small samples, while the close return test has low power even at large sample sizes.
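The Monte Carlo size/power methodology the abstract describes can be sketched briefly. For compactness a first-order autocorrelation test stands in for the BDS and close return tests (my substitution, not the paper's); the logic is the same: the rejection rate under iid data estimates size, and the rejection rate under a dependent alternative estimates power.

```python
# Monte Carlo estimation of the size and power of a test for departures
# from iid, using a lag-1 autocorrelation statistic as a simple stand-in.

import math
import random

def autocorr_stat(x):
    """Standardized lag-1 autocorrelation; approximately N(0,1) under iid."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return (num / den) * math.sqrt(n)

def rejection_rate(simulate, n=200, reps=500, crit=1.96, seed=1):
    """Fraction of simulated samples on which the test rejects at level 5%."""
    rng = random.Random(seed)
    rejections = sum(abs(autocorr_stat(simulate(rng, n))) > crit
                     for _ in range(reps))
    return rejections / reps

iid = lambda rng, n: [rng.gauss(0, 1) for _ in range(n)]

def ar1(rng, n, phi=0.3):        # dependent alternative: AR(1) process
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

size = rejection_rate(iid)       # should sit near the nominal 5% level
power = rejection_rate(ar1)      # should be well above the size
print(size, power)
```

A test with good finite-sample properties has a size close to the nominal level; power is then compared across tests at that common size, which is how the BDS test's superiority is established.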

Relevance:

90.00%

Abstract:

Global warming has attracted attention from all over the world and has led to concern about carbon emissions. The Kyoto Protocol, the first major international regulatory emission trading scheme, was introduced in 1997 and outlined strategies for reducing carbon emissions (Ratnatunga et al., 2011). With the increased interest in carbon reduction, the Protocol came into force in 2005, and 191 nations have now ratified it (UNFCCC, 2012). Under cap-and-trade schemes, each company has its own carbon emission target. When a company's emissions exceed that target, the company must either face fines or buy emission allowances from other companies. Thus, unlike most other social and environmental issues, carbon emissions can trigger costs for companies, both in introducing low-emission equipment and systems and in buying allowances when they emit more than their targets. Despite the importance of carbon emissions to companies, carbon emission reporting still operates in an unregulated environment, and companies are only required to disclose when it is material either in value or in substance (Miller, 2005, Deegan and Rankin, 1997). Even though there has been an increase in the volume of carbon emission disclosures in companies' financial reports and stand-alone social and environmental reports, intended to show their concern for the environment and their social responsibility (Peters and Romi, 2009), the motivations behind corporate carbon emission disclosures, and whether such disclosures affect corporate environmental reputation and financial performance, have yet to be explored. The problems with carbon emissions lie on both the financial and the non-financial side of corporate governance. On the one hand, companies need to spend money on reducing carbon emissions or on paying penalties when they emit more than allowed.
On the other hand, as the public is more interested in environmental issues than before, carbon emissions can also affect a company's image with regard to its environmental performance. The importance of the carbon emission issue is beginning to be recognized by companies from different industries as one of the critical issues in supply chain management (Lee, 2011). In a study conducted by the Investor Responsibility Research Centre Institute for Corporate Responsibility (IRRCI), 80% of the companies analysed were found to face carbon risks resulting from emissions in their supply chains, and over 80% found that the majority of greenhouse gas (GHG) emissions came from electricity and other direct suppliers (Trucost, 2009). A review of the extant literature shows the increased importance of carbon emission issues, a gap in the study of carbon reporting and disclosures, and a gap in studies linking corporate environmental reputation and corporate financial performance with carbon reporting (Lohmann, 2009a, Ratnatunga and Balachandran, 2009, Bebbington and Larrinaga-Gonzalez, 2008). This study focuses on investigating the current status of UK carbon emission disclosures, the determinants of corporate carbon disclosure, and the relationship between carbon emission disclosures and the environmental reputation and financial performance of UK listed companies from 2004 to 2012, and explores the explanatory power of classical disclosure theories.

Relevance:

90.00%

Abstract:

This multiple case-based study investigates the relationship between recruiting agents and the UK universities that act as their principals. The current extensive use of agents in UK higher education may be seen as an indicator of the financial impact made by international students. The study analyses the practice of agent management and explores the manner in which power and control interact. The study employed semi-structured interviews and group discussions involving up to six respondents from each of the 20 UK case institutions. The qualitative data reveal considerable variation in the manner in which the universities manage their agency relationships. Through the joint consideration of control measures and use of power, five distinctive approaches have been identified. The study also reveals that over-dependence on agents reduces the power of the principal, and consequently the principal's ability to exercise control, particularly in highly competitive global and national markets.

Relevance:

90.00%

Abstract:

This article examines a little known decision of the Judicial Committee of the Privy Council: Grand Trunk Railway Company of Canada v Robinson (1915). The examination is historical and it provides a different insight into the understanding of privity of contract, a doctrine central to contract law. The examination reveals a process of trans-Atlantic legal migration in which English law was applied to resolve an Ontario case. The nature of the resolution is surprising because it appears to conflict with the better known decision of the House of Lords, Dunlop Pneumatic Tyre Company, Limited v Selfridge and Company, Limited, which a similarly constituted panel delivered in the same week. This article argues that there was a greater malleability in the resolution of cases concerned with privity than was thought to have existed. It is also argued that the power of Canadian railway capitalism is a significant factor in understanding the legal resolution of the case. Finally, the article considers the use of English and American precedents relevant to the case. The application of English precedents led to a resolution not entirely befitting Canadian conditions.

Relevance:

90.00%

Abstract:

This article examines the role played by ideas and their thinkers in Christopher Hill's histories of the English Revolution. Hill protested against a reductionist economic determinism with no place for the intrinsic power of ideas, but his account of ideas gave them a progressive logic parallel to, if not always easy to link with, that of economic development, and threatened to divorce them from their muddled and imperfect thinkers. This account of the logic of ideas had a striking impact on the way in which the more mainstream radicals of the English Revolution appeared in Hill's work, with both the Levellers and James Harrington being half assimilated to, and half pushed aside in favor of, the more thoroughgoing economic radicals who expressed, in however ragged a way, the intrinsic potential of their ideas. However, Hill's writings also betray a surprising attraction to religious over secular forms of radicalism.

Relevance:

90.00%

Abstract:

Seamless phase II/III clinical trials in which an experimental treatment is selected at an interim analysis have been the focus of much recent research interest. Many of the methods proposed are based on the group sequential approach. This paper considers designs of this type in which the treatment selection can be based on short-term endpoint information for more patients than have primary endpoint data available. We show that in such a case, the familywise type I error rate may be inflated if previously proposed group sequential methods are used and the treatment selection rule is not specified in advance. A method is proposed to avoid this inflation by considering the treatment selection that maximises the conditional error given the data available at the interim analysis. A simulation study is reported that illustrates the type I error rate inflation and compares the power of the new approach with two other methods: a combination testing approach and a group sequential method that does not use the short-term endpoint data, both of which also strongly control the type I error rate. The new method is also illustrated through application to a study in Alzheimer's disease. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
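Why an unspecified selection rule inflates the familywise type I error rate can be seen in a stripped-down simulation. This is my own toy construction, not the paper's seamless phase II/III design: two treatments are compared under the global null, the apparently better one is selected, and its test statistic is judged against an unadjusted one-sided 5% critical value.

```python
# Toy Monte Carlo: selecting the best of two null treatments and testing it
# without adjustment rejects more often than the nominal 5% level, because
# P(max(Z1, Z2) > 1.645) = 1 - 0.95^2 ~= 9.75% for independent N(0,1) stats.

import random

def familywise_error(reps=20000, crit=1.645, seed=7):
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)  # both treatments ineffective
        if max(z1, z2) > crit:                     # select-then-test, no adjustment
            rejections += 1
    return rejections / reps

print(familywise_error())  # well above the nominal 5%
```

Pre-specifying the selection rule, or maximizing the conditional error over possible selections as the paper proposes, is what restores strong control of this rate.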

Relevance:

90.00%

Abstract:

Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems that complicate this approach are the non-linear relationship between wind speed and power production and the limited range of power production between zero and nominal power of the turbine. In practice, these problems are often tackled by using non-linear non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be directly accounted for by transforming the observed power production into wind speed via the inverse power curve so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be taken care of by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform equally or better than non-linear models with or without the frequently used wind-to-power transformation.
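The inverse power-curve transformation and the censoring it induces can be sketched with an idealized turbine. The cut-in speed, rated speed, nominal power, and cubic curve shape below are my own assumptions for illustration, not the paper's data:

```python
# Sketch of the inverse (power-to-wind) transformation: observed power is
# mapped back to an equivalent wind speed, where simple linear models apply.
# Power at 0 or at nominal power is censored, since a whole range of wind
# speeds maps to each of those two values.

CUT_IN, RATED, NOMINAL_POWER = 3.0, 12.0, 2000.0  # m/s, m/s, kW (assumed)

def power_curve(v):
    """Idealized power curve: zero below cut-in, cubic up to rated, then flat."""
    if v <= CUT_IN:
        return 0.0
    if v >= RATED:
        return NOMINAL_POWER
    frac = (v**3 - CUT_IN**3) / (RATED**3 - CUT_IN**3)
    return NOMINAL_POWER * frac

def inverse_power_curve(p):
    """Equivalent wind speed for an observed power; endpoints are censored."""
    if p <= 0.0:
        return CUT_IN        # left-censored: any v <= CUT_IN produces 0
    if p >= NOMINAL_POWER:
        return RATED         # right-censored: any v >= RATED produces nominal
    v3 = CUT_IN**3 + (p / NOMINAL_POWER) * (RATED**3 - CUT_IN**3)
    return v3 ** (1.0 / 3.0)

print(inverse_power_curve(power_curve(8.0)))  # recovers 8.0 inside the range
```

Inside the operating range the transformation is exactly invertible, so the non-linearity is removed before regression; the censored endpoints are then handled by the censored regression models the abstract evaluates.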

Relevance:

90.00%

Abstract:

An equation of Monge-Ampère type has, for the first time, been solved numerically on the surface of the sphere in order to generate optimally transported (OT) meshes, equidistributed with respect to a monitor function. Optimal transport generates meshes that keep the same connectivity as the original mesh, making them suitable for r-adaptive simulations, in which the equations of motion can be solved in a moving frame of reference in order to avoid mapping the solution between old and new meshes and to avoid load balancing problems on parallel computers. The semi-implicit solution of the Monge-Ampère type equation involves a new linearisation of the Hessian term, and exponential maps are used to map from old to new meshes on the sphere. The determinant of the Hessian is evaluated as the change in volume between old and new mesh cells, rather than using numerical approximations to the gradients. OT meshes are generated to compare with centroidal Voronoi tessellations on the sphere and are found to have advantages and disadvantages; OT equidistribution is more accurate, the number of iterations to convergence is independent of the mesh size, face skewness is reduced and the connectivity does not change. However, anisotropy is higher and the OT meshes are non-orthogonal. It is shown that optimal transport on the sphere leads to meshes that do not tangle. However, tangling can be introduced by numerical errors in calculating the gradient of the mesh potential. Methods for alleviating this problem are explored. Finally, OT meshes are generated using observed precipitation as a monitor function, in order to demonstrate the potential power of the technique.
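Equidistribution with respect to a monitor function, the goal of the mesh generation above, is easiest to see in one dimension. The sketch below is my own 1D analogue (the paper solves a Monge-Ampère type equation on the sphere); it places nodes so that each cell carries an equal share of the integral of the monitor function:

```python
# 1D equidistribution: choose mesh nodes x_0 < ... < x_N on [a, b] so that
# the integral of a monitor function m(x) over each cell is the same,
# concentrating mesh resolution where m is large.

import math

def equidistribute(monitor, n_cells, a=0.0, b=1.0, n_quad=10000):
    h = (b - a) / n_quad
    # Cumulative integral of the monitor on a fine grid (midpoint rule).
    cum = [0.0]
    for i in range(n_quad):
        cum.append(cum[-1] + monitor(a + (i + 0.5) * h) * h)
    total = cum[-1]
    # Invert the cumulative integral at equal increments to place the nodes.
    nodes, j = [a], 0
    for k in range(1, n_cells):
        target = total * k / n_cells
        while cum[j + 1] < target:
            j += 1
        frac = (target - cum[j]) / (cum[j + 1] - cum[j])  # linear interpolation
        nodes.append(a + (j + frac) * h)
    nodes.append(b)
    return nodes

# A monitor peaked at x = 0.5 draws mesh nodes toward the peak.
mesh = equidistribute(lambda x: 1.0 + 20.0 * math.exp(-100.0 * (x - 0.5)**2), 8)
print([round(x, 3) for x in mesh])
```

On the sphere the same idea is enforced through the Monge-Ampère equation, with the determinant of the Hessian playing the role of the 1D cell-size ratio; monotonicity of the 1D map is what generalizes to the non-tangling property discussed above.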