Abstract:
Optical Coherence Tomography (OCT) is a new imaging technology based on low-coherence interferometry that uses the scattering of near-infrared light as a signal source to provide cross-sectional vascular images with far higher definition than any other available modality. With a spatial resolution of up to 10 μm, OCT provides a resolution 20 times higher than intravascular ultrasound (IVUS), currently the most widely used modality for intracoronary imaging. OCT can provide insight into the various phases of atherosclerotic disease and into the vascular response to treatment. Studies have shown OCT's ability to detect arterial structures and to help identify different histological constituents. Its ability to distinguish different degrees of atherosclerotic change and the various plaque types, when compared with histology, has recently been demonstrated, with acceptable inter- and intra-observer agreement for these findings. OCT provides exceptional endovascular resolution in real time in vivo, which has been exploited to assess vascular structures and the response to device use. Although depth remains a limitation for plaque characterization beyond 2 mm with OCT, near-histological resolution can be obtained within the first millimetre of the vessel wall, allowing an extraordinary assessment of fibrous cap characteristics and thickness. In addition, neointimal coverage, peri-strut tissue patterns and stent apposition can now be scrutinized for individual struts on the micron scale, the so-called strut-level analysis. OCT has taken intravascular imaging to the micron level for in vivo vascular analysis and is expected soon to become a valuable and indispensable tool for cardiologists in clinical and research applications.
Abstract:
Climate change is a crisis that is going to affect all of our lives in the future. Ireland is expected to experience increased storms and rainfall throughout the country, which will affect our lives greatly unless we do something to change it. In an attempt to reduce the impacts of climate change, countries across the world met to address the problem, and the resulting agreement became known as the Kyoto Protocol. The Kyoto Protocol set out objectives for each developed country's carbon emissions relative to its 1990 levels. Because the Irish economy was at a low point in 1990, Ireland was given a target of limiting its carbon emissions to 13% above 1990 levels. In order to meet these targets, Ireland produced two energy papers, the Green Paper and the White Paper. The Green Paper identified the drivers for energy management and control: security of energy supply, economic competitiveness and environmental protection. The White Paper set out targets aimed at addressing the Green Paper's drivers. Among these targets was the plan to reduce energy consumption in the public sector by 33% by 2020 through energy conservation measures. Schools are part of the public sector and therefore share its targets to reduce energy consumption, and the government has developed initiatives to help schools achieve them. Energy audits should be performed to identify areas where schools can improve on their current trends and to show where they can invest in the future to save money and reduce their overall environmental footprint. Grants are available to schools for insulation through the energy efficiency scheme and for renewable energy technologies through the ReHeat scheme. Promoting energy-efficiency programmes in schools can also have a positive effect on students' understanding of energy use. The Display Energy Certificate is a legal document that can be used to understand how each school is performing from an energy perspective, and it can help schools understand why they need to change their current energy management structure. By improving their energy management, schools in turn improve their performance on the Display Energy Certificate. Schools should use these tools wisely and take advantage of the grants available, which can, in the short to long term, help them to save money and reduce their carbon footprint.
Abstract:
This thesis describes a search for very high energy (VHE) gamma-ray emission from the starburst galaxy IC 342. The analysis was based on data from the 2003–2004 observing season recorded using the Whipple 10-metre imaging atmospheric Cherenkov telescope located on Mount Hopkins in southern Arizona. IC 342 may be classed as a non-blazar type galaxy, and to date only a few such galaxies (M 87, Cen A, M 82 and NGC 253) have been detected as VHE gamma-ray sources. Analysis of approximately 24 hours of good quality IC 342 data, consisting entirely of ON/OFF observations, was carried out using a number of methods (standard Supercuts, optimised Supercuts, scaled optimised Supercuts and the multivariate kernel analysis technique). No evidence for TeV gamma-ray emission from IC 342 was found. The significance was 0.6 σ with a nominal rate of 0.04 ± 0.06 gamma rays per minute. The flux upper limit above 600 GeV (at 99.9% confidence) was determined to be 5.5 × 10⁻⁸ m⁻² s⁻¹, corresponding to 8% of the Crab Nebula flux in the same energy range.
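For context, the significance of an excess in equal-time ON/OFF counting observations such as these is conventionally computed with the Li & Ma (1983) formula. The thesis's own analysis code is not reproduced here, so the following Python sketch uses made-up counts and is purely illustrative of the standard calculation.

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983, Eq. 17) significance for ON/OFF counting observations.

    n_on  : counts in the ON (source) region
    n_off : counts in the OFF (background) region
    alpha : ratio of ON to OFF exposure (1.0 for equal-time ON/OFF pairs)
    """
    term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1.0 + alpha) * n_off / (n_on + n_off))
    sign = np.sign(n_on - alpha * n_off)  # negative excess gives negative significance
    return sign * np.sqrt(2.0 * (term_on + term_off))

# Hypothetical counts for illustration only (not values from the thesis):
print(li_ma_significance(n_on=1005, n_off=990, alpha=1.0))  # ~0.3 sigma, i.e. no detection
```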
Abstract:
The micro Bailey-Andrew method, official method of the A.O.A.C. (1970), and the method proposed by NAVELLIER et al. (1969) for the determination of caffeine in coffee were compared. Extraction and purification followed the two methods cited, and the determination itself was carried out by weight, by total nitrogen, and by spectrophotometry with baseline correction using the peak at 273 nm. The following conclusions were drawn: a) Determination by weighing should not be used with either extraction and purification method, since the values found were well above the true values and the coefficients of variation were also high. b) The extraction and purification method of NAVELLIER et al. (1969) does not remove nitrogenous interferents, but it may perhaps be used when caffeine is determined by spectrophotometry with baseline correction. c) There was no significant difference between the determination of caffeine by nitrogen and by spectrophotometry with baseline correction when the A.O.A.C. (1970) extraction and purification was followed.
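The baseline correction mentioned above is a standard spectrophotometric technique: a straight line is drawn between two wavelengths flanking the 273 nm caffeine peak, and the interpolated baseline absorbance is subtracted from the peak absorbance. The Python sketch below is a generic illustration of that idea, not the A.O.A.C. or NAVELLIER procedure; the flanking wavelengths and the synthetic spectrum are hypothetical.

```python
import numpy as np

def baseline_corrected_peak(wavelengths, absorbances, peak_nm=273.0,
                            left_nm=255.0, right_nm=290.0):
    """Subtract a linear baseline, drawn between two flanking wavelengths,
    from the absorbance at the peak wavelength.

    left_nm / right_nm are illustrative choices, not values from the paper.
    """
    a_left, a_peak, a_right = np.interp([left_nm, peak_nm, right_nm],
                                        wavelengths, absorbances)
    # Linearly interpolate the baseline at the peak position
    frac = (peak_nm - left_nm) / (right_nm - left_nm)
    baseline_at_peak = a_left + frac * (a_right - a_left)
    return a_peak - baseline_at_peak

# Example with a synthetic spectrum (arbitrary numbers):
wl = np.linspace(240, 300, 61)
spec = 0.2 + 0.001 * (wl - 240) + 0.6 * np.exp(-((wl - 273) / 8.0) ** 2)
print(baseline_corrected_peak(wl, spec))  # corrected peak absorbance, ~0.6
```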
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
Let A be a simple, unital, finite, and exact C*-algebra which absorbs the Jiang-Su algebra Z tensorially. We prove that the Cuntz semigroup of A admits a complete order embedding into an ordered semigroup which is obtained from the Elliott invariant in a functorial manner. We conjecture that this embedding is an isomorphism, and prove the conjecture in several cases. In these same cases (all of them Z-stable algebras) we prove that the Elliott conjecture in its strongest form is equivalent to a conjecture which appears much weaker. Outside the class of Z-stable C*-algebras, this weaker conjecture has no known counterexamples, and it is plausible that none exist. Thus, we reconcile the still intact principle of Elliott's classification conjecture (that K-theoretic invariants will classify separable and nuclear C*-algebras) with the recent appearance of counterexamples to its strongest concrete form.
Abstract:
Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of one variable from the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules stating that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs of degree of accuracy and degree of complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that determining whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that obtains a certain value of R² is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
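Concretely, the hardness claim concerns deciding whether some subset of at most k regressors achieves a given R². A brute-force check, sketched below on purely hypothetical data, has to examine on the order of C(n, k) subsets; this combinatorial growth is the intuition behind the intractability. The sketch illustrates the decision problem itself, not the paper's proof.

```python
import numpy as np
from itertools import combinations

def some_subset_reaches_r2(X, y, k, r2_target):
    """Brute-force check: does any subset of at most k columns of X reach r2_target?

    The number of candidate subsets grows like C(n, k), which is why exhaustive
    search quickly becomes infeasible as n and k grow.
    """
    n = X.shape[1]
    tss = np.sum((y - y.mean()) ** 2)
    for size in range(1, k + 1):
        for cols in combinations(range(n), size):
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if 1.0 - rss / tss >= r2_target:
                return True
    return False

# Illustrative data only:
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))
y = X[:, 2] - 0.5 * X[:, 7] + rng.normal(scale=0.1, size=100)
print(some_subset_reaches_r2(X, y, k=2, r2_target=0.9))  # True for this toy example
```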
Abstract:
Consider a Riemannian manifold equipped with an infinitesimal isometry. For this setup, a unified treatment is provided, solely in the language of Riemannian geometry, of techniques in reduction, linearization, and stability of relative equilibria. In particular, for mechanical control systems, an explicit characterization is given for the manner in which reduction by an infinitesimal isometry, and linearization along a controlled trajectory "commute." As part of the development, relationships are derived between the Jacobi equation of geodesic variation and concepts from reduction theory, such as the curvature of the mechanical connection and the effective potential. As an application of our techniques, fiber and base stability of relative equilibria are studied. The paper also serves as a tutorial of Riemannian geometric methods applicable in the intersection of mechanics and control theory.
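For reference, the Jacobi equation of geodesic variation mentioned above has the standard textbook form shown below, where J is the variation field along a geodesic γ, D/dt denotes covariant differentiation along γ, and R is the Riemann curvature tensor; the paper's version is adapted to controlled trajectories of the mechanical system rather than geodesics.

```latex
\frac{D^{2}J}{dt^{2}} + R(J,\dot\gamma)\,\dot\gamma = 0
```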
Abstract:
We prove that the Cuntz semigroup is recovered functorially from the Elliott invariant for a large class of C*-algebras. In particular, our results apply to the largest class of simple C*-algebras for which K-theoretic classification can be hoped for. This work has three significant consequences. First, it provides new conceptual insight into Elliott's classification program, proving that the usual form of the Elliott conjecture is equivalent, among Z-stable algebras, to a conjecture which is in general substantially weaker and for which there are no known counterexamples. Second and third, it resolves, for the class of algebras above, two conjectures of Blackadar and Handelman concerning the basic structure of dimension functions on C*-algebras. We also prove in passing that the Cuntz-Pedersen semigroup is recovered functorially from the Elliott invariant for all simple unital C*-algebras of interest.
Abstract:
We analyse the implications of optimal taxation for the stochastic behaviour of debt. We show that when a government pursues an optimal fiscal policy under complete markets, the value of debt has the same or lower persistence than other variables in the economy and declines in response to shocks that cause the deficit to increase. By contrast, under incomplete markets debt shows more persistence than other variables and increases in response to shocks that cause a higher deficit. Data for US government debt reveal results diametrically opposite to those implied by complete markets and are much more supportive of bond market incompleteness.
Abstract:
Assuming that the role of debt management is to provide hedging against fiscal shocks, we consider three questions: i) what indicators can be used to assess the performance of debt management? ii) how well have historical debt management policies performed? and iii) how is that performance affected by variations in debt issuance? We consider these questions using OECD data on the market value of government debt between 1970 and 2000. Motivated by both the optimal taxation literature and broad considerations of debt stability, we propose a range of performance indicators for debt management. We evaluate these using Monte Carlo analysis and find that those based on the relative persistence of debt perform best. Calculating these measures for the OECD data provides only limited evidence that debt management has helped insulate policy against unexpected fiscal shocks. We also find that the degree of fiscal insurance achieved is not well connected to cross-country variations in debt issuance patterns. Given the limited volatility observed in the yield curve, the relatively small dispersion of debt management practices across countries makes little difference to the realised degree of fiscal insurance.
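One simple way to read a "relative persistence of debt" indicator is to compare the first-order autocorrelation of the market value of debt with that of the primary deficit: under full fiscal insurance, debt should be no more persistent than the underlying fiscal shocks. The Python sketch below computes such a ratio on made-up series; the indicator definitions actually used in the paper may differ.

```python
import numpy as np

def ar1_coefficient(x):
    """First-order autocorrelation, used here as a simple persistence measure."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

def relative_persistence(debt, deficit):
    """Persistence of debt relative to the deficit.

    Values well above 1 suggest fiscal shocks are absorbed permanently into
    debt (little fiscal insurance); values near 1 or below suggest hedging.
    """
    return ar1_coefficient(debt) / ar1_coefficient(deficit)

# Illustrative series only (not OECD data): an AR(1) deficit and a debt stock
# that simply accumulates it, i.e. no fiscal insurance at all.
rng = np.random.default_rng(1)
eps = rng.normal(size=200)
deficit = np.zeros(200)
for t in range(1, 200):
    deficit[t] = 0.5 * deficit[t - 1] + eps[t]
debt = np.cumsum(deficit)
print(relative_persistence(debt, deficit))  # well above 1 for this example
```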
Abstract:
BACKGROUND: Highway maintenance workers are constantly and simultaneously exposed to traffic-related particle and noise emissions, and both exposures have been linked to increased cardiovascular morbidity and mortality in population-based epidemiological studies. OBJECTIVES: We aimed to investigate short-term health effects related to particle and noise exposure. METHODS: We monitored 18 maintenance workers during as many as five 24-hour periods each, for a total of 50 observation days. We measured their exposure to fine particulate matter (PM2.5), ultrafine particles and noise, as well as the following cardiopulmonary health endpoints: blood pressure, pro-inflammatory and pro-thrombotic markers in the blood, lung function and fractional exhaled nitric oxide (FeNO), measured approximately 15 hours post-work. Heart rate variability was assessed during a sleep period approximately 10 hours post-work. RESULTS: PM2.5 exposure was significantly associated with C-reactive protein and serum amyloid A, and negatively associated with tumor necrosis factor α. None of the particle metrics was significantly associated with von Willebrand factor or tissue factor expression. PM2.5 and work noise were associated with markers of increased heart rate variability, and with increased HF and LF power. Systolic and diastolic blood pressure on the following morning were significantly associated with noise exposure after work, and non-significantly associated with PM2.5. We observed no significant associations between any of the exposures and lung function or FeNO. CONCLUSIONS: Our findings suggest that exposure to particles and noise during highway maintenance work might pose a cardiovascular health risk. Actions to reduce these exposures could lead to better health for this population of workers.
Abstract:
A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete market approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long term debt and invest in short term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find that the complete market approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across these simulations we find no presumption that governments should issue long term debt: policy recommendations can easily be reversed through small perturbations in the specification of shocks or small variations in the maturity of the bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. allowing for transaction costs, liquidity effects, etc. Until these features are fully incorporated, we remain in search of a theory of debt management capable of providing robust policy insights.
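The ill-conditioning argument can be seen in a tiny example: completing the market with bonds whose payoffs are highly correlated means solving a nearly singular linear system, and the resulting positions blow up to large multiples of the funding need. The Python sketch below is a stylized illustration of that mechanism with made-up payoffs, not the paper's calibrated model.

```python
import numpy as np

# Two states tomorrow; the government wants its portfolio payoff to match a
# state-contingent funding need (the hedging target).
target = np.array([1.00, 1.05])

# Payoffs of a short and a long bond in the two states. Because the two bonds'
# returns are highly correlated, the payoff matrix is nearly singular.
payoffs = np.array([[1.00, 1.000],
                    [1.01, 1.012]])   # columns: short bond, long bond

positions = np.linalg.solve(payoffs, target)
print("condition number:", np.linalg.cond(payoffs))
print("bond positions:", positions)   # huge offsetting long/short positions (~ -19 and +20)
```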