Abstract:
The theory and approach of broadband teleseismic body-waveform inversion are elaborated in this paper, and methods for determining crustal structure are developed. Based on teleseismic P-wave data, the theoretical image of the P-wave radial component is calculated via the convolution of the teleseismic P-wave vertical component with the transfer function, and thereby a P-waveform inversion method is built. The applied results show that the approach is effective and stable and that its resolution is high. Accurate and reliable teleseismic P waveforms recorded by CDSN and IRIS are used to obtain lithospheric transfer functions for China and its vicinity; the lithospheric structure of this region is derived through the inversion of reliable transfer functions; new knowledge about the deep structure of China and its vicinity is obtained; and reliable seismological evidence is provided for revealing geodynamic evolution processes and establishing continental collision theory. The major studies are as follows. Two important methods for studying crustal and upper-mantle structure, body-wave travel-time inversion and waveform modeling, are reviewed systematically. Based on ray theory, travel-time inversion is characterized by simplicity: a preliminary crustal and upper-mantle velocity model can be obtained by 1-D travel-time inversion, which provides the reference model for studying earthquake location, focal mechanisms, and the fine structure of the crust and upper mantle. The large-scale lateral inhomogeneity of the crust and upper mantle can be resolved by three-dimensional travel-time seismic tomography. Based on elastic dynamics, waveform modeling, through fitting theoretical seismograms to observed seismograms, can interpret waveform details and further uncover the one-dimensional fine structure and lateral variation of the crust and upper mantle, especially the properties of media in zones where ray theory is singular.
Both travel-time inversion and waveform modeling rest on certain approximations and have their respective advantages and disadvantages, yet they provide convincing structural information for elucidating the physical and chemical features and geodynamic processes of the crust and upper mantle. Direct waves, surface waves, and refracted waves have low resolution in seismic velocity transition zones and are therefore inadequate for studying seismic discontinuities. By contrast, converted and reflected waves, which sample the discontinuities directly, must be carefully picked from seismograms to constrain the velocity transition zones. Converted and reflected waves can be used to study not only crustal structure but also upper-mantle discontinuities. There are a number of global and regional seismic discontinuities in the crust and upper mantle, which play a significant role in understanding the physical and chemical properties and geodynamic processes of the crust and upper mantle. Broadband teleseismic P-waveform inversion is studied in particular. Teleseismic P waveforms contain much information related to the source time function, near-source structure, propagation effects through the mantle, receiver structure, and instrument response. The receiver function is isolated from the teleseismic P waveform through rotation of the horizontal components into the ray direction and deconvolution of the vertical component from the radial and tangential components of ground motion; the resulting time series is dominated by local receiver-structure effects and is nearly independent of source and deep-mantle effects. The receiver function is a horizontal response in which multiple P-wave reflections are suppressed while the direct wave and P-to-S converted waves are retained, and it is sensitive to the vertical variation of S-wave velocity.
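The deconvolution step described above can be sketched numerically. The following toy example is only an illustration of frequency-domain spectral division with water-level stabilisation, a standard receiver-function recipe; the sampling interval, water level, Gaussian width and synthetic arrivals are all invented and are not the paper's actual processing parameters.

```python
import numpy as np

def waterlevel_deconv(radial, vertical, dt, wl=0.01, gauss=2.0):
    """Receiver function by frequency-domain spectral division.

    The vertical-component spectrum is divided out of the radial-component
    spectrum; a water level keeps the denominator away from zero and a
    Gaussian low-pass suppresses high-frequency noise.
    """
    n = len(radial)
    R, V = np.fft.rfft(radial), np.fft.rfft(vertical)
    f = np.fft.rfftfreq(n, dt)
    denom = np.abs(V) ** 2
    denom = np.maximum(denom, wl * denom.max())      # water-level stabilisation
    G = np.exp(-(np.pi * f / gauss) ** 2)            # Gaussian low-pass
    return np.fft.irfft(R * np.conj(V) / denom * G, n)

# Synthetic test: the radial trace is the vertical trace convolved with a
# two-arrival receiver response (direct P plus a weaker phase 4 s later).
dt = 0.05
vertical = np.zeros(512)
vertical[50] = 1.0
response = np.zeros(512)
response[0], response[80] = 1.0, 0.4
radial = np.convolve(vertical, response)[:512]

rf = waterlevel_deconv(radial, vertical, dt)
# rf peaks at lag 0 (direct P) and at the 4 s conversion lag.
```

The recovered time series peaks at zero lag and at the lag of the converted arrival, with the source spike removed, which is exactly the sense in which the receiver function is "dominated by local receiver-structure effects".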
The velocity structure beneath a seismic station responds differently to the radial and vertical components of an incident teleseismic P wave. To avoid the limitations of a simplified assumption about the vertical response, the receiver function method is refined. In the frequency domain, the transfer function is expressed as the ratio of the radial and vertical responses of the medium to the P wave. In the time domain, the radial synthetic waveform can be obtained by convolving the transfer function with the vertical waveform. To overcome numerical instability, the generalized reflection and transmission coefficient matrix method is applied to calculate synthetic waveforms so that all multiple reflections and phase-conversion responses are included. A new inversion method, the VFSA-LM method, is used in this study; it combines very fast simulated annealing (VFSA) with damped least-squares inversion (LM). A synthetic waveform inversion test confirms its effectiveness and efficiency. Broadband teleseismic P-waveform inversion is applied to the lithospheric velocity structure of China and its vicinity. From high-quality CDSN and IRIS data, we obtained an outline map showing the distribution of Asian continental crustal thickness. Based on these results, the distribution of crustal thickness and the outline of crustal structure under the Asian continent have been analyzed, and this paper advances the principal characteristics of the Asian continental crust. There exist four vast areas of relatively minor variation in crustal thickness, namely the northern, eastern, southern, and central areas of the Asian crust. As a byproduct, earthquake location, a basic issue in seismology, is discussed. Because of the strong trade-off between the assumed origin time and focal depth and the nonlinearity of the inversion problem, this issue is not fully settled.
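The VFSA half of such a scheme can be sketched in a few lines. The toy below is only an illustration of Ingber's very fast simulated annealing move and acceptance rule on an invented two-parameter misfit; the damped least-squares (LM) polishing stage, the bounds, the cooling schedule and the "true" model are all assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

def vfsa_neighbour(x, T, lo, hi):
    # Ingber's VFSA perturbation: Cauchy-like steps whose width shrinks with T
    u = rng.random(x.size)
    y = np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2 * u - 1) - 1.0)
    return np.clip(x + y * (hi - lo), lo, hi)

def vfsa(misfit, lo, hi, T0=1.0, decay=0.997, iters=5000):
    x = lo + (hi - lo) * rng.random(lo.size)
    fx = misfit(x)
    best, fbest, T = x.copy(), fx, T0
    for _ in range(iters):
        cand = vfsa_neighbour(x, T, lo, hi)
        fc = misfit(cand)
        # Metropolis acceptance at the current temperature
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
        T *= decay
    return best, fbest

# Toy "waveform misfit": distance to a hidden two-layer S-velocity model.
v_true = np.array([3.5, 4.5])                        # km/s, invented
lo, hi = np.array([2.5, 3.5]), np.array([4.5, 5.5])
model, residual = vfsa(lambda v: np.sum((v - v_true) ** 2), lo, hi)
```

In a VFSA-LM hybrid, the annealing output `model` would then seed a damped least-squares refinement, trading the global-search robustness of annealing for the fast local convergence of LM.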
To address this problem, a new earthquake location method named the SAMS method is presented, in which the objective function is built from the absolute values of the travel-time residuals together with the arrival times, and fast simulated annealing is used for the inversion. Applied to the relocation of the Chi-Chi, Taiwan event of 21 September 1999, the results show that the SAMS method not only reduces the effect of the trade-off between origin time and focal depth but also attains better stability and resolving power. At the end of the paper, the inverse Q-filtering method for compensating attenuation and frequency dispersion in depth-domain seismic sections is discussed. According to forward and inverse results on synthetic seismic records, our depth-domain Q-filtering operator is consistent with wave propagation in absorbing media: it accounts not only for the absorption of the waves by the media but also for the accompanying waveform deformation, namely the frequency dispersion of body waves. Two post-stack profiles of about 60 km from a neritic area of China were processed; the results show that, after inverse Q filtering in the depth domain, the wavelet width in the middle and deep layers is compressed, resolution and signal-to-noise ratio are enhanced, and the primary shape and energy distribution of the profile are retained.
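The attenuation-plus-dispersion compensation can be illustrated with a constant-Q (Kolsky/Futterman) operator. The sketch below works in the time domain rather than the paper's depth domain, and the wavelet frequency, travel time, Q, reference frequency and stabilisation constant are all invented for the demonstration; the stabilised inverse is one standard way to keep the compensation gain from blowing up at high frequencies.

```python
import numpy as np

def constant_q(trace, dt, t_travel, Q, fr=25.0, inverse=False, eps=0.01):
    """Constant-Q operator with amplitude decay exp(-pi f t / Q) and
    Futterman-type velocity dispersion; the inverse is stabilised so the
    compensation gain cannot blow up at high frequencies."""
    n = len(trace)
    S = np.fft.rfft(trace)
    f = np.fft.rfftfreq(n, dt)
    amp = np.exp(-np.pi * f * t_travel / Q)                 # absorption
    with np.errstate(divide="ignore", invalid="ignore"):    # f = 0 bin
        dphi = np.where(f > 0, 2.0 * f * t_travel * np.log(f / fr) / Q, 0.0)
    op = amp * np.exp(1j * dphi)                            # dispersion phase
    if inverse:
        op = np.conj(op) / (np.abs(op) ** 2 + eps ** 2)     # stabilised inverse
    return np.fft.irfft(S * op, n)

def ricker(f0, dt, n):
    t = (np.arange(n) - n // 2) * dt
    x = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * x) * np.exp(-x)

# Attenuate a 25 Hz wavelet over 0.5 s of travel time at Q = 80, then compensate.
dt, n = 0.002, 512
w = ricker(25.0, dt, n)
blurred = constant_q(w, dt, t_travel=0.5, Q=80.0)
restored = constant_q(blurred, dt, t_travel=0.5, Q=80.0, inverse=True)
```

After the inverse pass the wavelet is re-compressed to nearly its original width, which is the effect the abstract describes for the middle and deep layers of the processed profiles.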
Abstract:
Brand image is the set of attributes and associations of a brand in consumers' minds; it is the subjective reflection of the brand. This paper explored the factors of the brand-image system and their weights. As traditional means of evaluating weight coefficients are imperfect, a new method, conjoint analysis, was attempted. The factors of brand image were explored through a questionnaire. Sports sneakers, toothpaste, and personal stereos were chosen as product samples, and four hundred and twenty university students from Tangshan and Beijing served as subjects (each person evaluated two kinds of products). The first two products were student necessities: sports sneakers were a highly conspicuous product and toothpaste a low-conspicuousness product, while the personal stereo represented development and entertainment products. Several brand-image factors for each of the three products were extracted with factor analysis. To explore the weights of the brand-image factors, a comparison of the factor contribution ratio method, the holistic quartation method, and conjoint analysis was made. Twenty university students evaluated the weights of the image factors of the three kinds of brand with the holistic quartation method, and then gave the weights for the personal stereo with conjoint analysis. Product function, advertising and propaganda, symbolic meaning, market orientation, brand appetency, and consuming experience are the factors of sports-sneaker brand image. Product function, advertising and propaganda, market orientation, product grade, and corporation image are the factors of toothpaste brand image. Corporation image and product function, advertising and propaganda, consuming experience, symbolic meaning, and price-to-function ratio are the factors of personal-stereo brand image.
Thus the hypothesis was supported that brand image is an ordinal and organic system: "ordinal" means that the weights of the factors differ, and "organic" means that brand image can be decomposed into several factors belonging to function components and meaning components. Function components are the factors concerning physical characteristics and function, called "hard factors"; meaning components are those that express consumers' personality, values and lifestyle, called "soft factors". The research also supported the following hypotheses: the factor structures of brand image for different product categories share commonalities as well as individual features, and the function components of low-conspicuousness products are more important than those of highly conspicuous products. The exploration of conjoint analysis is where this paper seeks to make a creative contribution.
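The weight-estimation logic of conjoint analysis can be sketched briefly. The profiles, factor names and ratings below are invented for illustration (they are not the study's stimuli or data); the point is only the mechanics: part-worth utilities come from a least-squares fit to profile ratings, and attribute weights come from the relative ranges of the part-worths.

```python
import numpy as np

# Hypothetical full-profile conjoint task: 8 profiles described by three
# binary brand-image factors.
# Columns: product function, symbolic meaning, brand appetency (1 = high level).
profiles = np.array([[f, s, a] for f in (0, 1) for s in (0, 1) for a in (0, 1)])
ratings = np.array([2.0, 3.0, 3.5, 4.5, 4.0, 5.0, 5.5, 6.5])  # one respondent

# Part-worth utilities by least squares on dummy-coded attributes.
X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
part_worths = coef[1:]

# Attribute importance: each part-worth range over the sum of ranges
# (with binary levels the range equals the absolute part-worth).
weights = np.abs(part_worths) / np.abs(part_worths).sum()
```

Unlike direct point-allocation methods such as holistic quartation, the weights here are derived from trade-offs the respondent makes between whole profiles, which is the methodological advantage the paper explores.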
Abstract:
Both multilayer perceptrons (MLP) and Generalized Radial Basis Function (GRBF) networks have good approximation properties, theoretically and experimentally. Are they related? The main point of this paper is to show that for normalized inputs, multilayer perceptron networks are radial function networks (albeit with a non-standard radial function). This provides an interpretation of the weights w as centers t of the radial function network, and therefore as equivalent to templates. This insight may be useful for practical applications, including better initialization procedures for MLPs. In the remainder of the paper, we discuss the relation between the radial functions that correspond to the sigmoid for normalized inputs and well-behaved radial basis functions, such as the Gaussian. In particular, we observe that the radial function associated with the sigmoid is a good approximation to Gaussian basis functions for a range of values of the bias parameter. The implication is that an MLP network can always simulate a Gaussian GRBF network (with the same number of units but fewer parameters); the converse is true only for certain values of the bias parameter. Numerical experiments indicate that this constraint is not always satisfied in practice by MLP networks trained with backpropagation. Multiscale GRBF networks, on the other hand, can approximate MLP networks with a similar number of parameters.
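The identity behind the main claim is easy to check numerically: for unit-norm inputs, w·x = (1 + ||w||² − ||x − w||²)/2, so a sigmoid unit's output depends on x only through its distance to the weight vector. The dimensions, weight vector and bias below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# A single sigmoid unit with arbitrary weights w and bias b.
w = rng.normal(size=5)
b = 0.3

# Inputs normalized onto the unit sphere.
x = rng.normal(size=(1000, 5))
x /= np.linalg.norm(x, axis=1, keepdims=True)

pre = x @ w + b
dist = np.linalg.norm(x - w, axis=1)

# For ||x|| = 1:  w.x = (1 + ||w||^2 - ||x - w||^2) / 2,
# so the pre-activation is a function of the distance to w alone.
recovered = (1.0 + w @ w - dist ** 2) / 2.0 + b
assert np.allclose(pre, recovered)

out = sigmoid(pre)   # a (non-standard) radial function of dist, centred at w
```

The output decreases monotonically with distance from w, which is what licenses reading the weight vector as a center, i.e. a template.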
Abstract:
Computers and Thought are the two categories that together define Artificial Intelligence as a discipline. It is generally accepted that work in Artificial Intelligence over the last thirty years has had a strong influence on aspects of computer architectures. In this paper we also make the converse claim: that the state of computer architecture has been a strong influence on our models of thought. The Von Neumann model of computation has led Artificial Intelligence in particular directions. Intelligence in biological systems is completely different. Recent work in behavior-based Artificial Intelligence has produced new models of intelligence that are much closer in spirit to biological systems. The non-Von Neumann computational models they use share many characteristics with biological computation.
Abstract:
In this thesis a novel theory of electrocatalysis at metal (especially noble metal)/solution interfaces was developed based on the assumption of metal adatom/incipient hydrous oxide cyclic redox transitions. Adatoms are considered metastable, low-coverage species that oxidise in situ at potentials often significantly cathodic to the regular metal/metal oxide transition. Because the adatom coverage is so low, the electrochemical or spectroscopic response for oxidation is frequently overlooked; however, the product of such oxidation, referred to here as incipient hydrous oxide, seems to be the important mediator in a wide variety of electrocatalytically demanding oxidation processes. Conversely, electrocatalytically demanding reductions apparently occur only at adatom sites at the metal/solution interface; such reactions generally occur only at potentials below, i.e. more cathodic than, the adatom/hydrous oxide transition. It was established that while silver in base oxidises in a regular manner (forming initially OHads species) at potentials above 1.0 V (RHE), there is a minor redox transition at much lower potentials, ca. 0.35 V (RHE). The latter process is assumed to be an adatom/hydrous oxide transition, and the low-coverage Ag(I) hydrous oxide (or hydroxide) species was shown to trigger or mediate the oxidation of aldehydes, e.g. HCHO. The results of a study of this system were shown to be in good agreement with a kinetic model based on the above assumptions; the similarity between this type of behaviour and enzyme-catalysed processes (both systems involve interfacial active sites) was pointed out. Similar behaviour was established for gold, where both Au(I) and Au(III) hydrous oxide mediators were shown to be the effective oxidants for different organic species. One of the most active electrocatalytic materials known at the present time is platinum.
While the classical view of this high activity is based on the concept of activated chemisorption (and the important role of the latter is not discounted here), a vital role is attributed to the adatom/hydrous oxide transition. It is suggested that the well-known intermediate (or anomalous) peak in the hydrogen region of the cyclic voltammogram for platinum is in fact due to an adatom/hydrous oxide transition. Using potential-stepping procedures to minimise the effect of deactivating (COads) species, it was shown that the onset (anodic sweep) and termination (cathodic sweep) potentials for the oxidation of a wide variety of organics coincided with the potential of the intermediate peak. The converse was also shown to apply: sluggish reduction reactions, which involve interaction with metal adatoms, occur at significant rates only in the region below the hydrous oxide/adatom transition.
Abstract:
We present a novel system to be used in the rehabilitation of patients with forearm injuries. The system uses surface electromyography (sEMG) recordings from a wireless sleeve to control video games designed to provide engaging biofeedback to the user. An integrated hardware/software system uses a neural net to classify the signals from a user’s muscles as they perform one of a number of common forearm physical therapy exercises. These classifications are used as input for a suite of video games that have been custom-designed to hold the patient’s attention and decrease the risk of noncompliance with the physical therapy regimen necessary to regain full function in the injured limb. The data is transmitted wirelessly from the on-sleeve board to a laptop computer using a custom-designed signal-processing algorithm that filters and compresses the data prior to transmission. We believe that this system has the potential to significantly improve the patient experience and efficacy of physical therapy using biofeedback that leverages the compelling nature of video games.
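The abstract does not specify the filtering or the network architecture, so the sketch below is a deliberately generic stand-in for the classification stage: windowed RMS envelopes as sEMG amplitude features and a nearest-centroid rule in place of the neural net, run on synthetic two-channel signals. Everything here (channel count, window length, noise levels, the "gestures" themselves) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def window_rms(signal, win=200):
    # RMS envelope over non-overlapping windows: a common sEMG amplitude feature.
    n = len(signal) // win
    return np.sqrt((signal[: n * win].reshape(n, win) ** 2).mean(axis=1))

def gesture(active, n_ch=2, n_samp=4000):
    # Synthetic "exercise": the active channel carries strong activity, the
    # others near-baseline noise. Real sEMG would be bandpass-filtered first.
    chans = [rng.normal(0.0, 1.0 if k == active else 0.1, n_samp)
             for k in range(n_ch)]
    return np.stack([window_rms(c) for c in chans], axis=1)  # (windows, channels)

# "Train" a nearest-centroid classifier on five repetitions of each exercise.
train = {k: np.concatenate([gesture(k) for _ in range(5)]) for k in (0, 1)}
centroids = {k: feats.mean(axis=0) for k, feats in train.items()}

def classify(feature_vec):
    return min(centroids, key=lambda k: np.linalg.norm(feature_vec - centroids[k]))

pred = classify(gesture(1).mean(axis=0))
```

In the actual system the classifier output would drive the game input; the compression mentioned in the abstract corresponds naturally to transmitting windowed features like these rather than raw samples.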
Abstract:
Delivering a lecture requires confidence, sound knowledge, and well-developed teaching skills (Cooper and Simonds, 2007; Quinn and Hughes, 2007). However, practitioners who are new to lecturing large groups in higher education may initially lack the confidence to do so, which can manifest itself in their verbal and non-verbal cues and the fluency of their teaching skills. This results in the perception that students can identify the confident and non-confident teacher during a lecture (Street, 2007), potentially contributing to a lecturer's level of anxiety prior to, and during, a lecture. Therefore, in the current educational climate of consumerisation, with the increased evaluation of teaching by students, having the ability to deliver high-quality, informed, and interesting lectures assumes greater significance for both lecturers and universities (Carr, 2007; Higher Education Funding Council, 2008; Glass et al., 2006). This paper presents both the quantitative and qualitative data from a two-phase mixed-method study with 75 nurse lecturers and 62 nursing students in one university in the United Kingdom. The study investigated the notion that lecturing has similarities to acting (Street, 2007). The findings presented here concern how students perceived lecturers' level of confidence and how lecturers believed they demonstrated confidence. In phase one a specifically designed questionnaire was distributed to both lecturers and students and a response rate of 91% (n=125) was achieved, while in phase two 12 in-depth semi-structured interviews were conducted with lecturers. Results suggested that students in a lecture could identify whether the lecturer was confident by the way they performed a lecture. Students identified 57 manifestations of non-confidence and lecturers identified 85, while 57 manifestations of confidence were identified by students and 88 by lecturers.
Overall, these fell into 12 main converse categories, ranging from body language to the use of space within the room. Both students and lecturers ranked body language, vocal qualities, delivery skills, involving the students, and the ability to share knowledge as the most evident manifestations of confidence. Elements like good eye contact, smiling, speaking clearly, and being fluent in the use of media resources were all seen as manifestations of confidence; conversely, if these were poorly executed, a presentation of under-confidence was evident. Furthermore, an apparently enthusiastic lecturer was clearly seen as a highly confident one, secure in their knowledge base and teaching abilities: "Some lecturers do appear enthusiastic but others don't. I think the ones that do know what they are talking about, you can see it in their voice and in their lively body language. I think they are also good at involving the students. I think the good ones are able to turn boring subjects into lively and interesting ones." (Student 50) Significantly more lecturers than students felt the lecturer should appear confident when lecturing. The lecturers stated it was particularly important to do so when they did not feel confident, because they were concerned with appearing capable. It seems that these students and lecturers perceived that expressive and apparently confident lecturers can make a positive impact on student groups in terms of involvement in lectures; the data also suggested the reverse for the under-confident lecturer. Findings from phase two indicated that these lecturers assumed a persona when lecturing, particularly, but not exclusively, when they were nervous. These lecturers went through a process of assuming and maintaining this persona before and during a lecture as a way of promoting not only their internal perception of confidence but also their outward manifestation of it.
Although assuming a convincing persona may involve a degree of deception, provided the knowledge communicated is accurate the deception may aid rather than hinder learning, because it enhances the delivery of a lecture. Therefore, the deception of acting a little more confidently than one feels might be justified when the lecturer knows the knowledge they are communicating is correct, unlike the Dr Fox Effect, where the person delivering a lecture is an actor who does not know the subject in any detail or depth and where the deception cannot be justified (Naftulin et al., 1973). In conclusion, these students and lecturers perceive that confident and enthusiastic lecturers communicate their passion for the subject in an interesting and meaningful manner through the use of their voice, body, space, and interactions, in a way that shows confidence in their knowledge as well as their teaching abilities. If lecturers can take a step back to consider how they deliver lectures in apparently confident ways, this may increase their ability to engage their students and not only help them be perceived as good lecturers but also contribute to the genuine act of education.
Abstract:
The lengths, wet and dry weights, and nitrogen and carbon contents of fresh, frozen and formaldehyde-fixed specimens of Calanus helgolandicus (Claus) were determined. Samples were collected during May 1980 in the Celtic Sea. Individual Copepodite Stages 3, 4 and 5 and Adult Male and Female Stage 6 were measured and analysed, and 36 linear regression equations were derived for these variables, together with mean values, standard deviations and 95% confidence limits. Nitrogen values in the fresh material, expressed as a percentage of dry weight, ranged from 8.08%±0.80 (Copepodite Stage 3) to 10.89%±0.27 (adult female); carbon values ranged from 41.6%±3.05 (mean ±95% confidence limits) for Copepodite Stage 3 to 50.97%±2.63 in Copepodite Stage 5. The adult females had a high nitrogen and relatively low carbon content, while the converse was true for Stage 5 copepodites. There was a loss of dry weight from the frozen samples (57%) and the fixed samples (38%) compared with the mean fresh dry weight of all stages. The material lost from the copepods was rich in nitrogen; thus, artificially high percentage carbon values were determined from the frozen and fixed samples (52.0 to 60.3% and 44.7 to 58.5%, respectively).
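Regressions of this kind are conventionally fitted on log-transformed length-weight data, with 95% confidence limits on the fitted coefficients. The sketch below uses invented data (the lengths, weights and allometric parameters are illustrative, not the Celtic Sea measurements) and a normal approximation in place of the t-based limits the study would have used.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical length (mm) vs dry weight (ug) data for one copepodite stage.
length = rng.uniform(2.0, 3.2, 50)
weight = 6.0 * length ** 3.0 * np.exp(rng.normal(0.0, 0.05, 50))

# Fit log W = log a + b log L by ordinary least squares.
x, y = np.log(length), np.log(weight)
n = len(x)
sxx = np.sum((x - x.mean()) ** 2)
b = np.sum((x - x.mean()) * (y - y.mean())) / sxx      # allometric exponent
a = y.mean() - b * x.mean()                            # log intercept
resid = y - (a + b * x)
se_b = np.sqrt(resid @ resid / (n - 2) / sxx)          # standard error of slope
half_width = 1.96 * se_b   # normal approximation to the t-based 95% limits
```

Each of the study's 36 equations would pair one such slope and intercept with its confidence limits for a given stage and variable pair.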
Abstract:
The vertical distributions of the spring populations of Calanus finmarchicus (Gunnerus) and C. helgolandicus Claus are described and compared. The differences we observed between the two species have probably confused the understanding of the vertical distribution and development of the populations of Calanus spp. in the shelf seas around the United Kingdom where the species occur together. The results imply that these two congeneric species have different behaviour patterns which minimise interspecific competition where the species have sympatric distributions. C. finmarchicus has its younger development stages overlying the older stages in the water column. In C. helgolandicus the converse is true; i.e., the majority of the populations of Stage I and II copepodites of the first spring generations are found below the thermocline. It is also suggested that the different behaviour patterns lead to different feeding regimes and strategies.
Abstract:
The oceans and coastal seas provide mankind with many benefits, including food for around a third of the global population, the air that we breathe, and a climate system that enables habitation of much of the planet. The converse, however, is that natural events (such as hurricanes, severe storms and tsunamis) can have devastating impacts on coastal populations, while pollution of the seas by pathogens and toxic waste can cause illness and death in humans and animals. Harmful effects from biogenic toxins produced by algal blooms (HABs) and from the pathogens associated with microbial pollution are also a health hazard in seafood and through direct contact with water. The overall global burden of human disease caused by sewage pollution of coastal waters has been estimated at 4 million lost person-years annually. Finally, the impacts of all of these issues will be exacerbated by climate change. A holistic systems approach is needed: one that considers whole ecosystems and their sustainability, such as integrated coastal zone management, in order to address the highly interconnected scientific challenges of increased human population pressure, pollution and over-exploitation of food (and other) resources as drivers of adverse ecological, social and economic impacts. There is also an urgent and critical requirement for effective and integrated public health solutions to be developed through the formulation of politically and environmentally meaningful policies. The research community required to address "Oceans & Human Health" in Europe is currently very fragmented, and recognition by policy makers of some of the problems outlined above is limited. Nevertheless, relevant key policy issues for governments worldwide include the reduction of the burden of disease (including the early detection of emerging pathogens and other threats) and improving the quality of the global environment.
Failure to address these issues effectively will impact adversely on efforts to alleviate poverty, sustain the availability of environmental goods and services, and improve health and social and economic stability, and will thus impinge on many policy decisions, both nationally and internationally. Knowledge exchange (KE) will be a key element of any ensuing research. KE will facilitate the integration of biological, medical, epidemiological, social and economic disciplines, as well as the emergence of synergies between seemingly unconnected areas of science and socio-economic issues, and will help to leverage knowledge transfer across the European Union (EU) and beyond. An integrated interdisciplinary systems approach is an effective way to bring together the appropriate groups of scientists, social scientists, economists, industry and other stakeholders with the policy formulators in order to address the complexities of interfacial problems in the area of environment and human health. The Marine Board of the European Science Foundation Working Group on "Oceans and Human Health" has been charged with developing a position paper on this topic with a view to identifying the scientific, social and economic challenges and making recommendations to the EU on policy-relevant research and development activities in this arena. This paper includes the background to health-related issues linked to the coastal environment and highlights the main arguments for an ecosystem-based whole-systems approach.
Abstract:
BACKGROUND: Hypertension and cognitive impairment are prevalent in older people. It is known that hypertension is a direct risk factor for vascular dementia and recent studies have suggested hypertension also impacts upon prevalence of Alzheimer's disease. The question is therefore whether treatment of hypertension lowers the rate of cognitive decline. OBJECTIVES: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease. SEARCH STRATEGY: The trials were identified through a search of CDCIG's Specialised Register, CENTRAL, MEDLINE, EMBASE, PsycINFO and CINAHL on 27 April 2005. SELECTION CRITERIA: Randomized, double-blind, placebo controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months. DATA COLLECTION AND ANALYSIS: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life. MAIN RESULTS: Three trials including 12,091 hypertensive subjects were identified. Average age was 72.8 years. Participants were recruited from industrialised countries. Mean blood pressure at entry across the studies was 170/84 mmHg. All trials instituted a stepped care approach to hypertension treatment, starting with a calcium-channel blocker, a diuretic or an angiotensin receptor blocker. The combined result of the three trials reporting incidence of dementia indicated no significant difference between treatment and placebo (Odds Ratio (OR) = 0.89, 95% CI 0.69, 1.16). 
Blood pressure reduction resulted in an 11% relative risk reduction of dementia in patients with no prior cerebrovascular disease, but this effect was not statistically significant (p = 0.38) and there was considerable heterogeneity between the trials. The combined results from the two trials reporting change in Mini-Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.10, 95% CI -0.03, 0.23). Both systolic and diastolic blood pressure levels were reduced significantly in the two trials assessing this outcome (WMD = -7.53, 95% CI -8.28, -6.77 for systolic blood pressure; WMD = -3.87, 95% CI -4.25, -3.50 for diastolic blood pressure). Two trials reported adverse effects requiring discontinuation of treatment, and the combined results indicated a significant benefit from placebo (OR = 1.18, 95% CI 1.06, 1.30). When analysed separately, however, more patients on placebo in SCOPE were likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality-of-life data could not be analysed in the three studies. There was difficulty with the control group in this review, as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison of the study drug against a usual antihypertensive regimen. AUTHORS' CONCLUSIONS: There was no convincing evidence from the trials identified that blood pressure lowering prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were significant problems with analysing the data, however, due to the number of patients lost to follow-up and the number of placebo patients given active treatment. This introduced bias.
More robust results may be obtained by analysing one year data to reduce differential drop-out or by conducting a meta-analysis using individual patient data.
Abstract:
Originally applying solely to chefs, waiters, dishwashers and the like, New York City (NYC) regulations governing cabaret employees were altered in 1943 to include musicians and entertainers, who until the late 1960s would be required to hold a NYC Cabaret Employee's Identification Card. The introduction of these notorious "police cards" occurred roughly contemporaneously with the emergence, in after-hours night clubs in Harlem, of a new and supposedly "wild", improvisatory brand of jazz: bebop. To the many rather practical theories of why these cards were introduced, this article adds a more abstract discussion, framed in terms of the relationship between suspicion and tradition and focusing on the differing essences of law and improvisatory jazz. While law breathes tradition and is suspicious of improvisation and unpredictability, the converse is true of jazz. Allusion to tradition in jazz improvisation is often viewed as a betrayal of its creative and spontaneous nature. And yet it is only through its departure from the stable transmission of past meaning that improvisation gains meaning. Law, in contrast, while appearing to be entirely composed of tradition, to transmit some sort of determinate and fixed meaning, is constantly betraying itself. As no two legal actions can be exactly the same, judges must improvise on tradition and past precedent every time they are asked to decide a case. Law can thus neither dispense with nor be completely determined by tradition. The legal decision instead lies on the border between what it "is" and what it otherwise could be, and every judicial act is, in some sense, a species of improvisation. This article uses the cabaret cards to explore this uncertain terrain between law and improvisation, between tradition and suspicion.
Resumo:
We study the solution concepts of partial cooperative Cournot-Nash equilibria and partial cooperative Stackelberg equilibria. The partial cooperative Cournot-Nash equilibrium is axiomatically characterized using notions of rationality, consistency, and converse consistency with regard to reduced games. We also establish sufficient conditions under which partial cooperative Cournot-Nash equilibria and partial cooperative Stackelberg equilibria exist in supermodular games. Finally, we provide an application to strategic network formation where such solution concepts may be useful.
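For a concrete sense of the non-cooperative benchmark, a Cournot-Nash equilibrium can be computed by iterating best responses. A minimal sketch for a linear-demand oligopoly; the demand and cost parameters below are illustrative assumptions, not taken from the paper:

```python
def cournot_best_response(a, b, costs, iterations=200):
    """Iterate best responses in a linear Cournot oligopoly.

    Inverse demand is P(Q) = a - b*Q; firm i with marginal cost c_i
    best-responds to the others' total output Q_-i with
    q_i = max(0, (a - c_i - b*Q_-i) / (2*b)).
    Repeated sweeps converge to the Cournot-Nash equilibrium here.
    """
    q = [0.0] * len(costs)
    for _ in range(iterations):
        for i, c in enumerate(costs):
            others = sum(q) - q[i]
            q[i] = max(0.0, (a - c - b * others) / (2 * b))
    return q

# Symmetric duopoly: the closed-form equilibrium output is
# (a - c) / (3*b) per firm, i.e. 30.0 for these parameters.
q = cournot_best_response(a=100.0, b=1.0, costs=[10.0, 10.0])
```

Sequential best-response updates are a contraction in this linear setting, so the iteration settles on the unique equilibrium; the paper's partial cooperative variants would modify which players best-respond and which jointly optimize.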
Resumo:
Background: This is an update of a previous review (McGuinness 2006). Hypertension and cognitive impairment are prevalent in older people. Hypertension is a direct risk factor for vascular dementia (VaD), and recent studies have suggested that hypertension affects the prevalence of Alzheimer's disease (AD). Does treatment of hypertension therefore prevent cognitive decline?
Objectives: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease.
Search strategy: The Specialized Register of the Cochrane Dementia and Cognitive Improvement Group, The Cochrane Library, MEDLINE, EMBASE, PsycINFO, CINAHL, LILACS as well as many trials databases and grey literature sources were searched on 13 February 2008 using the terms: hypertens$ OR anti-hypertens$.
Selection criteria: Randomized, double-blind, placebo-controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months.
Data collection and analysis: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life.
Main results: Four trials including 15,936 hypertensive subjects were identified. Average age was 75.4 years. Mean blood pressure at entry across the studies was 171/86 mmHg. The combined result of the four trials reporting incidence of dementia indicated no significant difference between treatment and placebo (236/7767 versus 259/7660, Odds Ratio (OR) = 0.89, 95% CI 0.74, 1.07) and there was considerable heterogeneity between the trials. The combined results from the three trials reporting change in Mini Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.42, 95% CI 0.30, 0.53). Both systolic and diastolic blood pressure levels were reduced significantly in the three trials assessing this outcome (WMD = -10.22, 95% CI -10.78, -9.66 for systolic blood pressure; WMD = -4.28, 95% CI -4.58, -3.98 for diastolic blood pressure). Three trials reported adverse effects requiring discontinuation of treatment and the combined results indicated no significant difference (OR = 1.01, 95% CI 0.92, 1.11). When analysed separately, however, more patients on placebo in Syst Eur 1997 were likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the four studies. Analysis of the included studies in this review was problematic as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison between the study drug against a usual antihypertensive regimen.
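For illustration, the combined odds ratio above can be approximated by a crude 2x2 calculation on the aggregated counts with a Woolf (log-normal) confidence interval. The review itself pools trial-level estimates, so this is only a sketch, but on these counts it lands close to the reported OR = 0.89 (95% CI 0.74, 1.07):

```python
import math

def odds_ratio_ci(e_t, n_t, e_c, n_c, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table of event counts.

    e_t/n_t: events/total in the treatment arm; e_c/n_c: control arm.
    The CI is computed on the log-odds scale and exponentiated back.
    """
    odds_ratio = (e_t / (n_t - e_t)) / (e_c / (n_c - e_c))
    se_log = math.sqrt(1/e_t + 1/(n_t - e_t) + 1/e_c + 1/(n_c - e_c))
    lo = math.exp(math.log(odds_ratio) - z * se_log)
    hi = math.exp(math.log(odds_ratio) + z * se_log)
    return odds_ratio, lo, hi

# Dementia incidence as reported: 236/7767 (treatment) vs 259/7660 (placebo).
or_, lo, hi = odds_ratio_ci(236, 7767, 259, 7660)
```

The interval straddles 1, matching the review's conclusion of no significant difference between treatment and placebo.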
Authors' conclusions: There is no convincing evidence from the trials identified that blood pressure lowering in late-life prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were, however, significant problems with analysing the data due to the number of patients lost to follow-up and the number of placebo patients who received active treatment. This introduced bias. More robust results may be obtained by conducting a meta-analysis using individual patient data.
Resumo:
The predominant fear in capital markets is that of a price spike. Commodity markets differ in that there is a fear of both upward and downward jumps; this results in implied volatility curves that display distinct shapes compared with equity markets. A novel functional data analysis (FDA) approach provides a framework to produce and interpret functional objects that characterise the underlying dynamics of oil future options. We use the FDA framework to examine implied volatility, jump risk, and pricing dynamics within crude oil markets. Examining a WTI crude oil sample for the 2007–2013 period, which includes the global financial crisis and the Arab Spring, strong evidence is found of converse jump dynamics during periods of demand- and supply-side weakness. This is used as a basis for an FDA-derived Merton (1976) jump diffusion optimised delta hedging strategy, which exhibits superior portfolio management results over traditional methods.
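The Merton (1976) jump-diffusion price referenced above is a Poisson-weighted mixture of Black-Scholes prices with jump-adjusted drift and volatility. A minimal sketch of the pricing formula; the parameters are illustrative, not calibrated to the WTI sample:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def merton_call(S, K, T, r, sigma, lam, mu_j, sig_j, n_terms=50):
    """Merton (1976) jump-diffusion call price.

    lam is the jump intensity; jump sizes are lognormal with log-mean
    mu_j and log-sd sig_j. The price is a sum over the number of jumps n,
    each term a Black-Scholes price with n-adjusted rate and volatility,
    weighted by the Poisson probability of n jumps.
    """
    k = math.exp(mu_j + 0.5 * sig_j**2) - 1.0   # expected relative jump size
    lam_p = lam * (1.0 + k)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma**2 + n * sig_j**2 / T)
        r_n = r - lam * k + n * (mu_j + 0.5 * sig_j**2) / T
        weight = math.exp(-lam_p * T) * (lam_p * T)**n / math.factorial(n)
        price += weight * bs_call(S, K, T, r_n, sigma_n)
    return price
```

With the jump intensity set to zero the mixture collapses to plain Black-Scholes, which is a useful sanity check; the delta for hedging can then be taken numerically or as the same Poisson-weighted sum of Black-Scholes deltas.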