809 results for Change-over Designs


Relevance:

80.00%

Publisher:

Abstract:

Despite its large impact on the individual and society, we currently have only a rudimentary understanding of the biological basis of Major Depressive Disorder, even less so in adolescent populations. This thesis focuses on two research questions. First, how do adolescents with depression differ from adolescents who have never been depressed in (1a) brain morphology and (1b) DNA methylation? We studied differences in the fronto-limbic system (a collection of areas responsible for emotion regulation) and methylation at the serotonin transporter (SLC6A4) and FK506 binding protein (FKBP5) genes (two genes strongly linked to stress regulation and depression). Second, how does childhood trauma, which is known to increase risk for depression, affect (2a) brain development and (2b) SLC6A4 and FKBP5 methylation? Further, (2c) how might DNA methylation explain how trauma affects brain development in depression? We studied these questions in 24 depressed adolescent patients and 21 controls. We found that (1a) depressed adolescents had decreased left precuneus volume and greater left precentral gyrus volume compared to controls; however, no differences in fronto-limbic morphology were identified. Moreover, (1b) individuals with depression had lower levels of FKBP5 methylation than controls. In line with our second hypothesis, (2a) greater levels of trauma were associated with decreased volume of a number of fronto-limbic regions. Further, we found that (2b) greater trauma was associated with decreased SLC6A4, but not FKBP5, methylation. Finally, (2c) greater FKBP5, but not SLC6A4, methylation was associated with decreased volume of a number of fronto-limbic regions. The results of this study suggest an association among trauma, DNA methylation and brain development in youth, but the direction of these relationships appears to be inconsistent. Future studies using a longitudinal design will be necessary to clarify these results and help us understand how the brain and epigenome change over time in depressed youth.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Follow-up care aims to provide surveillance with early detection of recurring cancers and to address treatment complications and other health issues in survivorship. It is assumed that follow-up care fulfills these aims; however, little evidence supports routine surveillance detecting curable disease early enough to improve survival. Cancer survivors are a diverse patient population, suggesting that a single follow-up regimen may not meet all patients’ follow-up needs. Little is known about what effective follow-up care should include for head and neck cancer patients in a Canadian setting. Identification of subgroups of patients with specific needs, and of current practices, would allow hypotheses to be generated for enhancing follow-up care. OBJECTIVES: 1a) To describe the follow-up needs and preferences of head and neck cancer patients, 1b) to identify which patient characteristics predict needs and preferences, 1c) to evaluate how needs and preferences change over time, 2a) to describe follow-up care practices by physician visits and imaging tests, and 2b) to identify factors associated with the delivered follow-up care. METHODS: 1) 175 patients who completed treatment between 2012 and 2013 in Kingston and London, Ontario were recruited to participate in a prospective survey study on patients’ needs and preferences in follow-up care. Bivariate and multivariate analyses were employed to describe patient survey responses and to identify patient characteristics that predicted needs and preferences. 2) A retrospective cohort study of 3975 patients on routine follow-up from 2007 to 2015 was carried out using data linkages across registry and administrative databases to describe follow-up practices in Ontario by visits and tests. Multivariate regression analyses assessed factors related to follow-up care. RESULTS: 1) Patients’ needs and preferences were wide-ranging, with several characteristics predicting needs and preferences (OR for ECOG status = 2.69; OR for anxiety = 1.13). Needs and preferences declined as patients transitioned into their second year of follow-up (p < 0.05). 2) Wide variation in practices was found, with marked differences compared to existing consensus guidelines. Multiple factors were associated with follow-up practices (RR for tumor site = 0.73; RR for LHIN = 1.47). CONCLUSIONS: Patient characteristics can be used to personalize care, and current guidelines are not informing practice. Future research should evaluate individualized approaches to follow-up care.

Relevance:

80.00%

Publisher:

Abstract:

Combining intrinsically conducting polymers with carbon nanotubes (CNT) helps in creating composites with superior electrical and thermal characteristics. These composites are capable of replacing metals and semiconductors as they possess a unique combination of electrical conductivity, flexibility, stretchability, softness and bio-compatibility. Their potential for use in various organic devices such as supercapacitors, printable conductors, optoelectronic devices, sensors, actuators, electrochemical devices, electromagnetic interference shielding, field effect transistors, LEDs, thermoelectrics etc. makes them excellent substitutes for present-day semiconductors. However, many of these potential applications have not been fully exploited because of various open-ended challenges. Composites meant for use in organic devices require highly stable conductivity for the longevity of the devices; CNT, when incorporated in specific proportions and by special methods, contributes quite positively to this end. The increasing demand for energy and depleting fossil fuel reserves have broadened the scope for research into alternative energy sources. A unique and efficient method for harnessing energy is thermoelectric energy conversion, in which heat is converted directly into electricity using a class of materials known as thermoelectric materials. Though polymers have low electrical conductivity and thermopower, their low thermal conductivity favours use as thermoelectric materials. The thermally disconnected, but electrically connected, carrier pathways in CNT/polymer composites can satisfy the so-called “phonon-glass/electron-crystal” property required for thermoelectric materials. Strain sensing is commonly used for monitoring in engineering, medicine, and space or ocean research. Polymeric composites are ideal candidates for the manufacture of strain sensors, and conducting elastomeric composites containing CNT are widely used for this application. These CNT/polymer composites offer resistance change over a large strain range due to their low Young’s modulus and high elasticity, and they are also capable of covering surfaces with arbitrary curvatures. Due to the high operating frequency and bandwidth of electronic equipment, electromagnetic interference (EMI) has attained the tag of an “environmental pollutant”, affecting other electronic devices as well as living organisms. Among EMI shielding materials, polymer composites based on carbon nanotubes show great promise: the high strength and stiffness, extremely high aspect ratio, and good electrical conductivity of CNT make it a filler of choice for shielding applications. A method for better dispersion, orientation and connectivity of the CNT in the polymer matrix is required to enhance conductivity and EMI shielding. This thesis presents a detailed study of the synthesis of functionalised multiwalled carbon nanotube/polyaniline composites and their application in electronic devices. The major areas of focus include DC conductivity retention at high temperature; thermoelectric, strain sensing and electromagnetic interference shielding properties; and thermogravimetric, dynamic mechanical and tensile analysis, in addition to structural and morphological studies.
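For context on the thermoelectric criterion mentioned above: the “phonon-glass/electron-crystal” requirement reflects the standard dimensionless figure of merit, a textbook relation not stated in the abstract itself, in which low thermal conductivity and high electrical conductivity pull in the same direction:

```latex
% Dimensionless thermoelectric figure of merit (textbook relation):
%   S      - Seebeck coefficient (thermopower)
%   \sigma - electrical conductivity
%   \kappa - thermal conductivity (lattice + electronic contributions)
%   T      - absolute temperature
ZT = \frac{S^{2}\,\sigma\,T}{\kappa}
```

A composite that conducts charge along the nanotube network while scattering phonons at tube-matrix interfaces raises the electrical conductivity without raising the thermal conductivity, which is exactly what increases ZT.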

Relevance:

80.00%

Publisher:

Abstract:

The need to steer economic development has always been great, and as a management model the balanced scorecard has been popular since the mid-1990s, mainly in the private sector but also in the municipal sector. The balanced scorecard was introduced primarily to help organizations see more than the economic dimension. Originally a measurement system, it functions today more as a strategic instrument. Our study is a case study evaluating a municipality and how it uses the balanced scorecard as a tool for strategic and value-adding work in municipal activities. In municipal operations it is important that the organization adapts the balanced scorecard to the fact that it is a politically driven organization, with mandates, committees and administrations. We used a qualitative method with a deductive approach, gathering information through a case study in which we interviewed seven people in leading positions. In our analysis and results we came to the conclusion that the municipality does not use the balanced scorecard correctly, and that, as used there, the balanced scorecard does not work favorably as a tool for value creation and strategic planning. We see difficulties with the implementation of the balanced scorecard: had the municipality invested in implementing it at all levels of the business, it would have been able to use it more adequately. Because the municipality is a politically driven organization, it is important that the vision stays alive and changes with the conditions of the outside world and the municipality in general. Judged by its vivid vision, goals and business ideas, its balanced scorecard is in line with how a balanced scorecard should look. The municipality has a strategic plan for staff and employees at large, but we have seen that the plan is not followed up in a way that benefits the business; the municipality chooses the easy way out for evaluation. Employee participation in changes and ongoing human resources management appears nonexistent, even though the vision has been to create empowered and motivated employees. In our conclusion we describe how we view the use of the balanced scorecard in municipal operations. We can also discern that a balanced scorecard is a good tool for value creation and strategic work if it is used properly. We have concluded that the municipality we chose to study should not use the balanced scorecard, as it has not created the tools and platforms required for employees, civil servants and politicians to evaluate, monitor and keep alive a scorecard that changes over time. The study reveals major shortcomings in implementation, evaluation and follow-up, and the consequence is that the balanced scorecard is not preferable in municipal operations as a strategic instrument for value creation and long-term planning.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this research was to develop a methodology for transforming and dynamically segmenting data. Dynamic segmentation enables transportation system attributes and associated data to be stored in separate tables and merged when a specific query requires a particular set of data to be considered. A major benefit of dynamic segmentation is that individual tables can be more easily updated when attributes, performance characteristics, or usage patterns change over time. Applications of a progressive geographic database referencing system in transportation planning are vast. Summaries of system condition and performance can be made, and analyses of specific portions of a road system are facilitated.
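To make the mechanism concrete, here is a minimal sketch (not code from the research itself; the table contents, field names and mileposts are invented) of dynamic segmentation: two attribute tables are maintained separately and merged into homogeneous segments only when a query needs both attributes together:

```python
# Minimal dynamic-segmentation sketch: attribute tables are kept separate
# and merged on demand into homogeneous segments. All data are invented.

def dynamic_segment(*tables):
    """Overlay attribute tables of (from_mp, to_mp, {attrs}) records
    into segments that are homogeneous in every attribute."""
    # Collect every milepost at which any attribute changes.
    breaks = sorted({mp for table in tables for rec in table for mp in rec[:2]})
    segments = []
    for start, end in zip(breaks, breaks[1:]):
        attrs = {}
        for table in tables:
            for frm, to, values in table:
                if frm <= start and end <= to:   # record covers this piece
                    attrs.update(values)
        segments.append((start, end, attrs))
    return segments

# Pavement condition and traffic volume maintained as independent tables;
# each can be updated on its own schedule as conditions change over time.
pavement = [(0.0, 4.0, {"surface": "asphalt"}), (4.0, 9.0, {"surface": "concrete"})]
traffic  = [(0.0, 6.5, {"aadt": 12000}),        (6.5, 9.0, {"aadt": 4500})]

for frm, to, attrs in dynamic_segment(pavement, traffic):
    print(f"mp {frm:4.1f}-{to:4.1f}: {attrs}")
# -> three segments (0.0-4.0, 4.0-6.5, 6.5-9.0), each carrying both attributes
```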

Relevance:

80.00%

Publisher:

Abstract:

This article shows how the so-called climate theories, which have been developed since Antiquity, changed over the course of time and influenced the different theories on the origin of language. Via Montesquieu and Rousseau, the “climate theories” influenced Johann Gottfried Herder, whose work builds on the Romantic concept of the Volk. In this way, many ideas came into being that were fundamental to the foundation and development of the national philologies in Europe.

Relevance:

80.00%

Publisher:

Abstract:

We currently live in an era in which advertising surrounds us in many forms and in which companies strive ever harder to make the message they want to convey effective. The use of conventional methods, such as television, radio or even billboards, is becoming ineffective. In very little time, over the last twenty years, the Internet has changed our way of living, even being compared to the Renaissance and the Industrial Revolution. The most recent generations were born surrounded by this advertising “boom”, which has made them immune to it. To get around this problem, Levinson came forward in 1989 with a way to minimize this effect while at the same time enabling small companies to compete with larger ones (Levinson, 2007). Guerrilla marketing is thus characterized as being normally associated with low-cost implementations, which are sometimes unrepeatable, as they achieve a significant “wow” impact on the general public (Oliveira & Ferreira, 2013). The present study contributes to the existing guerrilla marketing literature by compiling the development of the topic up to the present day. To understand which factors influence the use of guerrilla marketing by Portuguese companies, 140 companies from across the country were surveyed through a questionnaire based on the study developed by Overbeek (2012). Through this exploratory investigation, in an area still little explored in Portugal to date, especially at the academic level, “it was found that there is great demand for this type of non-conventional tool: 86.4% of the sample had already witnessed a guerrilla action, yet only 36.4% admitted to having implemented one in their company, which raises the question of why the rate of use of this type of non-conventional approach is so low” (Almeida & Au-Yong-Oliveira, 2015, p.1). The explanation may be linked to the strong uncertainty avoidance that exists in Portugal (Hofstede, 2001) and to the fear of change and of trying new products in Portugal (Steenkamp et al., 1999), factors that will not change for decades, given the time it takes to change national cultures (Hofstede, 2001). It is also notable that, within the sample of 140 companies, people with degrees (at the bachelor's and master's level) in Marketing (18.7% of the sample), Design (15.7%), Management (10.4%) and Information and Communication Technologies (7.9%) stand out. One may conclude that these are the four fundamental areas, or at least that there is currently a need for knowledge in these four areas. Given the [small] size of the companies, an employee who has these four competencies has a competitive advantage over the others in terms of hard skills.

Relevance:

80.00%

Publisher:

Abstract:

As the complexity of parallel applications increases, the performance limitations resulting from computational load imbalance become dominant. Mapping the problem space to the processors of a parallel machine in a manner that balances the workload of each processor will typically reduce the run-time. In many cases the computation time required for a given calculation cannot be predicted, even at run-time, and so a static partition of the problem yields poor performance. For problems in which the computational load across the discretisation is dynamic and inhomogeneous, for example multi-physics problems involving fluid and solid mechanics with phase changes, the workload for a static subdomain will change over the course of a computation and cannot be estimated beforehand. For such applications the mapping of loads to processors must change dynamically, at run-time, in order to maintain reasonable efficiency. The issues of dynamic load balancing are examined in the context of PHYSICA, a three-dimensional unstructured-mesh multi-physics continuum mechanics computational modelling code.
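The following toy sketch (not PHYSICA code; the function names, costs and tolerance are invented for illustration) shows the basic dynamic load-balancing loop the abstract describes: measure per-process load as the computation evolves, and when the imbalance drifts past a tolerance, migrate work items from the heaviest to the lightest process:

```python
# Toy dynamic load balancer: re-map work items between processes whenever
# the measured imbalance exceeds a tolerance. All costs are invented.
import random

random.seed(0)

def imbalance(loads):
    """Ratio of max to mean per-process load (1.0 = perfectly balanced)."""
    mean = sum(loads) / len(loads)
    return max(loads) / mean if mean > 0 else 1.0

def rebalance(partitions, costs, tol=1.10, max_moves=100):
    """Greedily move items from the heaviest to the lightest partition
    until the imbalance ratio falls below tol (or no move helps)."""
    loads = [sum(costs[i] for i in part) for part in partitions]
    for _ in range(max_moves):
        if imbalance(loads) <= tol:
            break
        src = loads.index(max(loads))
        dst = loads.index(min(loads))
        gap = loads[src] - loads[dst]
        # pick the item whose transfer best evens the two extreme loads
        item = min(partitions[src], key=lambda i: abs(gap - 2 * costs[i]))
        if costs[item] >= gap:
            break  # no single move improves balance; stop
        partitions[src].remove(item)
        partitions[dst].append(item)
        loads[src] -= costs[item]
        loads[dst] += costs[item]
    return partitions

# 40 mesh cells over 4 processes; per-cell cost drifts every "time step"
# (e.g. a phase change makes some cells more expensive to compute).
parts = [list(range(p, 40, 4)) for p in range(4)]
costs = {i: 1.0 for i in range(40)}
for step in range(3):
    for i in costs:                      # workload evolves during the run
        costs[i] = max(0.1, costs[i] + random.uniform(-0.5, 0.5))
    parts = rebalance(parts, costs)
    print(step, [round(sum(costs[i] for i in p), 1) for p in parts])
```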

Relevance:

80.00%

Publisher:

Abstract:

Patterns of cognitive change over micro-longitudinal timescales (i.e., ranging from hours to days) are associated with a wide range of age-related health and functional outcomes. However, practical issues with conducting high-frequency assessments make investigations of micro-longitudinal cognition costly and burdensome to run. One way of addressing this is to develop cognitive assessments that can be performed by older adults, in their own homes, without a researcher being present. Here, we address the question of whether reliable and valid cognitive data can be collected over micro-longitudinal timescales using unsupervised cognitive tests.

In study 1, 48 older adults completed two touchscreen cognitive tests, on three occasions, in controlled conditions, alongside a battery of standard tests of cognitive functions. In study 2, 40 older adults completed the same two computerized tasks on multiple occasions, over three separate week-long periods, in their own homes, without a researcher present. Here, the tasks were incorporated into a wider touchscreen system (Novel Assessment of Nutrition and Ageing (NANA)) developed to assess multiple domains of health and behavior. Standard tests of cognitive function were also administered prior to participants using the NANA system.

Performance on the two “NANA” cognitive tasks showed convergent validity with, and similar levels of reliability to, the standard cognitive battery in both studies. Completion and accuracy rates were also very high. These results show that reliable and valid cognitive data can be collected from older adults using unsupervised computerized tests, thus affording new opportunities for the investigation of cognitive function.
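As a hedged illustration of the psychometric checks the abstract reports (this is not the authors' analysis code, and all scores below are fabricated stand-ins), both test-retest reliability and convergent validity can be estimated with simple correlations:

```python
# Sketch: test-retest reliability and convergent validity via Pearson r.
# All scores are invented stand-ins for repeated unsupervised sessions.
from statistics import correlation  # available in Python 3.10+

session1 = [12, 15, 9, 14, 11, 16, 10, 13]   # task score, occasion 1
session2 = [11, 16, 10, 13, 12, 15, 9, 14]   # same task, occasion 2
standard = [25, 30, 20, 28, 24, 31, 21, 27]  # supervised standard test

# Test-retest reliability: agreement of the task with itself over occasions.
print("reliability r =", round(correlation(session1, session2), 2))

# Convergent validity: agreement with an established supervised measure
# of the same construct.
print("validity r =", round(correlation(session1, standard), 2))
```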

Relevance:

80.00%

Publisher:

Abstract:

Several environmental stressors can impact the physiology and survival of fishes. Fish experience natural fluctuations in temperature and dissolved oxygen, but variations in these parameters due to anthropogenic sources are typically greater in magnitude and duration. Changes in temperature and oxygen of anthropogenic origin may therefore have larger negative impacts on fish than those occurring during natural events. Physiological parameters are sensitive indicators of the impacts of stressors, providing insight into the manner in which fish are disturbed by a stressor. Fish may display cumulative physiological responses to successive stressors, but the concept of synergy among multiple thermal stressors is poorly understood. Further, some fish species can be subjected to competitive angling events, which expose fish to an array of additional stressors that can increase mortality. The impacts of these events may change over seasons as fish display seasonal changes in behavior and physiology. Latitudinal origin may also affect the physiological response and mortality of fish exposed to common environmental stressors, as individual populations are adapted to local environmental conditions. This thesis addresses these potential impacts on physiological parameters and mortality of largemouth bass (Micropterus salmoides) and provides implications for management and conservation. Largemouth bass were relatively robust to abrupt changes in temperature and oxygen, but were perturbed from physiological homeostasis during large (12°C) temperature shocks and low (< 4 mg O2/L) levels of dissolved oxygen. Cumulative physiological impacts of multiple cold shocks were only slightly greater than the disturbances sustained during a single cold shock, suggesting largemouth bass are able to tolerate successive thermal stressors. Largemouth bass exhibited seasonal changes in physiological parameters, but the responses of fish to angling tournaments were relatively similar across seasons when compared with seasonal controls. Mortality was low during angling tournaments held during four seasons, and no apparent seasonal trends were observed. Lastly, largemouth bass from two latitudinally separated populations exhibited differences in their physiological responses to acute cold stressors and in overwinter mortality, characterized by greater mortality and physiological disturbances in southern fish than in northern fish. Knowledge gained from this study can be used to make management and conservation decisions regarding a host of environmental factors and provides insight into the mechanisms by which fish species can persist over large latitudinal ranges.

Relevance:

80.00%

Publisher:

Abstract:

This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and the applications of state-of-the-art Bayesian econometrics. Except for Chapter 1 and Chapter 5, which set out the general introduction and conclusion, each of the chapters can be considered as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better since it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements, and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as the uncertainty in learning speed and model restrictions. The empirical evidence shows that apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
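For readers unfamiliar with the yield-curve model named above, the sketch below evaluates the standard Nelson-Siegel factor loadings. It is a textbook formulation (with the decay value commonly used in the Diebold-Li literature), not the chapter's DMA implementation, and the factor values are invented:

```python
# Nelson-Siegel yield curve: y(tau) = L + S*f1(tau) + C*f2(tau), where
# L, S, C are the level, slope and curvature factors and lam sets the
# exponential decay of the loadings.
import math

def nelson_siegel(tau, level, slope, curve, lam=0.0609):
    """Yield for maturity tau (in months) under Nelson-Siegel loadings."""
    x = lam * tau
    f1 = (1 - math.exp(-x)) / x          # slope loading
    f2 = f1 - math.exp(-x)               # curvature loading
    return level + slope * f1 + curve * f2

# Invented factor values; in the dynamic (time-varying) version these
# would be re-estimated each period and averaged across models (DMA).
for tau in (3, 12, 60, 120):
    print(f"{tau:4d}m  {nelson_siegel(tau, 4.0, -2.0, 1.5):.2f}%")
```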

Relevance:

80.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike the existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals - for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onwards, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time-variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond 1-month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in coefficient estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the 1-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time-variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supplies and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive strong support.
The chapter also introduces the random walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
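As a generic illustration of the sampler named in that last sentence, here is a minimal textbook random-walk Metropolis-Hastings routine; it is not the thesis implementation, and the target density is an invented stand-in (in the thesis it would be the MIDAS regression posterior):

```python
# Random-walk Metropolis-Hastings: propose theta' = theta + N(0, step^2),
# accept with probability min(1, p(theta')/p(theta)). Target is invented.
import math
import random

random.seed(1)

def log_target(theta):
    """Invented example target: standard normal log-density (up to a constant)."""
    return -0.5 * theta * theta

def rw_metropolis(n_draws, theta0=0.0, step=1.0):
    draws, theta = [], theta0
    lp = log_target(theta)
    for _ in range(n_draws):
        prop = theta + random.gauss(0.0, step)        # random-walk proposal
        lp_prop = log_target(prop)
        if math.log(random.random()) < lp_prop - lp:  # MH accept/reject
            theta, lp = prop, lp_prop
        draws.append(theta)                           # keep current state
    return draws

draws = rw_metropolis(5000)
print("posterior mean ~", round(sum(draws) / len(draws), 2))  # ~0.0
```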

Relevance:

80.00%

Publisher:

Abstract:

Since the identification, in the early 1990s, that mutations in NOTCH3 are responsible for cerebral autosomal dominant arteriopathy with subcortical infarcts and leucoencephalopathy (CADASIL), there has been extensive characterisation of the clinical and radiological features of the disease. However, therapeutic interventions remain elusive, partly due to a limited understanding of the vascular pathophysiology and how it leads to the development of strokes, cognitive decline and disability. The apparent rarity and heterogeneous natural history of CADASIL potentially make conducting any longitudinal or therapeutic trials difficult. The role of disease biomarkers is therefore of some interest. This thesis focuses on vascular function in CADASIL and how it may relate to clinical and radiological markers of disease. Establishing the prevalence of CADASIL in the West of Scotland was important to assess the impact of the disease and how feasible a trial would be. A mutation prevalence of 10.7 per 100,000 was demonstrated, suggesting significant underdiagnosis of the disease across much of Scotland. Cerebral hypoperfusion is thought to be important in CADASIL, and it has been shown that vascular abnormalities precede the development of brain pathology in mouse models. Investigation of vascular function in patients, both in the brain and systemically, requires less invasive measures. Arterial spin labelling magnetic resonance imaging (MRI) and transcranial Doppler ultrasound (TCD) can both be used to obtain non-invasive and quantifiable indices of vascular function. Monitoring patients with MRI whilst they receive different concentrations of inspired oxygen and carbon dioxide can provide information on brain function, and I reviewed the practicalities of this technique in order to guide the design of the studies in this thesis. Twenty-two CADASIL patients were recruited to a longitudinal study. Testing included peripheral vascular assessment and assessment of disability, neurological dysfunction, mood and cognition. A CO2 reactivity challenge was performed during both TCD and arterial spin labelling MRI, and detailed MRI sequences were obtained. I was able to demonstrate that vasoreactivity was associated with the number of lacunes and with brain atrophy, as were carotid intima-media thickness, vessel stiffness, and age. Patients with greater disability, higher depressive symptoms and poorer processing speed showed a tendency towards worse cerebral vasoreactivity, but numbers were small. This observation suggests vasoreactivity may have potential as a therapeutic target, or a biomarker. I then wished to establish whether arterial spin labelling MRI was useful for assessing change in cerebral blood flow in CADASIL patients. Cortical grey matter showed the highest blood flow, mean (SD) 55 (10) ml/100g/min, and blood flow was significantly lower within hyperintensities (19 (4) ml/100g/min; p < 0.001). Over one year, blood flow in both grey matter (mean -7 (10)%; p = 0.028) and deep white matter (-8 (13)%; p = 0.036) declined significantly. Cerebrovascular reactivity did not change over one year. I then investigated whether baseline vascular markers were able to predict change in radiological or neuropsychological measures of disease. Changes in brain volume, lacunes, microbleeds and normalised subcortical hyperintensity volume (an increase of 0.8%) were shown over one year. Baseline vascular parameters were not able to predict these changes, or those in neuropsychological testing.
NOTCH3 is found throughout the body, and a systemic vasculopathy, particularly affecting resistance vessels, has been observed. Gluteal biopsies were obtained from 20 CADASIL patients, and ex vivo myography was used to investigate the response to vasoactive agents. Evidence of impairment in both vasodilation and vasoconstriction was shown. The addition of antioxidants improved endothelium-dependent relaxation, indicating a role for oxidative stress in CADASIL pathology. Myography measures were not related to in vivo measures in the sub-group of patients who had taken part in both studies. The small vessels affected in CADASIL cannot be imaged by conventional MR imaging, so I aimed to establish which vessels might be responsible for lacunes by overlaying a microangiographic template onto brain images registered to a standard brain template. This showed that most lacunes are small and associated with tertiary arterioles. On the basis of this thesis, it is concluded that vascular dysfunction plays an important role in the pathophysiology of CADASIL, and further assessment of vascular measures in longitudinal studies is needed. Arterial spin labelling MRI should be used, as it is a reliable, non-invasive modality that can measure change over one year. Furthermore, conventional cardiovascular risk factor prevention should be undertaken in CADASIL patients to delay the deleterious effects of the disease.