963 results for variance and coherence


Relevance:

90.00%

Publisher:

Abstract:

The publication represents a multi-dimensional, multi-faceted, and in-depth assessment of the most significant determinants of the EU's development as a political, economic, and legal entity, in the format emerging from the Lisbon Treaty. The book is an important contribution to our understanding of the most profound issues in the recent process of EU integration, including the issue of maintaining its cohesion and coherence under the stress of the global challenges also faced by the European Union. The authors formulate worthwhile conclusions of high value not only for academics but also for political decision-makers, which gives the book a competitive edge over more theoretical and, hence, less practice-oriented works. The argumentation presented in the book is unlikely to be left without a reaction from academic and professional circles. I take it almost for granted that the overall line of argument, as well as the specific points made in its various chapters, will find adequate resonance in the high-profile discussion likely to emerge after the book is published.

Relevance:

90.00%

Publisher:

Abstract:

The dissertation examined prekindergarten teachers' perceptions of their supervisory relationship with their educational specialist, and the effect of those perceptions on the quality of the High/Scope prekindergarten program. The High/Scope educational specialists use their leader power bases (reward, coercive, legitimate, referent, expert, and informational) to influence teachers' perceptions of satisfaction and compliance, as well as teachers' actual compliance with the High/Scope prekindergarten program standards. The correlational relationships between the variables were examined using Analysis of Variance (ANOVA) and Multivariate Analysis of Variance (MANOVA). Path analysis was used to determine the validity of the correlational model. The expert, legitimate, referent, and informational power bases of the High/Scope educational specialist were found to be the most influential on the attitudinal and behavioral compliance of teachers.
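
For readers unfamiliar with the design, here is a minimal sketch of how power-base ratings might be related to satisfaction and compliance outcomes with a MANOVA in Python; the data and variable names are illustrative assumptions, not the dissertation's.

```python
# Hypothetical sketch (not the dissertation's actual analysis): relating teachers'
# ratings of a specialist's power bases to satisfaction and compliance with MANOVA.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "expert": rng.integers(1, 8, n),      # illustrative 1-7 Likert ratings
    "referent": rng.integers(1, 8, n),
    "legitimate": rng.integers(1, 8, n),
})
df["satisfaction"] = 0.5 * df["expert"] + 0.3 * df["referent"] + rng.normal(0, 1, n)
df["compliance"] = 0.4 * df["expert"] + 0.2 * df["legitimate"] + rng.normal(0, 1, n)

manova = MANOVA.from_formula(
    "satisfaction + compliance ~ expert + referent + legitimate", data=df)
print(manova.mv_test())   # Wilks' lambda, Pillai's trace, etc. per predictor
```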

Relevance:

90.00%

Publisher:

Abstract:

This dissertation examined the efficacy of family cognitive behavior treatment (FCBT) and group cognitive behavior treatment (GCBT) for reducing anxiety disorders in children and adolescents using several approaches: clinically significant change, equivalence testing, and analyses of variance. It also examined treatment specificity in terms of targeting family/parents (in FCBT) and peers/group (in GCBT) contextual variables using two main approaches: analyses of variance and structural equation modeling (SEM). The sample consisted of 143 children and their parents who presented to the Child Anxiety and Phobia Program housed within the Child and Family Psychosocial Research Center at Florida International University. Diagnostic interviews and questionnaires were administered to assess youth anxiety. Questionnaires were administered to assess child and parent views of family/parents and peers/group contextual variables. In terms of clinically significant change, results indicated that 84.6% of youth in FCBT and 71.2% of youth in GCBT no longer met diagnostic criteria for their primary/targeted anxiety disorder. In addition, results from analyses of variance indicated that FCBT and GCBT were both efficacious in reducing anxiety disorders in youth across both child and parent ratings. Results using both analyses of variance and structural equation modeling also indicated that there was no meaningful treatment specificity between FCBT and GCBT in terms of either family/parents or peers/group contextual variables. That is, child social skills improved in GCBT, in which these skills were targeted, and in FCBT, in which they were not; parenting skills improved in FCBT, in which these skills were targeted, and in GCBT, in which they were not. Clinical implications and future research recommendations are discussed.
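
To make the equivalence-testing step concrete, here is a minimal sketch of a two one-sided tests (TOST) comparison of post-treatment anxiety scores between the two conditions; the simulated data and the 5-point equivalence bounds are illustrative assumptions, not the dissertation's values.

```python
# Hypothetical sketch: two one-sided tests (TOST) for equivalence of post-treatment
# anxiety scores in FCBT vs. GCBT; data and equivalence bounds are illustrative.
import numpy as np
from statsmodels.stats.weightstats import ttost_ind

rng = np.random.default_rng(0)
fcbt = rng.normal(loc=30.0, scale=8.0, size=70)   # simulated post-treatment scores
gcbt = rng.normal(loc=31.5, scale=8.0, size=73)

# Treatments are declared equivalent if the mean difference lies within +/- 5 points.
p_overall, lower_test, upper_test = ttost_ind(fcbt, gcbt, low=-5.0, upp=5.0)
print(f"TOST p-value: {p_overall:.3f}")  # p < .05 supports equivalence within the bounds
```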

Relevance:

90.00%

Publisher:

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses support the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying, and the covariance of the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure developed in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
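
In symbols, the decomposition and the informativeness measure described above can be sketched as follows; the notation is ours, not necessarily the essays' exact specification.

```latex
% Our notation: total return variance at time t splits into a time-varying
% information component, a time-varying noise component, and their covariance,
% which is left unrestricted; informativeness is the informational share.
\begin{align}
  \sigma^2_{r,t} &= \sigma^2_{I,t} + \sigma^2_{N,t} + 2\,\operatorname{Cov}(I_t, N_t), \\
  \Pi_t &= \frac{\sigma^2_{I,t}}{\sigma^2_{r,t}}.
\end{align}
```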

Relevance:

90.00%

Publisher:

Abstract:

This dissertation consists of three separate essays on job search and labor market dynamics. In the first essay, “The Impact of Labor Market Conditions on Job Creation: Evidence from Firm Level Data”, I study how much changes in labor market conditions reduce employment fluctuations over the business cycle. Changes in labor market conditions make hiring more expensive during expansions and cheaper during recessions, creating counter-cyclical incentives for job creation. I estimate firm-level elasticities of labor demand with respect to changes in labor market conditions, considering two margins: changes in labor market tightness and changes in wages. Using employer-employee matched data from Brazil, I find that all firms are more sensitive to changes in wages than to changes in labor market tightness, and that there is substantial heterogeneity in labor demand elasticity across regions. Based on these results, I demonstrate that changes in labor market conditions reduce the variance of employment growth over the business cycle by 20% in the median region, and that this effect is driven equally by changes along each margin. Moreover, I show that the magnitude of the effect of labor market conditions on employment growth can be significantly affected by economic policy. In particular, I document that the rapid growth of the national minimum wage in Brazil in 1997-2010 amplified the impact of changes in labor market conditions during local expansions and diminished this impact during local recessions.
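
To fix ideas, here is a schematic reduced-form specification of the two margins described above, written in our own notation; the essay's actual estimating equation may differ.

```latex
% Schematic only (our notation): firm f's employment growth responding to changes
% in regional wages w_{r,t} and labor market tightness \theta_{r,t}, with
% \varepsilon_w and \varepsilon_\theta the two elasticities discussed above.
\begin{equation}
  \Delta \ln n_{f,t}
    = \varepsilon_{w}\,\Delta \ln w_{r,t}
    + \varepsilon_{\theta}\,\Delta \ln \theta_{r,t}
    + \gamma' x_{f,t} + u_{f,t}
\end{equation}
```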

In the second essay, “A Framework for Estimating Persistence of Local Labor Demand Shocks”, I propose a decomposition which allows me to study the persistence of local labor demand shocks. Persistence of labor demand shocks varies across industries, and the incidence of shocks in a region depends on the regional industrial composition. As a result, less diverse regions are more likely to experience deeper shocks, but not necessarily longer-lasting shocks. Building on this idea, I propose a decomposition of local labor demand shocks into idiosyncratic location shocks and nationwide industry shocks and estimate the variance and the persistence of these shocks using the Quarterly Census of Employment and Wages (QCEW) for 1990-2013.
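
One way to write this decomposition, in our own notation rather than the essay's exact specification:

```latex
% Our notation: local labor demand growth in region r decomposed into nationwide
% industry shocks \eta_{i,t}, weighted by the region's industry employment shares
% s_{ir}, plus an idiosyncratic location shock \mu_{r,t}.
\begin{equation}
  g_{r,t} = \sum_{i} s_{ir}\,\eta_{i,t} + \mu_{r,t}
\end{equation}
% The variance and persistence (e.g., AR(1) coefficients) of \eta_{i,t} and
% \mu_{r,t} are then estimated from the QCEW panel.
```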

In the third essay, “Conditional Choice Probability Estimation of Continuous-Time Job Search Models”, co-authored with Peter Arcidiacono and Arnaud Maurel, we propose a novel, computationally feasible method of estimating non-stationary job search models. Non-stationary job search models arise in many applications where a policy change can be anticipated by workers; the most prominent example of such a policy is the expiration of unemployment benefits. Estimating these models, however, poses a considerable computational challenge, because a differential equation must be solved numerically at each step of the optimization routine. We overcome this challenge by adapting conditional choice probability methods, widely used in the dynamic discrete choice literature, to job search models, and show how the hazard rate out of unemployment and the distribution of accepted wages, which can be estimated in many datasets, can be used to infer the value of unemployment. We demonstrate how to apply our method by analyzing the effect of unemployment benefit expiration on the duration of unemployment using data from the Survey of Income and Program Participation (SIPP) for 1996-2007.
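
For orientation, the textbook stationary continuous-time benchmark (not the paper's non-stationary estimator) already shows how these observable objects identify the value of unemployment:

```latex
% Textbook stationary benchmark, in our notation: with discount rate \rho, flow
% benefit b, offer arrival rate \lambda, and wage-offer distribution F, the value
% of unemployment U and reservation wage w^* satisfy
\begin{align}
  \rho U &= b + \lambda \int_{w^*}^{\infty} \bigl(W(w) - U\bigr)\, dF(w), \\
  h      &= \lambda \bigl(1 - F(w^*)\bigr),
\end{align}
% so the observed hazard rate h out of unemployment and the distribution of
% accepted wages (draws from F truncated at w^*) carry information about U.
```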

Relevance:

90.00%

Publisher:

Abstract:

The amount and quality of available biomass is a key factor for sustainable livestock production and agricultural management decision making. Globally, 31.5% of land cover is grassland, while 80% of Ireland's agricultural land is grassland. In Ireland, grasslands are intensively managed and provide the cheapest feed source for animals. This dissertation presents a detailed state-of-the-art review of satellite remote sensing of grasslands, and the potential application of optical (Moderate-Resolution Imaging Spectroradiometer, MODIS) and radar (TerraSAR-X) time series imagery to estimate grassland biomass at two study sites (Moorepark and Grange) in the Republic of Ireland using both statistical and state-of-the-art machine learning algorithms. High-quality weather data from the on-site weather station were also used to calculate Growing Degree Days (GDD) for Grange, to determine the impact of ancillary data on biomass estimation. In situ and satellite data covering 12 years for the Moorepark and 6 years for the Grange study sites were used to predict grassland biomass using multiple linear regression and Adaptive Neuro-Fuzzy Inference System (ANFIS) models. The results demonstrate that a dense (8-day composite) MODIS image time series, along with high-quality in situ data, can be used to retrieve grassland biomass with high performance (R2 = 0.86, p < 0.05, RMSE = 11.07 for Moorepark). The model for Grange was modified to evaluate the synergistic use of vegetation indices derived from remote sensing time series and accumulated GDD information. As GDD is strongly linked to plant development, or phenological stage, an improvement in biomass estimation would be expected. Using the ANFIS model, the biomass estimation accuracy increased from R2 = 0.76 (p < 0.05) to R2 = 0.81 (p < 0.05) and the root mean square error was reduced by 2.72%. The work on optical remote sensing was further developed using a TerraSAR-X Staring Spotlight mode time series over the Moorepark study site to explore the extent to which very high resolution Synthetic Aperture Radar (SAR) data of interferometrically coherent paddocks can be exploited to retrieve grassland biophysical parameters. After filtering out the non-coherent plots, it is demonstrated that interferometric coherence can be used to retrieve grassland biophysical parameters (i.e., height, biomass), and that it is possible to detect changes due to grass growth, grazing, and mowing events when the temporal baseline is short (11 days). However, it is not possible to uniquely identify the cause of these changes automatically based only on the SAR backscatter and coherence, due to the ambiguity caused by tall grass laid down by the wind. Overall, the work presented in this dissertation has demonstrated the potential of dense remote sensing and weather data time series to predict grassland biomass using machine-learning algorithms trained on high-quality ground data. At present, a major limitation for national-scale biomass retrieval is the lack of spatial and temporal ground samples, which can be partially resolved by minor modifications to the existing PastureBaseIreland database, namely adding the location and extent of each grassland paddock to the database.
As far as remote sensing data requirements are concerned, MODIS is useful for large-scale evaluation, but due to its coarse resolution it is not possible to detect variations within and between fields at the farm scale. However, this issue will be resolved in terms of spatial resolution by the Sentinel-2 mission, and when both satellites (Sentinel-2A and Sentinel-2B) are operational the revisit time will reduce to 5 days, which, together with Landsat-8, should provide sufficient cloud-free data for operational biomass estimation at a national scale. The Synthetic Aperture Radar Interferometry (InSAR) approach is feasible if enough coherent interferometric pairs are available; however, this is difficult to achieve due to temporal decorrelation of the signal. For repeat-pass InSAR over a vegetated area, even an 11-day temporal baseline is too long. Achieving better coherence requires very high resolution at the cost of spatial coverage, which limits its scope for use in an operational context at a national scale. Future InSAR missions with pair acquisition in tandem mode will minimize temporal decorrelation over vegetated areas for more focused studies. The proposed approach complements the current paradigm of Big Data in Earth Observation and illustrates the feasibility of integrating data from multiple sources. In future, this framework can be used to build an operational decision support system for retrieval of grassland biophysical parameters based on data from planned long-term optical missions (e.g., Landsat, Sentinel) that will ensure continuity of data acquisition. Similarly, the Spanish X-band PAZ and TerraSAR-X2 missions will ensure the continuity of TerraSAR-X and COSMO-SkyMed.
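
As a minimal illustration of the kind of model described above, the sketch below regresses simulated biomass on an 8-day NDVI composite and accumulated GDD; the data, coefficients, and variable names are ours, not the Moorepark or Grange measurements.

```python
# Hypothetical sketch: predicting grassland biomass from an 8-day NDVI composite and
# accumulated growing degree days (GDD); the data are simulated, not the study sites'.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(42)
ndvi = rng.uniform(0.3, 0.9, size=200)       # 8-day composite NDVI
gdd = rng.uniform(50, 600, size=200)         # accumulated GDD since last grazing
biomass = 2500 * ndvi + 1.5 * gdd + rng.normal(0, 150, size=200)  # synthetic kg DM/ha

X = np.column_stack([ndvi, gdd])
model = LinearRegression().fit(X, biomass)
pred = model.predict(X)
print(f"R^2 = {r2_score(biomass, pred):.2f}, "
      f"RMSE = {mean_squared_error(biomass, pred) ** 0.5:.1f}")
```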

Relevance:

90.00%

Publisher:

Abstract:

Four pedons on each of four drift sheets in the Lake Wellman area of the Darwin Mountains were sampled for chemical and microbial analyses. The four drifts, Hatherton, Britannia, Danum, and Isca, ranged from early Holocene (10 ka) to mid-Quaternary (c. 900 ka). The soil properties of weathering stage, salt stage, and depths of staining, visible salts, ghosts, and coherence increase with drift age. The landforms contain primarily high-centred polygons with windblown snow in the troughs. The soils are dominantly complexes of Typic Haplorthels and Typic Haploturbels. The soils were dry and alkaline, with low levels of organic carbon, nitrogen, and phosphorus. Electrical conductivity was high, accompanied by high levels of water-soluble anions and cations (especially calcium and sulphate in older soils). Soil microbial biomass, measured as phospholipid fatty acids, and numbers of culturable heterotrophic microbes were low, with the highest levels detected in the less developed soils of the Hatherton drift. The microbial community structure of the Hatherton soil also differed from that of the Britannia, Danum, and Isca soils. Ordination revealed that the soil microbial community structure was influenced by soil development and organic carbon.

Relevance:

90.00%

Publisher:

Abstract:

Electron beam lithography (EBL) and focused ion beam (FIB) methods were developed in house to fabricate nanocrystalline nickel micro/nanopillars so as to compare the effect of fabrication method on plastic yielding. EBL was used to fabricate 3 μm and 5 μm thick poly(methyl methacrylate) patterned substrates in which nickel pillars were grown by electroplating, with height-to-diameter aspect ratios from 2:1 to 5:1. FIB milling was used to reduce larger grown pillars to sizes similar to the EBL-grown pillars. X-ray diffraction, electron backscatter diffraction, scanning electron microscopy, and FIB imaging were used to characterize the nickel pillars. The measured grain size of the pillars was 91±23 nm, with strong <110> and weaker <111> and <110> crystallographic texture in the growth direction. Load-controlled compression tests were conducted using a MicroMaterials nano-indenter equipped with a 10 μm flat punch at constant rates from 0.0015 to 0.03 mN/s on EBL-grown pillars, and 0.0015 and 0.015 mN/s on FIB-milled pillars. The measured Young's modulus ranged from 55 to 350 GPa for all pillars, agreeing with values in the literature. EBL-grown pillars exhibited stochastic strain bursts at slow loading rates, attributed to local micro-yield events, followed by work hardening. Sharp yield points were also observed and attributed to de-bonding of the gold seed layer between the nickel pillar and the substrate, due to the shear stress associated with end effects arising from the substrate constraint. The onset of yield ranged from 108 to 1800 MPa, which is greater than bulk nickel but within values given in the literature. FIB-milled pillars demonstrated stochastic yield behaviour at all loading rates tested, yielding between 320 and 625 MPa. Deformation was apparent at the tops of the FIB-milled pillars, where the smallest cross-sectional area was measured, yet these pillars still exhibited yield strengths superior to bulk nickel. The gallium damage at the outer surface of the pillars likely aids dislocation nucleation and plasticity, leading to lower yield strengths than for the EBL pillars. Thermal drift, substrate effects, and noise due to vibrations within the indenter system contributed to variance and inconsistency in the data.
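
For illustration only, the sketch below shows how engineering stress, strain, and an apparent Young's modulus might be extracted from flat-punch load-displacement data; the geometry and the synthetic elastic response are assumptions, not the thesis measurements.

```python
# Hypothetical sketch: converting flat-punch load-displacement data for a micropillar
# into engineering stress/strain and an apparent Young's modulus. Values are illustrative.
import numpy as np

diameter = 2.0e-6        # pillar diameter (m), e.g. a 2 um pillar
height = 6.0e-6          # pillar height (m), 3:1 aspect ratio
area = np.pi * (diameter / 2) ** 2

load = np.linspace(0.0, 1.0e-3, 200)              # applied load (N)
displacement = load * height / (area * 180e9)     # synthetic elastic response (~180 GPa)

stress = load / area                              # engineering stress (Pa)
strain = displacement / height                    # engineering strain (-)

# Apparent modulus from a linear fit to the loading segment (substrate compliance,
# drift, and end effects would bias this in real data, as noted above).
modulus = np.polyfit(strain, stress, 1)[0]
print(f"Apparent E = {modulus / 1e9:.0f} GPa")
```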

Relevance:

90.00%

Publisher:

Abstract:

The domestication of plants and animals marks one of the most significant transitions in human, and indeed global, history. Traditionally, study of the domestication process was the exclusive domain of archaeologists and agricultural scientists; today it is an increasingly multidisciplinary enterprise that has come to involve the skills of evolutionary biologists and geneticists. Although the application of new information sources and methodologies has dramatically transformed our ability to study and understand domestication, it has also generated increasingly large and complex datasets, the interpretation of which is not straightforward. In particular, challenges of equifinality, evolutionary variance, and emergence of unexpected or counter-intuitive patterns all face researchers attempting to infer past processes directly from patterns in data. We argue that explicit modeling approaches, drawing upon emerging methodologies in statistics and population genetics, provide a powerful means of addressing these limitations. Modeling also offers an approach to analyzing datasets that avoids conclusions steered by implicit biases, and makes possible the formal integration of different data types. Here we outline some of the modeling approaches most relevant to current problems in domestication research, and demonstrate the ways in which simulation modeling is beginning to reshape our understanding of the domestication process.

Relevance:

90.00%

Publisher:

Abstract:

A sufficiently complex set of molecules, if subject to perturbation, will self-organise and show emergent behaviour. If such a system can take on information, it will become subject to natural selection. This could explain how self-replicating molecules evolved into life and how intelligence arose. A pivotal step in this evolutionary process was of course the emergence of the eukaryote and the advent of the mitochondrion, which both enhanced energy production per cell and increased the ability to process, store and utilise information. Recent research suggests that from its inception life embraced quantum effects such as “tunnelling” and “coherence”, while competition and stressful conditions provided a constant driver for natural selection. We believe that the biphasic adaptive response to stress described by hormesis, a process that captures information to enable adaptability, is central to this whole process. Critically, hormesis could improve mitochondrial quantum efficiency, improving the ATP/ROS ratio, while inflammation, which is tightly associated with the ageing process, might do the opposite. This all suggests that to achieve optimal health and healthy ageing, one has to stress the system sufficiently to ensure peak mitochondrial function, which itself could reflect selection of optimum efficiency at the quantum level.

Relevance:

90.00%

Publisher:

Abstract:

Deeply conflicting views on the political situation of Judaea under the Roman prefects (6-41 CE) have been offered. According to some scholars, this was a period of persistent political unrest and agitation, whilst according to a widespread view it was a quiescent period of political calm (reflected in Tacitus’ phrase sub Tiberio quies). The present article critically re-examines the main available sources (particularly Josephus, the canonical Gospels, and Tacitus) in order to offer a more reliable historical reconstruction. The conclusions drawn from this survey call into question some widespread and insufficiently nuanced views on the period. This, in turn, allows a reflection on the non-epistemic factors which might help explain the origin of such views.

Relevance:

90.00%

Publisher:

Abstract:

An increasing number of people with terminal cancer are being cared for at home, often by their partner. This study explores the identity, experiences and relationships of people caring for their partner at the end of life and how they construct their experience through personal and couple narratives. It draws upon dialogical approaches to narrative analysis to focus on caring partners and the care relationship. Six participants were recruited for the study. Two methods of data collection are used: narrative interviews and journals. Following individual case analysis, two methods of cross-narrative analysis are used: an analysis of narrative themes and an identification of narrative types. The key findings can be summarised as follows. First, in the period since their partner's terminal prognosis, participants sustained and reconstructed self and couple relationship narratives. These narratives aided the construction of meaning and coherence at a time of major biographical disruption: the anticipated loss of a partner. Second, the study highlights the complexity of spoken and unspoken narratives in terminal cancer and how these relate to individual and couple identities. Third, a typology of archetypal narratives based upon the data is identified. The blow-by-blow narratives illustrate how participants sought to construct coherence and meaning in the illness story, while champion and resilience narratives demonstrate how participants utilised positive self and relational narratives to manage a time of biographical disruption. The study highlights how this narrative approach can enhance understanding of the experiences and identities of people caring for a terminally ill partner.

Relevance:

90.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

90.00%

Publisher:

Abstract:

Cancer and cardiovascular diseases are the leading causes of death worldwide. Caused by systemic genetic and molecular disruptions in cells, these disorders are the manifestation of a profound disturbance of normal cellular homeostasis. People suffering from, or at high risk for, these disorders need early diagnosis and personalized therapeutic intervention. Successful implementation of such clinical measures can significantly improve global health. However, development of effective therapies is hindered by the challenge of identifying the genetic and molecular determinants of disease onset; and where therapies already exist, the main challenge is to identify the molecular determinants that drive resistance to them. Due to progress in sequencing technologies, access to large genome-wide biological datasets now extends far beyond a few experimental labs to the global research community. This unprecedented availability of data has revolutionized the capabilities of computational researchers, enabling them to collaboratively address long-standing problems from many different perspectives. This thesis tackles these two public health challenges using data-driven approaches. Numerous association studies have been proposed to identify genomic variants that determine disease. However, their clinical utility remains limited due to their inability to distinguish causal variants from associated variants. In this thesis, we first propose a simple scheme that improves association studies in a supervised fashion and demonstrate its applicability to identifying genomic regulatory variants associated with hypertension. Next, we propose a coupled Bayesian regression approach -- eQTeL -- which leverages epigenetic data to estimate regulatory and gene interaction potential, and identifies combinations of regulatory genomic variants that explain gene expression variance. On human heart data, eQTeL not only explains a significantly greater proportion of expression variance in samples, but also predicts gene expression more accurately than other methods. Using simulations, we demonstrate that eQTeL accurately detects causal regulatory SNPs, particularly those with small effect sizes. Using various functional data, we show that SNPs detected by eQTeL are enriched for allele-specific protein binding and histone modifications, which potentially disrupt binding of core cardiac transcription factors and are spatially proximal to their targets. eQTeL SNPs capture a substantial proportion of the genetic determinants of expression variance, and we estimate that 58% of these SNPs are putatively causal. The challenge of identifying the molecular determinants of cancer resistance has so far been addressed only through labor-intensive and costly experimental studies, and in the case of experimental drugs such studies are infeasible. Here we take a fundamentally different, data-driven approach to understand the evolving landscape of emerging resistance. We introduce a novel class of genetic interactions in cancer termed synthetic rescues (SR): a functional interaction between two genes in which a change in the activity of one, vulnerable gene (which may be the target of a cancer drug) is lethal, but subsequently altered activity of its partner, rescuer gene restores cell viability. Next, we describe a comprehensive computational framework -- termed INCISOR -- for identifying SRs underlying cancer resistance.
Applying INCISOR to mine The Cancer Genome Atlas (TCGA), a large collection of cancer patient data, we identified the first pan-cancer SR networks, composed of interactions common to many cancer types. We experimentally test and validate a subset of these interactions involving the master regulator gene mTOR. We find that rescuer genes become increasingly activated as breast cancer progresses, testifying to pervasive ongoing rescue processes. We show that SRs can be utilized to predict patients' survival and response to the majority of current cancer drugs and, importantly, to predict the emergence of drug resistance from the initial tumor biopsy. Our analysis suggests a potential new strategy for enhancing the effectiveness of existing cancer therapies by targeting their rescuer genes to counteract resistance. The thesis provides statistical frameworks that can harness ever-increasing high-throughput genomic data to address challenges in determining the molecular underpinnings of hypertension, cardiovascular disease, and cancer resistance. We uncover novel mechanistic insights that will advance progress in early disease prevention and personalized therapeutics. Our analyses shed light on the fundamental biology of gene regulation and interaction, and open up exciting avenues for translational applications in risk prediction and therapeutics.
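
As a much-simplified illustration of the "expression variance explained" idea (not the eQTeL model itself), the sketch below fits a linear model of expression on a handful of simulated SNP dosages and reports the explained variance; all data and names are hypothetical.

```python
# Hypothetical illustration (not eQTeL): proportion of gene expression variance
# explained by a small set of regulatory SNPs, using simulated genotype dosages.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_samples, n_snps = 300, 5
genotypes = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)  # 0/1/2 dosages
effects = np.array([0.8, 0.0, 0.5, 0.0, 0.3])
expression = genotypes @ effects + rng.normal(0, 1.0, size=n_samples)

model = LinearRegression().fit(genotypes, expression)
explained = model.score(genotypes, expression)   # R^2: share of expression variance
print(f"Expression variance explained by the SNP set: {explained:.2f}")
```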