891 results for Wilcoxon two-sample test
Abstract:
The current study is a longitudinal investigation into changes in the division of household labour across transitions to marriage and parenthood in the UK. Previous research has noted a more traditional division of household labour, with women performing the majority of housework, amongst spouses and couples with children. However, the bulk of this work has been cross-sectional in nature. The few longitudinal studies that have been carried out have been rather ambiguous about the effect of marriage and parenthood on the division of housework. Theoretically, this study draws on gender construction theory. The key premise of this theory is that gender is something that is performed and created in interaction, and, as a result, something fluid and flexible rather than fixed and stable. The idea that couples 'do gender' through housework has been a major theoretical breakthrough. Gender-neutral explanations of the division of household labour, positing rationally acting individuals, have failed to explain why women continue to perform an unequal share of housework, regardless of socio-economic status. By contrast, gender construction theory situates gender as the key process in dividing household labour. By performing and avoiding certain housework chores, couples fulfil social norms of what it means to be a man and a woman, although, given the emphasis on human agency in producing and contesting gender, couples are able to negotiate alternative gender roles which, in turn, feed back into the structure of social norms in an ever-changing societal landscape. This study adds depth to the doing-gender approach by testing whether or not couples negotiate specific conjugal and parental roles in terms of the division of household labour. Both transitions are hypothesised to produce a more traditional division of household labour. Data come from the British Household Panel Survey, a large, nationally representative quantitative survey that has been carried out annually since 1991. Here, the data track the same 776 couples at two separate time points, 1996 and 2005. OLS regression is used to test whether or not transitions to marriage and parenthood have a significant impact on the division of household labour whilst controlling for a host of relevant socio-economic factors. Results indicate that marriage has no significant effect on how couples partition housework. Couples making the transition from cohabitation to marriage do not show significant changes in housework arrangements compared with couples who remain cohabiting in both waves. On the other hand, becoming parents does lead to a more traditional division of household labour whilst controlling for socio-economic factors which accompany the move to parenthood. There is then some evidence that couples use the site of household labour to 'do parenthood' and generate identities which both use and inform socially prescribed notions of what it means to be a mother and a father. Support for socio-economic explanations of the division of household labour was mixed, although it remains clear that they alone cannot explain how households divide housework.
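As a loose illustration of the analysis described above, the sketch below sets up a change-score OLS regression of the kind the abstract refers to, using Python's statsmodels. The file name, variable names and control set are hypothetical rather than the study's actual specification.

```python
# A minimal sketch, not the authors' code: regress the change in the woman's
# share of housework between the two waves on transition indicators, with
# socio-economic controls. All column names below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

couples = pd.read_csv("bhps_couples_1996_2005.csv")  # hypothetical panel extract

model = smf.ols(
    "housework_share_change ~ married_between_waves + became_parents"
    " + income_share_change + work_hours_change + education_woman + education_man",
    data=couples,
).fit()

# The coefficients on the transition dummies are the tests of interest.
print(model.summary())
```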
Abstract:
Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide as well as the share of these articles available openly on the Web, either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches. Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies which have counted the copies in identified repositories, since we start from a random sample of articles and then test whether copies can be found by a Web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the ongoing debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
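The core of the sampling approach described above can be illustrated with a simple binomial proportion estimate; the sketch below uses placeholder counts, not the study's data.

```python
# Estimate the share of articles with a usable open copy from a random sample,
# with a normal-approximation 95% confidence interval. Counts are placeholders.
import math

sample_size = 1000          # hypothetical number of sampled articles
open_copies_found = 113     # hypothetical number with a copy found via web search

p_hat = open_copies_found / sample_size
se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"estimated open access share: {p_hat:.1%} (95% CI {low:.1%} to {high:.1%})")
```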
Abstract:
First, in Essay 1, we test whether it is possible to forecast Finnish Options Index return volatility by examining the out-of-sample predictive ability of several common volatility models with alternative well-known methods, and find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Secondly, in Essay 2, the aggregated volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is more substantial than the market and industry components, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Thirdly, in Essay 3, we are among the first in the literature to search for firm-specific determinants of idiosyncratic volatility in a multivariate setting, and find for the cross-section of stocks listed on the Helsinki Stock Exchange that industrial focus, trading volume and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama and French three-factor model with local and international benchmark portfolios), whereas firm age and size are negatively related to idiosyncratic volatility.
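As an illustration of the kind of out-of-sample volatility forecast comparison Essay 1 describes, the sketch below scores two deliberately simple forecasters (a rolling historical standard deviation and an EWMA) by out-of-sample RMSE on simulated returns; the essay's models and data are different.

```python
# Toy out-of-sample volatility forecast comparison on simulated returns.
# Not the thesis models: just a rolling historical forecast versus an EWMA.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 2000)        # placeholder daily returns
window, lam = 60, 0.94

idx = range(window, len(returns) - window)
realized = np.array([returns[t:t + window].std() for t in idx])   # target: next-window vol
hist_fc = np.array([returns[t - window:t].std() for t in idx])    # rolling historical forecast

ewma_var, ewma_fc = np.var(returns[:window]), []
for t in idx:
    ewma_var = lam * ewma_var + (1 - lam) * returns[t - 1] ** 2   # RiskMetrics-style update
    ewma_fc.append(np.sqrt(ewma_var))
ewma_fc = np.array(ewma_fc)

def rmse(forecast):
    return np.sqrt(np.mean((forecast - realized) ** 2))

print(f"historical RMSE: {rmse(hist_fc):.5f}, EWMA RMSE: {rmse(ewma_fc):.5f}")
```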
Abstract:
In the thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is based directly on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small-sample correction for the likelihood ratio (LR) test of cointegrating rank with the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock price data, and the finite-sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden 1970-2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen's LR tests for cointegration. In all papers we work with two data sets. The first data set is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)-2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
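For readers who want to reproduce the flavour of the LR (trace) test for cointegrating rank discussed above, the sketch below uses the Johansen procedure available in statsmodels on a hypothetical file of stock index series; the bootstrap and small-sample corrections studied in the thesis are not shown.

```python
# Minimal Johansen trace test sketch; data file and settings are placeholders.
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

prices = pd.read_csv("stock_indices.csv", index_col=0)   # hypothetical monthly index levels

result = coint_johansen(prices, det_order=0, k_ar_diff=2)
for r, (stat, cv) in enumerate(zip(result.lr1, result.cvt)):
    print(f"H0: rank <= {r}: trace statistic {stat:.2f}, 5% critical value {cv[1]:.2f}")
```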
Abstract:
The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by, e.g., Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced in implied volatility. This has two important consequences. First, implied volatility is actually expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequently occurring periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market assessed ex ante, the classic regression test tends to reject the hypothesis of informational efficiency even if markets are informationally efficient.
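The 'classic forecasting regression' mentioned above is simply realized volatility regressed on implied volatility, with informational efficiency implying a zero intercept and a unit slope; a minimal sketch with hypothetical column names follows.

```python
# Sketch of the classic forecasting regression: realized vol on implied vol.
# Column names and the data file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

vols = pd.read_csv("volatility_series.csv")   # columns: realized_vol, implied_vol

fit = smf.ols("realized_vol ~ implied_vol", data=vols).fit()
print(fit.params)                       # a slope well below 1 is the usual finding
print(fit.t_test("implied_vol = 1"))    # test of the unit-slope (efficiency) restriction
```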
Abstract:
This paper presents results of triaxial compression tests on sand reinforced with different types of geosynthetics in different layer configurations, carried out to study the effect of the quantity of reinforcement and the tensile strength of the geosynthetic material on the mechanical behavior of geosynthetic-reinforced sand. The reinforcement types used are woven geotextile, geogrid, and polyester film. The layer configurations used are two, three, four, and eight horizontal reinforcing layers in a triaxial test sample. From the triaxial tests, it is found that the geosynthetic reinforcement imparts cohesive strength to otherwise cohesionless sand. The effect of reinforcement on the friction angle was found to be insignificant. The magnitude of the imparted apparent cohesion is found to depend not only on the tensile strength of the geosynthetic material but also on the surface roughness changes during loading. Special triaxial tests using rice flour as the reinforced medium, microscopic images, and surface roughness studies revealed the effect of indent formation on the surface of the polyester film, which was the reason for the unusually high strength exhibited by the sand reinforced with polyester film.
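As a hedged illustration of how apparent cohesion and friction angle are typically extracted from triaxial results (not the paper's own data or procedure), the sketch below fits the Mohr-Coulomb relation sigma1_f = Kp*sigma3 + 2*c*sqrt(Kp), where Kp = tan^2(45 + phi/2).

```python
# Fit apparent cohesion c and friction angle phi from triaxial failure stresses.
# The stresses below are made-up numbers for illustration only.
import numpy as np

sigma3 = np.array([50.0, 100.0, 150.0])      # confining pressures, kPa (placeholder)
sigma1_f = np.array([210.0, 395.0, 580.0])   # major principal stresses at failure, kPa (placeholder)

Kp, intercept = np.polyfit(sigma3, sigma1_f, 1)          # slope = tan^2(45 + phi/2)
phi = 2.0 * (np.degrees(np.arctan(np.sqrt(Kp))) - 45.0)
cohesion = intercept / (2.0 * np.sqrt(Kp))
print(f"friction angle ~ {phi:.1f} deg, apparent cohesion ~ {cohesion:.1f} kPa")
```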
Abstract:
A novel system for recognition of handprinted alphanumeric characters has been developed and tested. The system can be employed for recognition of either alphabetic or numeric characters by contextually switching to the corresponding branch of the recognition algorithm. The two major components of the system are the multistage feature extractor and the decision-logic tree-type categorizer. The importance of “good” features over sophistication in the classification procedures was recognized, and the feature extractor is designed to extract features based on a variety of topological, morphological and similar properties. An information feedback path is provided between the decision logic and the feature extractor units to facilitate an interleaved or recursive mode of operation. This ensures that only those features essential to the recognition of a particular sample are extracted each time. Test implementation has demonstrated the reliability of the system in recognizing a variety of handprinted alphanumeric characters with close to 100% accuracy.
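As a loose modern analogue of the "features plus tree-type categorizer" structure described above (not the original system), the sketch below trains a decision tree on raw pixel features of handwritten digits using scikit-learn.

```python
# Decision-tree classifier on handwritten digits; illustrative only.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=12, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```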
Abstract:
The growing interest in sequencing with higher throughput over the last decade has led to the development of new sequencing applications. This thesis concentrates on optimizing DNA library preparation for the Illumina Genome Analyzer II sequencer. The library preparation steps that were optimized include fragmentation, PCR purification and quantification. DNA fragmentation was performed with focused sonication at different concentrations and durations. Two column-based PCR purification methods, a gel matrix method and a magnetic bead-based method were compared. Quantitative PCR and on-chip gel electrophoresis were compared for DNA quantification. Magnetic bead purification was found to be the most efficient and flexible purification method. The fragmentation protocol was changed to produce longer fragments in order to be compatible with longer sequencing reads. Quantitative PCR correlates better with the cluster number and should thus be considered the default quantification method for sequencing. As a result of this study, more data have been acquired from sequencing at lower cost, and troubleshooting has become easier as qualification steps have been added to the protocol. New sequencing instruments and applications will create a demand for further optimization in the future.
Abstract:
Gene expression is one of the most critical factors influencing the phenotype of a cell. As a result of several technological advances, measuring gene expression levels has become one of the most common molecular biological measurements used to study the behaviour of cells. The scientific community has produced an enormous and constantly increasing collection of gene expression data from various human cells, both from healthy and pathological conditions. However, while each of these studies is informative and enlightening in its own context and research setup, diverging methods and terminologies make it very challenging to integrate existing gene expression data into a more comprehensive view of human transcriptome function. On the other hand, bioinformatic science advances only through data integration and synthesis. The aim of this study was to develop biological and mathematical methods to overcome these challenges, to construct an integrated database of the human transcriptome, and to demonstrate its usage. Methods developed in this study can be divided into two distinct parts. First, the biological and medical annotation of the existing gene expression measurements needed to be encoded with systematic vocabularies. There was no single existing biomedical ontology or vocabulary suitable for this purpose, so new annotation terminology was developed as part of this work. The second part was to develop mathematical methods correcting the noise and systematic differences/errors in the data caused by the various array generations. Additionally, there was a need to develop suitable computational methods for sample collection and archiving, unique sample identification, database structures, data retrieval and visualization. Bioinformatic methods were developed to analyze gene expression levels and putative functional associations of human genes by using the integrated gene expression data. A method to interpret individual gene expression profiles across all the healthy and pathological tissues of the reference database was also developed. As a result of this work, 9783 human gene expression samples measured by Affymetrix microarrays were integrated to form a unique human transcriptome resource, GeneSapiens. This makes it possible to analyse expression levels of 17330 genes across 175 types of healthy and pathological human tissues. Application of this resource to interpret individual gene expression measurements allowed identification of the tissue of origin with 92.0% accuracy among 44 healthy tissue types. A systematic analysis of the transcriptional activity levels of 459 kinase genes was performed across 44 healthy and 55 pathological tissue types, and a genome-wide analysis of kinase gene co-expression networks was carried out. This analysis revealed biologically and medically interesting data on putative kinase gene functions in health and disease. Finally, we developed a method for alignment of gene expression profiles (AGEP) to analyse individual patient samples and pinpoint gene- and pathway-specific changes in the test sample relative to the reference transcriptome database. We also showed how large-scale gene expression data resources can be used to quantitatively characterize changes in the transcriptomic program of differentiating stem cells. Taken together, these studies indicate the power of systematic bioinformatic analyses to infer biological and medical insights from existing published datasets as well as to facilitate the interpretation of new molecular profiling data from individual patients.
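The tissue-of-origin idea mentioned above can be illustrated, very loosely, by matching a new expression profile to the reference tissue with which it correlates most strongly; the sketch below is not the AGEP/GeneSapiens implementation and uses random placeholder data.

```python
# Toy tissue-of-origin matching by correlation against reference profiles.
import numpy as np

rng = np.random.default_rng(1)
n_genes, tissues = 500, ["liver", "kidney", "brain"]
reference = {t: rng.normal(size=n_genes) for t in tissues}   # mean profile per tissue (placeholder)

sample = reference["kidney"] + rng.normal(scale=0.5, size=n_genes)  # noisy kidney-like profile
best = max(tissues, key=lambda t: np.corrcoef(sample, reference[t])[0, 1])
print("predicted tissue of origin:", best)
```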
Abstract:
We present a simplified theoretical formulation of the thermoelectric power (TP) under magnetic quantization in quantum wells (QWs) of nonlinear optical materials on the basis of a newly formulated magneto-dispersion law. We consider the anisotropies in the effective electron masses and the spin-orbit constants within the framework of k.p formalism by incorporating the influence of crystal field splitting. The corresponding results for III-V materials form a special case of our generalized analysis under certain limiting conditions. The TP in QWs of Bismuth, II-VI, IV-VI and stressed materials has been studied by formulating appropriate electron magneto-dispersion laws. We also show that the TP exhibits composite oscillations with a varying quantizing magnetic field in QWs of n-Cd3As2, n-CdGeAs2, n-InSb, p-CdS, stressed InSb, PbTe and Bismuth, reflecting the combined signatures of the magnetic and spatial quantization of the carriers in such structures. The TP also decreases with increasing electron statistics, and under the condition of non-degeneracy all the results derived in this paper reduce to the well-known classical equation of TP, thus confirming the compatibility test. We have also suggested an experimental method of determining the elastic constants in such systems with arbitrary carrier energy spectra from the known value of the TP.
Abstract:
Fly ash is a waste by-product obtained from the burning of coal in thermal power plants for generating electricity. When bulk quantities are involved, it is stored wet rather than dry in order to arrest the fugitive dust. Fly ash contains trace concentrations of heavy metals and other substances in sufficient quantities to be able to leach out over a period of time. In this study an attempt was made to study the leachabilities of a few selected trace metals, Cd, Cu, Cr, Mn, Pb and Zn, from two different types of class F fly ashes. Emphasis is also laid on developing an alternative for arresting the relative leachabilities of heavy metals by amending the ashes with suitable additives. A standard laboratory leaching test for combustion residues has been employed to study the leachabilities of these trace elements as a function of liquid-to-solid ratio and pH. The leachability tests were conducted on powdered fly ash samples before and after amending them with the matrices lime and gypsum; the amended samples were compacted to their respective Proctor densities and cured for periods of 28 and 180 days. A marked reduction in the relative leachabilities of the trace elements was observed at the end of 28 days. These relative leachability values reduced further, though only marginally, when tests were performed at the end of 180 days.
Abstract:
Conventional random access scan (RAS) testing has lower test application time, power dissipation, and test data volume than standard serial scan chain based designs. In this paper, we present two cluster-based techniques, namely Serial Input Random Access Scan and Variable Word Length Random Access Scan, to reduce test application time even further by exploiting the parallelism among the clusters and performing write operations on multiple bits. Experimental results on benchmark circuits show, on average, a 2-3 times speed-up in test write time and an average 60% reduction in write test data volume.
Abstract:
Six models (simulators) are formulated and developed with all possible combinations of pressure and saturation of the phases as primary variables. A comparative study of the six simulators with two numerical methods, the conventional simultaneous method and the modified sequential method, is carried out. The results of the numerical models are compared with laboratory experimental results to study the accuracy of the models, especially in heterogeneous porous media. From the study it is observed that the simulator using the pressure and saturation of the wetting fluid (PW, SW formulation) is the best among the models tested. Many simulators with the non-wetting phase as one of the primary variables did not converge when used with the simultaneous method. Based on simulator 1 (the PW, SW formulation), a comparison of different solution methods, namely the simultaneous method, the modified sequential method and the adaptive solution modified sequential method, is carried out on four test problems, including heterogeneous and randomly heterogeneous problems. It is found that the modified sequential and adaptive solution modified sequential methods could halve the memory requirement, and the CPU time required by these methods is much lower than that of the simultaneous method. It is also found that the simulator with PNW and PW as the primary variables, which had convergence problems with the simultaneous method, converged using both the modified sequential method and the adaptive solution modified sequential method. The present study indicates that the pressure and saturation formulation together with the adaptive solution modified sequential method is the best among the different simulators and methods tested.
Abstract:
Test results of 24 reinforced concrete wall panels in two-way action (i.e., supported on all four sides) and subjected to in-plane vertical load are presented. The load is applied at an eccentricity to represent the possible accidental eccentricity that occurs in practice due to constructional imperfections. The influences of aspect ratio, thinness ratio, slenderness ratio, vertical steel, and horizontal steel on the ultimate load are studied. Two equations are proposed to predict the ultimate load carried by the panels. The first equation is empirical and is arrived at by trial-and-error fitting to the test data. The second equation is semi-empirical and is developed from a modification of the buckling strength of thin rectangular plates. Both equations are formulated so as to give a safe prediction of a large portion of the ultimate strength test results. Also, the ultimate load, cracking load and lateral deflections of identical panels in two-way action (all four sides supported) and one-way action (top and bottom sides only supported) are compared.