937 results for Text Analysis


Relevance: 30.00%

Publisher:

Abstract:

The analysis of Komendant's design of the Kimbell Art Museum was carried out in order to determine the effectiveness of the ring beams, edge beams and prestressing in the shells of the roof system. Finite element analysis was not available to Komendant or other engineers of the time to aid them in the design and analysis. Thus, the use of this tool helped to form a new perspective on the Kimbell Art Museum and analyze the engineer's work. In order to carry out the finite element analysis of the Kimbell Art Museum, the ADINA finite element analysis software was utilized. Eight finite element models (FEM-1 through FEM-8) of increasing complexity were created. The results of the most realistic model, FEM-8, which included ring beams, edge beams and prestressing, were compared to Komendant's calculations. The maximum deflection at the crown of the mid-span surface of -0.1739 in. in FEM-8 was found to be larger than Komendant's deflection in the design documents before the loss in prestressing force (-0.152 in.) but smaller than his prediction after the loss in prestressing force (-0.3814 in.). Komendant predicted a larger longitudinal stress of -903 psi at the crown (vs. -797 psi in FEM-8) and 37 psi at the edge (vs. -347 psi in FEM-8). Considering the concrete strength of 5000 psi, the difference in results is not significant. From the analysis it was determined that both FEM-5, which included prestressing and fixed rings, and FEM-8 can be successfully and effectively implemented in practice. Prestressing was used in both models and thus served as the main contribution to efficiency. FEM-5 showed that ring and edge beams can be avoided; however, an architect might find them more aesthetically appropriate than rigid walls.
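
As a back-of-the-envelope illustration of why the stress discrepancies were judged insignificant, the differences can be expressed as fractions of the 5000 psi concrete strength. The short Python sketch below uses only the figures quoted in this abstract:

    # Stress comparison from the abstract: Komendant's predictions vs. FEM-8.
    f_c = 5000.0                      # psi, concrete strength
    crown_diff = abs(-903 - (-797))   # longitudinal stress difference at the crown
    edge_diff = abs(37 - (-347))      # stress difference at the edge
    print(f"crown: {crown_diff:.0f} psi = {100 * crown_diff / f_c:.1f}% of f'c")
    print(f"edge:  {edge_diff:.0f} psi = {100 * edge_diff / f_c:.1f}% of f'c")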

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this research project is to study an innovative method for the stability assessment of structural steel systems, namely the Modified Direct Analysis Method (MDM). This method is intended to simplify an existing design method, the Direct Analysis Method (DM), by assuming that a sophisticated second-order elastic structural analysis will be employed that can account for member and system instability, thereby allowing the design process to be reduced to confirming the capacity of member cross-sections. This last check can be easily completed by substituting an effective length of KL = 0 into existing member design equations. This simplification will be particularly useful for structural systems in which it is not clear how to define the member slenderness L/r when the laterally unbraced length L is not apparent, such as arches and the compression chord of an unbraced truss. To study the feasibility and accuracy of this new method, a set of 12 benchmark steel structural systems previously designed and analyzed by former Bucknell graduate student Jose Martinez-Garcia, along with a single column, were modeled and analyzed using the nonlinear structural analysis software MASTAN2. A series of Matlab-based programs was prepared by the author to provide the code-checking requirements for investigating the MDM. By comparing MDM and DM results against the more advanced distributed plasticity analysis results, it is concluded that the stability of structural systems can be adequately assessed in most cases using MDM, and that MDM often appears to be a more accurate but less conservative method for assessing stability.
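
To make the KL = 0 simplification concrete, here is a minimal Python sketch of the standard AISC flexural-buckling column equations: with KL = 0 the slenderness reduction disappears and the nominal strength reduces to the cross-section (squash) capacity Fy*A. This only illustrates the cross-section check that MDM relies on, not the MDM procedure itself, and the section properties are assumed values.

    import math

    def aisc_column_strength(Fy, E, A, L, r, K):
        # Nominal compressive strength Pn from the AISC flexural-buckling equations.
        KLr = K * L / r
        if KLr == 0.0:
            Fcr = Fy                              # KL = 0: pure cross-section capacity
        else:
            Fe = math.pi ** 2 * E / KLr ** 2      # elastic buckling stress
            if Fy / Fe <= 2.25:
                Fcr = 0.658 ** (Fy / Fe) * Fy     # inelastic buckling
            else:
                Fcr = 0.877 * Fe                  # elastic buckling
        return Fcr * A

    # Assumed column: Fy = 50 ksi, E = 29000 ksi, A = 20 in^2, L = 240 in, r = 4 in.
    print(aisc_column_strength(50.0, 29000.0, 20.0, 240.0, 4.0, K=1.0))  # ~770 kips
    print(aisc_column_strength(50.0, 29000.0, 20.0, 240.0, 4.0, K=0.0))  # 1000 kips, squash load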

Relevance: 30.00%

Publisher:

Abstract:

A document analysis of institutional websites was conducted to infer the extent to which affiliated campuses are integrated with one another within multi-campus university systems. The factors that contribute to either a common or a differentiated sense of institutional identity, as expressed in the campuses' individual web presences, were a primary focus of the investigation. The study then sought to determine the effect that institutional identity has on the anticipatory socialization of students who relocate from branch campuses to their parent institutions. Based on an analysis of the findings, recommendations for further research in this area were made.

Relevance: 30.00%

Publisher:

Abstract:

The primary objective of this thesis is to demonstrate the pernicious impact that moral hierarchies have on our perception and subsequent treatment of non-human animals. Moral hierarchies in general are characterized by a dynamic in which one group is considered to be fundamentally superior to a lesser group. This thesis focuses specifically on the moral hierarchies that arise when humans are assumed to be superior to non-human animals in virtue of their advanced mental capabilities. The operative hypothesis is that moral hierarchies thwart the provision of justice to non-human animals because they function as a justification for otherwise impermissible actions. When humans are assumed to be fundamentally superior to non-human animals, it becomes morally permissible for humans to kill non-human animals and use them as mere instrumentalities. The thesis is driven primarily by an in-depth analysis of the approaches to animal rights offered by Peter Singer, Tom Regan, and Gary Francione. Each of these thinkers claims to overcome anthropocentrism and to provide an approach that precludes the establishment of a moral hierarchy. One of the major findings of this thesis, however, is that Singer and Regan offer approaches that remain highly anthropocentric despite their claims to the contrary. The anthropocentrism persists in these approaches because each thinker ultimately privileges humans: Regan and Singer have different conceptions of the criteria required to afford a being moral worth, but both give preference to beings that have the cognitive ability to form desires regarding the future. As a result, a moral hierarchy emerges in which humans are regarded as fundamentally superior. Francione, by contrast, provides an approach that does not foster a moral hierarchy. He does so by applying the principle of equal consideration of interests in a consistent manner and by arguing that mere sentience is both a necessary and a sufficient condition for receiving moral consideration. The upshot of this thesis is that the moral treatment of animals is not compatible with the presence of a moral hierarchy; future approaches to animal rights must therefore avoid the establishment of moral hierarchies. The research and analysis within this thesis demonstrate that this is not possible, however, unless all theories of justice that are to accommodate animals abandon the notion that cognition matters morally.

Relevance: 30.00%

Publisher:

Abstract:

Female candidates have become more successful in the political arena, specifically in the United States Senate. Today, women hold twenty percent of Senate seats. Despite this increase, women are still underrepresented in Washington. As such, understanding the roadblocks to equality will help us achieve parity. In an attempt to understand the various challenges that female senatorial candidates face, this project looks at a specific element of their campaigns: TV advertisements. Assessing candidate advertisements will help us understand whether gender affects strategic campaign decisions. Specifically, this project investigates the relationship between candidate gender and the casting and setting of TV advertisements. Does gender influence the makeup of political ad spots? In order to understand this relationship more completely, I employ both quantitative data and case study analysis for same-gender and mixed-gender primary and general election contests in 2004 and 2008. Ultimately, candidate gender has little to no effect on the casting of senatorial advertisements across both election cycles. In contrast to casting, we observe consistent differences across three settings: the political setting, the home setting, and the neighborhood setting. In both 2004 and 2008, female candidates use smaller proportions of ad frames with the political setting than their male counterparts. Female candidates in both election cycles also employ greater proportions of ad frames with the home and neighborhood settings than male candidates. These discrepancies point to a distinction in advertisement strategy depending on the gender of the candidate.

Relevance: 30.00%

Publisher:

Abstract:

This study investigated how individuals retrospectively construe their lives in terms of major life events. Ninety-nine participants sorted a set of personal and historical events in terms of perceived importance for their lives. Analyses of variance with repeated measures and rank comparisons were computed. Overall findings revealed no cohort differences with regard to the perception of life events. However, within-cohort differences were found, indicating that more life events were recalled from the young adult years. Those experiences were also perceived as having been more important in the participants' lives than events from other age segments. With regard to historical events, war-related experiences were among the highest ranked. Analyses of variance revealed intracohort differences but not intercohort differences, indicating higher scores for the time between 1930 and 1948 relative to other historical periods.

Relevance: 30.00%

Publisher:

Abstract:

This study estimates the economic effects of a severance tax on the market for natural gas produced from shale sources using non-conventional extraction methods, such as horizontal drilling and fracking. Results suggest that a severance tax of 5% would increase the price of natural gas by as much as 3.82% and decrease gas extraction by an estimated 1.16% to 9.52%. If applied to the Commonwealth of Pennsylvania in the United States, a 5% severance tax is estimated to raise between US$443 million and $486 million per year in public revenue. The marginal deadweight loss associated with a 5% severance tax is estimated at between 1.27% and 12.85% of the last dollar earned. The burden of this tax falls on both producers and consumers and depends upon the underlying assumptions made regarding the price responsiveness of consumers and producers. Under plausible assumptions, a family consuming 1,000 Mcf (approximately 2.8 × 10^4 m^3) per year of natural gas is estimated to pay an additional $100 per year after the implementation of a 5% severance tax.
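
A rough Python check of the household figure, assuming a pre-tax gas price of about $2.6 per Mcf (an assumed price for illustration, not taken from the study):

    price_per_mcf = 2.6        # USD per Mcf, assumed for illustration
    consumption = 1000         # Mcf per year, from the abstract
    price_increase = 0.0382    # 3.82% price rise from the 5% severance tax
    extra_cost = price_per_mcf * consumption * price_increase
    print(f"extra annual cost: ${extra_cost:.0f}")  # roughly $100 per year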

Relevance: 30.00%

Publisher:

Abstract:

This project entailed a detailed study and analysis of the literary and musical text of Rimsky-Korsakov's opera The Golden Cockerel, involving source study, philological and musical-historical analysis, etc. Goryachikh studied the process of the creation of the opera, paying particular attention to its genre, that of a character fable, which was innovative for its time. He considered both the opera's folklore sources and the influences of the 'conditional theatre' aesthetics of the early 20th century. This culture-based approach made it possible to trace the numerous sources of the plot and its literary and musical text back to professional and folk cultures of Russia and other countries. A comparative study of the vocabulary, style and poetics of the libretto and the poetic system of Pushkin's Tale of the Golden Cockerel revealed much in common between the two. Goryachikh concluded that The Golden Cockerel was intended to be a specific form of 'dialogue' between the author, the preceding cultural tradition, and that of the time when the opera was written. He proposed a new definition of The Golden Cockerel as an 'inverted opera' and studied its structure and essence, its beginnings in the 'laughing culture' and the refraction of its forms and composition in the language of culture. He identified the constructive technique of Rimsky-Korsakov's writing at each level of musical unity and noted its influence on Stravinsky and Prokofiev, also finding anticipations of musical phenomena of the 20th century. He concluded by formulating a research model of Russian classical opera as cultural text and suggested further uses for it in musicology.

Relevance: 30.00%

Publisher:

Abstract:

Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen et al., 2004). The algorithm tests for change-points using a maximal t-statistic with a permutation reference distribution to obtain the corresponding p-value. The number of computations required for the maximal test statistic is O(N^2), where N is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the p-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the "DNAcopy" package of the Bioconductor project (Gentleman et al., 2004). The proposed hybrid method for the p-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
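
To make the permutation reference distribution concrete, the following Python sketch tests for a single change-point with a maximal t-statistic and a permutation p-value. It is a toy, non-circular illustration of the quadratic-cost approach that the hybrid method accelerates, not the DNAcopy implementation; the simulated data are assumptions.

    import numpy as np

    def max_t_statistic(x):
        # Maximal two-sample t-statistic over all split points of the sequence x.
        best = 0.0
        for i in range(2, len(x) - 1):
            left, right = x[:i], x[i:]
            se = np.sqrt(left.var(ddof=1) / len(left) + right.var(ddof=1) / len(right))
            if se > 0:
                best = max(best, abs(left.mean() - right.mean()) / se)
        return best

    def permutation_pvalue(x, n_perm=200, seed=0):
        # Fraction of permutations whose maximal statistic reaches the observed one.
        rng = np.random.default_rng(seed)
        observed = max_t_statistic(x)
        exceed = sum(max_t_statistic(rng.permutation(x)) >= observed for _ in range(n_perm))
        return (exceed + 1) / (n_perm + 1)

    # Toy copy-number profile with a mean shift after marker 50.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0, 1, 50), rng.normal(1, 1, 50)])
    print(permutation_pvalue(x))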

Relevance: 30.00%

Publisher:

Abstract:

Vaccines with limited ability to prevent HIV infection may positively impact the HIV/AIDS pandemic by preventing secondary transmission and disease in vaccine recipients who become infected. To evaluate the impact of vaccination on secondary transmission and disease, efficacy trials assess vaccine effects on HIV viral load and other surrogate endpoints measured after infection. A standard test that compares the distribution of viral load between the infected subgroups of vaccine and placebo recipients does not assess a causal effect of vaccine, because the comparison groups are selected after randomization. To address this problem, we formulate clinically relevant causal estimands using the principal stratification framework developed by Frangakis and Rubin (2002), and propose a class of logistic selection bias models whose members identify the estimands. Given a selection model in the class, procedures are developed for testing and estimation of the causal effect of vaccination on viral load in the principal stratum of subjects who would be infected regardless of randomization assignment. We show how the procedures can be used for a sensitivity analysis that quantifies how the causal effect of vaccination varies with the presumed magnitude of selection bias.
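
A minimal Python sketch of the kind of sensitivity analysis described above, under a monotonicity assumption (the vaccine never causes infection): for each fixed selection-bias parameter beta, infected placebo recipients are weighted by a logistic model for the probability that they would also have been infected under vaccine, with the intercept solved so the weights average to the ratio of infection rates. This is an illustrative reconstruction, not the authors' exact estimator, and the data are simulated.

    import numpy as np

    def expit(z):
        return 1.0 / (1.0 + np.exp(-z))

    def causal_effect(y_vacc, y_plac, rate_vacc, rate_plac, beta):
        # Effect of vaccination on viral load in the always-infected stratum,
        # for a fixed selection-bias parameter beta.
        target = rate_vacc / rate_plac
        lo, hi = -20.0, 20.0
        for _ in range(100):                              # bisection for the intercept alpha
            mid = 0.5 * (lo + hi)
            if expit(mid + beta * y_plac).mean() < target:
                lo = mid
            else:
                hi = mid
        w = expit(0.5 * (lo + hi) + beta * y_plac)        # stratum-membership weights
        return y_vacc.mean() - np.average(y_plac, weights=w)

    # Simulated log10 viral loads among infected vaccine and placebo recipients.
    rng = np.random.default_rng(0)
    y_v, y_p = rng.normal(4.0, 0.7, 40), rng.normal(4.5, 0.7, 80)
    for beta in (-1.0, 0.0, 1.0):
        print(beta, round(causal_effect(y_v, y_p, 0.05, 0.10, beta), 3))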

Relevance: 30.00%

Publisher:

Abstract:

Common goals in epidemiologic studies of infectious diseases include identification of the infectious agent, description of the modes of transmission and characterization of factors that influence the probability of transmission from infected to uninfected individuals. In the case of AIDS, the agent has been identified as the Human Immunodeficiency Virus (HIV), and transmission is known to occur through a variety of contact mechanisms, including unprotected sexual intercourse, transfusion of infected blood products and sharing of needles in intravenous drug use. Relatively little is known about the probability of HIV transmission associated with the various modes of contact, or the role that other cofactors play in promoting or suppressing transmission. Here, transmission probability refers to the probability that the virus is transmitted to a susceptible individual following exposure consisting of a series of potentially infectious contacts. The infectivity of HIV for a given route of transmission is defined to be the per-contact probability of infection. Knowledge of infectivity and its relationship to other factors is important in understanding the dynamics of the AIDS epidemic and in suggesting appropriate measures to control its spread. The primary source of empirical data about infectivity comes from sexual partners of infected individuals. Partner studies consist of a series of such partnerships, usually heterosexual and monogamous, each composed of an initially infected "index case" and a partner who may or may not be infected by the time of data collection. However, because the infection times of both partners may be unknown and the history of contacts uncertain, any quantitative characterization of infectivity is extremely difficult. Thus, most statistical analyses of partner study data involve the simplifying assumption that infectivity is a constant common to all partnerships. The major objectives of this work are to describe and discuss the design and analysis of partner studies, providing a general statistical framework for investigations of infectivity and risk factors for HIV transmission. The development is largely based on three papers: Jewell and Shiboski (1990), Kim and Lagakos (1990), and Shiboski and Jewell (1992).
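
Under the constant-infectivity simplification mentioned above, the transmission probability after n potentially infectious contacts is 1 − (1 − λ)^n, and λ can be estimated from partner-study outcomes by maximum likelihood. A minimal Python sketch with invented partnership data (the numbers are assumptions, not from the cited papers):

    import numpy as np

    def loglik(lam, contacts, infected):
        # Binomial log-likelihood, one partner per partnership:
        # P(infected | n contacts) = 1 - (1 - lam)^n under constant infectivity lam.
        p = 1.0 - (1.0 - lam) ** np.asarray(contacts, dtype=float)
        y = np.asarray(infected, dtype=float)
        return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

    # Invented data: contacts with the index case and the partner's infection status.
    contacts = [50, 200, 10, 500, 120, 80]
    infected = [0,  1,   0,  1,   0,   1]
    grid = np.linspace(1e-4, 0.05, 500)
    lam_hat = grid[np.argmax([loglik(l, contacts, infected) for l in grid])]
    print(f"maximum likelihood per-contact infectivity: {lam_hat:.4f}")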

Relevance: 30.00%

Publisher:

Abstract:

Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently on the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992) may be a useful "test" example for comparison of various procedures.
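
As a rough sketch of what such a 'simple additive model' can look like (an assumed form for illustration; the exact specification in Jewell and Kalbfleisch (1992) is not reproduced here), the hazard of failure at time t could be written as

    λ(t | marker history up to t) = λ0 + β · Y(t),

where Y(t) is the current value of the marker process (for a Poisson marker process, the accumulated count) and β governs how strongly marker accumulation raises the hazard.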

Relevance: 30.00%

Publisher:

Abstract:

We propose robust and efficient tests and estimators for gene-environment/gene-drug interactions in family-based association studies. The methodology is designed for studies in which haplotypes, quantitative phenotypes and complex exposure/treatment variables are analyzed. Using causal inference methodology, we derive family-based association tests and estimators for the genetic main effects and the interactions. The tests and estimators are robust against population admixture and stratification without requiring adjustment for confounding variables. We illustrate the practical relevance of our approach by an application to a COPD study. The data analysis suggests a gene-environment interaction between a SNP in the Serpine gene and smoking status/pack years of smoking that reduces the FEV1 volume by about 0.02 liter per pack year of smoking. Simulation studies show that the proposed methodology is sufficiently powered for realistic sample sizes and that it provides valid tests and effect size estimators in the presence of admixture and stratification.
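
For intuition about the size of the reported interaction, the following Python sketch fits an ordinary least-squares regression with a gene-by-pack-years interaction to simulated data in which carriers lose an extra 0.02 liters of FEV1 per pack-year. It illustrates how such an interaction coefficient is read, not the family-based causal methodology proposed in the study; all data are invented.

    import numpy as np

    # Simulated data: FEV1 (liters), pack-years of smoking, SNP carrier status (0/1).
    rng = np.random.default_rng(1)
    n = 500
    snp = rng.integers(0, 2, n)
    pack_years = rng.gamma(2.0, 10.0, n)
    fev1 = 3.5 - 0.01 * pack_years - 0.02 * snp * pack_years + rng.normal(0, 0.4, n)

    # Ordinary least squares with a gene-by-smoking interaction term.
    X = np.column_stack([np.ones(n), snp, pack_years, snp * pack_years])
    beta, *_ = np.linalg.lstsq(X, fev1, rcond=None)
    print("interaction: about", round(beta[3], 3), "liters FEV1 per pack-year in carriers")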

Relevance: 30.00%

Publisher:

Abstract:

This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the considerable noise and hence random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is an essential aspect.
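
As one concrete example of the normalization step mentioned above, here is a minimal Python sketch of quantile normalization, which forces every array to share the same empirical intensity distribution. It is one common low-level technique, not necessarily the specific methods reviewed in the article; the data are simulated.

    import numpy as np

    def quantile_normalize(log_intensities):
        # Map each array's values onto the average of the per-array sorted values,
        # so all arrays end up with identical empirical distributions.
        ranks = np.argsort(np.argsort(log_intensities, axis=0), axis=0)
        mean_quantiles = np.sort(log_intensities, axis=0).mean(axis=1)
        return mean_quantiles[ranks]

    # Simulated raw intensities for 1000 probes on three arrays with different scales.
    rng = np.random.default_rng(0)
    raw = rng.lognormal(mean=[7.0, 7.5, 8.0], sigma=1.0, size=(1000, 3))
    normalized = quantile_normalize(np.log2(raw))
    print(np.round(normalized.mean(axis=0), 2))   # per-array means now agree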