970 results for Partial redundancy analysis (pRDA)
Abstract:
The full-scale base-isolated structure studied in this dissertation is the only base-isolated building in the South Island of New Zealand. It sustained hundreds of earthquake ground motions from September 2010 until well into 2012. Several large earthquake responses were recorded in December 2011 by NEES@UCLA and by the GeoNet recording station near Christchurch Women's Hospital. The primary focus of this dissertation is to advance the state of the art in methods for evaluating the performance of seismically isolated structures and the effects of soil-structure interaction, by developing new data processing methodologies that overcome current limitations and by implementing advanced numerical modeling in OpenSees for direct analysis of soil-structure interaction.
This dissertation presents a novel method for recovering force-displacement relations within the isolators of building structures with unknown nonlinearities from sparse seismic-response measurements of floor accelerations. The method requires only direct matrix calculations (factorizations and multiplications); no iterative trial-and-error procedures are involved. The method requires a mass matrix, or at least an estimate of the floor masses; a stiffness matrix may be used, but is not necessary. Essentially, the method operates on a matrix of incomplete measurements of floor accelerations. In the special case of complete floor measurements of systems with linear dynamics, real modes, and equal floor masses, the principal components of this matrix are the modal responses. In the more general case of partial measurements and nonlinear dynamics, the method extracts a number of linearly dependent components from Hankel matrices of measured horizontal response accelerations, assembles these components row-wise, and extracts principal components from the singular value decomposition of this large matrix of linearly dependent components. These principal components are then interpolated between floors in a way that minimizes the curvature energy of the interpolation. This interpolation step can make use of a reduced-order stiffness matrix, a backward-difference matrix, or a central-difference matrix. The measured and interpolated floor acceleration components at all floors are then assembled and multiplied by the mass matrix. The recovered in-service force-displacement relations are then incorporated into the OpenSees soil-structure interaction model.
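A minimal numerical sketch of the core extraction step, written in Python with NumPy, is given below. It operates on synthetic acceleration channels; the function names, Hankel depth and number of retained components are illustrative choices and are not taken from the dissertation.

    import numpy as np

    def hankel_blocks(acc, depth):
        # Stack a delay-embedded (Hankel) block of each measured channel row-wise.
        # acc: (n_channels, n_samples) array of measured floor accelerations.
        n_ch, n_s = acc.shape
        cols = n_s - depth + 1
        blocks = [np.array([ch[i:i + cols] for i in range(depth)]) for ch in acc]
        return np.vstack(blocks)                     # shape (n_channels * depth, cols)

    def principal_components(acc, depth=50, n_comp=3):
        # Extract dominant response components from the SVD of the stacked Hankel matrix.
        H = hankel_blocks(acc, depth)
        H = H - H.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(H, full_matrices=False)
        return s[:n_comp], Vt[:n_comp]               # singular values, component time histories

    # Toy usage: two partially correlated synthetic acceleration channels.
    t = np.linspace(0.0, 20.0, 2001)
    acc = np.vstack([np.sin(2 * np.pi * 1.2 * t),
                     0.6 * np.sin(2 * np.pi * 1.2 * t + 0.3) + 0.1 * np.sin(2 * np.pi * 3.5 * t)])
    sv, comps = principal_components(acc)
    print(sv)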
Numerical simulations of soil-structure interaction involving non-uniform soil behavior are conducted following the development of a complete soil-structure interaction model of Christchurch Women's Hospital in OpenSees. In these 2D OpenSees models, the superstructure is modeled as two-dimensional frames in the short-span and long-span directions, respectively. The lead rubber bearings are modeled as elastomeric bearing (Bouc-Wen) elements. The soil underlying the concrete raft foundation is modeled with linear elastic plane-strain quadrilateral elements. The non-uniformity of the soil profile is incorporated by extracting and interpolating shear wave velocity profiles from the Canterbury Geotechnical Database. The validity of the complete two-dimensional soil-structure interaction OpenSees model for the hospital is checked by comparing the peak floor responses and the force-displacement relations within the isolation system obtained from the OpenSees simulations to the recorded measurements. General explanations and implications, supported by story drifts, floor acceleration and displacement responses, and force-displacement relations, are presented to address the effects of soil-structure interaction.
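For context, the hysteretic force-displacement behaviour that a Bouc-Wen type bearing element produces can be sketched with a simple displacement-driven integration of the standard Bouc-Wen equations in Python; the parameter values below are purely illustrative and are not the calibrated properties of the hospital's lead rubber bearings or of the OpenSees element.

    import numpy as np

    def bouc_wen_force(x, k=1.0, alpha=0.1, beta=0.5, gamma=0.5, n=1.0, A=1.0):
        # Restoring force of a standard Bouc-Wen hysteretic element for a displacement history x:
        #   F = alpha*k*x + (1 - alpha)*k*z
        #   dz = A*dx - beta*|dx|*|z|**(n-1)*z - gamma*dx*|z|**n
        z = 0.0
        F = np.zeros_like(x)
        for i in range(1, len(x)):
            dx = x[i] - x[i - 1]
            z += A * dx - beta * abs(dx) * abs(z) ** (n - 1) * z - gamma * dx * abs(z) ** n
            F[i] = alpha * k * x[i] + (1.0 - alpha) * k * z
        return F

    # Illustrative sinusoidal displacement cycles; plotting F against x traces hysteresis loops.
    t = np.linspace(0.0, 10.0, 5001)
    x = 0.05 * np.sin(2 * np.pi * 0.5 * t)
    F = bouc_wen_force(x)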
Abstract:
A substantial amount of the information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing volume of data being produced, however, pushes data analytic platforms to their limits. This thesis proposes techniques for more efficient textual big data analysis suitable for the Hadoop analytic platform. The research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC distinguishes between informational and functional content and compresses only the informational content. The compressed data thus remains transparent to existing software libraries, which often rely on the functional content to work. Secondly, a context-free, bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. It uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes are extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated on a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they show substantial improvements in performance and significant reductions in system resource requirements.
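As a rough illustration of the content-aware idea (not the CaPC algorithm itself), the toy Python sketch below dictionary-codes only the whitespace-separated word content and leaves the delimiters untouched, so line- and record-oriented tools still see the structure they expect; the helper names and code format are hypothetical.

    import re
    from collections import Counter

    def build_codes(text, top_k=256):
        # Map the most frequent words to short escape-prefixed codes; delimiters are never coded.
        words = re.findall(r"\S+", text)
        common = [w for w, _ in Counter(words).most_common(top_k)]
        return {w: "\x01%02x" % i for i, w in enumerate(common)}

    def partial_compress(text, codes):
        # Replace only the informational tokens; whitespace and newlines stay intact.
        return re.sub(r"\S+", lambda m: codes.get(m.group(0), m.group(0)), text)

    def partial_decompress(text, codes):
        inverse = {c: w for w, c in codes.items()}
        return re.sub(r"\S+", lambda m: inverse.get(m.group(0), m.group(0)), text)

    sample = "alpha beta alpha\ngamma beta alpha\n"
    codes = build_codes(sample)
    packed = partial_compress(sample, codes)
    assert partial_decompress(packed, codes) == sample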
Abstract:
Contemporary cnidarian-algae symbioses are challenged by increasing CO2 concentrations (ocean warming and acidification), which affect organisms' biological performance. We examined the natural variability of carbon and nitrogen isotopes in the symbiotic sea anemone Anemonia viridis to investigate dietary shifts (autotrophy/heterotrophy) along a natural pCO2 gradient at the island of Vulcano, Italy. δ13C values for both the algal symbionts (Symbiodinium) and the host tissue of A. viridis became significantly lighter with increasing seawater pCO2. Together with a decrease in the difference between the δ13C values of the two fractions at the higher pCO2 sites, these results indicate a greater net autotrophic input to the A. viridis carbon budget under high pCO2 conditions. δ15N values and C/N ratios did not change in Symbiodinium or host tissue along the pCO2 gradient. Additional physiological parameters revealed that anemone protein and Symbiodinium chlorophyll a remained unaltered among sites. Symbiodinium density was similar among sites, yet the mitotic index increased in anemones under elevated pCO2. Overall, our findings show that A. viridis is characterized by a higher autotrophic/heterotrophic ratio as pCO2 increases. The unique trophic flexibility of this species may give it a competitive advantage and enable its potential acclimation and ecological success under future increased ocean acidification.
Abstract:
This work outlines the theoretical advantages of multivariate methods for biomechanical data, validates the proposed methods and outlines new clinical findings relating to knee osteoarthritis that were made possible by this approach. The new techniques were based on existing multivariate approaches, Partial Least Squares (PLS) and Non-negative Matrix Factorization (NMF), and were validated using existing data sets. The techniques developed, PCA-PLS-LDA (Principal Component Analysis – Partial Least Squares – Linear Discriminant Analysis), PCA-PLS-MLR (Principal Component Analysis – Partial Least Squares – Multiple Linear Regression) and Waveform Similarity (based on NMF), were designed to address the challenging characteristics of biomechanical data: variability and correlation. As a result, these new structure-seeking techniques revealed new clinical findings. The first relates to the relationship between pain, radiographic severity and mechanics. Simultaneous analysis of pain and radiographic severity outcomes, a first in biomechanics, revealed that the knee adduction moment's relationship to radiographic features is mediated by pain in subjects with moderate osteoarthritis. The second clinical finding quantified the importance of neuromuscular patterns in brace effectiveness for patients with knee osteoarthritis. I found that brace effectiveness was more related to the patient's unbraced neuromuscular patterns than to mechanics, and that these neuromuscular patterns were more complicated than simply increased overall muscle activity, as previously thought.
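A minimal Python sketch of a PCA-PLS-LDA style analysis using scikit-learn is shown below; the data layout (time-normalized waveforms flattened into rows) and the component counts are assumptions made for illustration and do not reproduce the thesis implementation.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Illustrative data: 60 subjects x 101 time-normalized gait-waveform samples, two groups.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 101))
    y = np.repeat([0, 1], 30)

    pca = PCA(n_components=10).fit(X)               # capture the correlated waveform variability
    T = pca.transform(X)

    pls = PLSRegression(n_components=3).fit(T, y)   # supervised projection toward the group labels
    S = pls.transform(T)                            # latent scores used as discriminant features

    lda = LinearDiscriminantAnalysis().fit(S, y)    # separate groups in the PLS score space
    print(lda.score(S, y))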
Abstract:
Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancer. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, characteristics of these pathways such as redundancy, feedback, and drug resistance reduce the efficacy of single-target therapy and necessitate the use of more than one drug to target multiple nodes in the system. Choosing multiple targets with a high therapeutic index poses even more challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Regrettably, building such models requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may simply not be available. Fortunately, other modeling tools exist that, although not as powerful as ordinary differential equations, do not need rates and initial conditions to model signaling pathways; Petri nets and graph theory are among them. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph network centrality measures for identifying prospective targets for single and multiple drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets. An algorithm is also developed to check whether the candidate targets are able to disable the intended outputs in the graph model of the system. We implement structural and dynamical models of the ErbB1-Ras-MAPK pathways and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise for identifying and ranking drug targets. Since the methodology uses only the structural information of the signaling pathways and does not need initial conditions or dynamical rates, it can be applied to larger networks.
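The prioritization and output-disabling check can be illustrated with a short Python/networkx sketch on a toy directed graph; the nodes, edges and candidate list below are hypothetical stand-ins for a curated ErbB1-Ras-MAPK model, and the siphon analysis step is not reproduced here.

    import networkx as nx

    # Toy directed signaling graph (hypothetical nodes and edges).
    G = nx.DiGraph([
        ("EGF", "EGFR"), ("EGFR", "RAS"), ("RAS", "RAF"),
        ("RAF", "MEK"), ("MEK", "ERK"), ("EGFR", "PI3K"), ("PI3K", "ERK"),
    ])

    # Prioritize candidate targets (e.g. suggested by siphon analysis) with a centrality measure.
    candidates = ["EGFR", "RAS", "MEK", "PI3K"]
    bc = nx.betweenness_centrality(G)
    ranked = sorted(candidates, key=lambda node: bc[node], reverse=True)

    def disables_output(graph, targets, source="EGF", output="ERK"):
        # Check whether inhibiting (removing) the targets disconnects the output from the input.
        H = graph.copy()
        H.remove_nodes_from(targets)
        return not nx.has_path(H, source, output)

    print(ranked)
    print(disables_output(G, ["EGFR"]))           # a single upstream target blocks both branches
    print(disables_output(G, ["MEK", "PI3K"]))    # a combination blocks the two parallel routes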
Abstract:
Based on an original and comprehensive database of all fiction feature films produced in Mercosur between 2004 and 2012, the paper analyses whether the Mercosur film industry has evolved towards an integrated and culturally more diverse market. It provides a summary of policy opportunities in terms of integration and diversity, emphasizing the limited role played by regional policies. It then shows that, although the Mercosur film industry remains rather disintegrated, it is tending to become more integrated and culturally more diverse. From a methodological point of view, the combination of Social Network Analysis and the Stirling Model opens up interesting research avenues for analysing creative industries in terms of their market integration and cultural diversity.
Abstract:
Research on women's employment has proliferated over recent decades, often from a perspective that conceptualizes female labour market activity as independent of male presences and absences in the productive and reproductive spheres. In contrast to these approaches, the article argues for the need to focus on the couple as the unit of analysis of work-life articulation. After referring to the main theoretical arguments that, from a gender perspective within labour studies, have highlighted the relevance of placing the household as the central space for the analysis of the sexual division of labour, the article reviews various empirical contributions that have incorporated this perspective in the international literature. Next, the state of the art in the Spanish literature is presented, before arguing for the desirability of applying such a framework of analysis to the study of employment and care work in Spanish households, which are currently undergoing major transformations.
Abstract:
In this article we analyze the Debate on the State of the Nation 2014. The methodology consists of coding the speeches of the prime minister, Mariano Rajoy (PP), and the then leader of the opposition, Alfredo Perez Rubalcaba (PSOE), by extracting word clouds, branched maps and word trees that reveal the most common concepts and premises. This preliminary analysis, with its quantitative and qualitative dimensions, makes the subsequent discourse analysis much easier and more viable; there we focus on the different types of arguments in the communicative act: claim/solution, circumstantial premises, goal premises, value premises, means-goal premises, and alternative options/addressing alternative options.
Abstract:
The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium enterprises (SMEs) to develop corporate sites with which to establish relationships with their audiences. This paper, applying the methodology of content analysis, analyzes the main factors and tools that make websites usable and intuitive and that promote better relations between SMEs and their audiences. In addition, an index is developed to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites analyzed have, in general, appropriate levels of usability.
Abstract:
Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned because of problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence to complement the theoretical development of these formats. The investigation analyzes the recall generated by four non-conventional advertising formats in a real environment: the short programme (branded content), television sponsorship, and internal and external telepromotion, compared with the more conventional spot. The methodology integrated secondary data with primary data from computer-assisted telephone interviews (CATI) performed ad hoc on a sample of 2,000 individuals aged 16 to 65, representative of the total television audience. Our findings show that the non-conventional advertising formats are more effective at a cognitive level, generating higher levels of both unaided and aided recall than the spot for all the formats analyzed.
Abstract:
Based on the concept of the triple basic structure of human communication proposed by Poyatos (1994a, 1994b) and on the analytical and theoretical implications that derive from it, the present paper conceives of human communication as an indivisible whole in which verbal communication cannot be separated from body behavior. The paper analyzes the nonverbal categories used in oral communication. The corpus consists of an oral narration in Galician, from which we highlight certain kinemes (minimum units of body movement with meaning) using the model proposed by Bouvet (2001), in order to explain the nonverbal categories with examples taken from these recordings.
Abstract:
In this paper, 36 English-language and 38 Spanish-language news articles were selected from English- and Spanish-language newspapers and magazines published in the U.S.A. between August 2014 and November 2014. All articles discuss the death of Michael Brown, the ensuing protests and the police investigations. A discourse analysis shows that there are few differences between reporting by the mainstream media and by the Hispanic media. Like the mainstream media, the Hispanic media adopts a neutral point of view with regard to the African-American minority; however, it presents a negative opinion of the police. It appears that the Hispanic media does not explicitly side with the African-American community, but rather agrees more with the mainstream media's opinion and is substantially influenced by it.
Abstract:
From ecological tourism to ecotourism: a lexical analysis of an emerging form of tourism. This article deals with the lexicon created in connection with a recent form of tourism: ecological tourism, or ecotourism. The rise of this type of tourism encourages the creation of new concepts and products that are named with new words and expressions formed through different word-formation procedures. Starting from the name ecological tourism itself, later condensed into the shortened form ecotourism, we analyze the formation of other related words, as well as their formal variation and use. For this purpose, we have worked with a specific corpus of electronic tourist texts and with different digital sources and databases.
Abstract:
With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network composed of nodes and edges so that language structure can be analyzed quantitatively. The development of dependency grammar provides theoretical support for the construction of treebank corpora, making a statistical complex-network analysis possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on a treebank of speeches from the EEE-4 oral test. By analyzing the overall characteristics of the networks, including the number of edges, the number of nodes, the average degree, the average path length, the network centrality and the degree distribution, it aims to find potential differences and similarities in the networks across the various grades of speaking performance. Through clustering analysis, the research further intends to demonstrate the discriminating power of the network parameters and to provide a potential reference for scoring speaking performance.
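The global network measures listed above can be computed directly once the dependency arcs of a treebank have been loaded into a graph; the Python/networkx sketch below uses a handful of hypothetical (head, dependent) word pairs rather than the EEE-4 treebank itself.

    import networkx as nx

    # Hypothetical dependency arcs (head word, dependent word) pooled from one speaker.
    arcs = [("like", "I"), ("like", "music"), ("music", "pop"),
            ("like", "because"), ("because", "relaxing"), ("relaxing", "is")]

    G = nx.Graph()                                       # undirected view for the global measures
    G.add_edges_from(arcs)

    n_nodes = G.number_of_nodes()
    n_edges = G.number_of_edges()
    avg_degree = 2 * n_edges / n_nodes
    avg_path_len = nx.average_shortest_path_length(G)    # requires a connected graph
    centrality = nx.degree_centrality(G)
    degree_dist = nx.degree_histogram(G)

    print(n_nodes, n_edges, avg_degree, avg_path_len)
    print(sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:3])
    print(degree_dist)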