842 results for "consistency in indexing"


Relevance: 80.00%

Abstract:

This study presents a new inventory to assess thought-action fusion (TAF). One hundred sixty college students aged 18 to 22 (M = 19.17, SD = 1.11) completed the new Modified Thought-Action Fusion Scale (MTAFS). Results indicated high internal consistency for the MTAFS (Cronbach's α = .95). A principal component analysis suggested a three-factor solution of TAF-Moral (TAFM), TAF-Likelihood (TAFL), and TAF-Harm avoidance-Positive (TAFHP), all with eigenvalues above 1 and factor loadings above .4. A second study examined the association between TAF and obsessive-compulsive and anxiety tendencies after the activation of TAF-like thought processes in a nonclinical sample (n = 76). Subjects were randomly assigned to one of three treatment groups intended to provoke TAFL-self, TAFL-other, and TAF-Moral thought processes. Stepwise regression analyses revealed that: 1) the Obsessive-Compulsive Inventory subscales Neutralizing and Ordering significantly predicted instructed neutralization behavior (INB) in non-clinical participants; and 2) TAF-Likelihood contributed significant unique variance in INB. These findings suggest that the provocation of neutralization behavior may be mediated by specific subsets of TAF and obsessive-compulsive tendencies.
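As a hedged illustration of the internal-consistency statistic reported above, here is a minimal sketch of Cronbach's α; the two-item score matrix is made up for the example, not data from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly correlated items give alpha = 1.0
scores = np.array([[1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)
print(round(cronbach_alpha(scores), 2))  # → 1.0
```

Values near .95, as reported for the MTAFS, indicate that the items measure a single construct very consistently.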

Relevance: 80.00%

Abstract:

BACKGROUND: Several approaches can be used to determine the order of loci on chromosomes and hence develop maps of the genome. However, all mapping approaches are prone to errors arising either from technical deficiencies or from a lack of statistical support to distinguish between alternative orders of loci. The accuracy of genome maps could be improved, in principle, if information from different sources were combined to produce integrated maps. The publicly available bovine genomic sequence assembly with 6x coverage (Btau_2.0) is based on whole-genome shotgun sequence data and limited mapping data; however, it is recognised that this assembly is a draft that contains errors. Correcting the sequence assembly requires extensive additional mapping information to improve the reliability of the ordering of sequence scaffolds on chromosomes. The radiation hybrid (RH) map described here has been contributed to the international sequencing project to aid this process. RESULTS: An RH map for the 30 bovine chromosomes is presented. The map was built using the Roslin 3000-rad RH panel (BovGen RH map) and contains 3966 markers, including 2473 new loci in addition to 262 amplified fragment-length polymorphisms (AFLPs) and 1231 markers previously published with the first-generation RH map. Sequences of the mapped loci were aligned with published bovine genome maps to identify inconsistencies. In addition to differences in the order of loci, several cases were observed where the chromosomal assignment of loci differed between maps. All the chromosome maps were aligned with the current 6x bovine assembly (Btau_2.0) and 2898 loci were unambiguously located in the bovine sequence. The order of loci on the RH map for BTA 5, 7, 16, 22, 25 and 29 differed substantially from the assembled bovine sequence. Of the 2898 loci unambiguously identified in the bovine sequence assembly, 131 mapped to different chromosomes in the BovGen RH map.
CONCLUSION: Alignment of the BovGen RH map with other published RH and genetic maps showed higher consistency in marker order and chromosome assignment than with the current 6x sequence assembly. This suggests that the bovine sequence assembly could be significantly improved by incorporating additional independent mapping information.

Relevance: 80.00%

Abstract:

The reliable quantification of gene copy number variations is a precondition for future investigations regarding their functional relevance. To date, there is no generally accepted gold standard method for copy number quantification, and methods in current use have given inconsistent results in selected cohorts. In this study, we compare two methods for copy number quantification. Beta-defensin gene copy numbers were determined in parallel in 80 genomic DNA samples by real-time PCR and multiplex ligation-dependent probe amplification (MLPA). The pyrosequencing-based paralog ratio test (PPRT) was used as a standard of comparison in 79 of the 80 samples. Real-time PCR and MLPA results confirmed concordant DEFB4, DEFB103A, and DEFB104A copy numbers within samples. These two methods showed identical results in 32 of the 80 samples; 29 of these 32 samples comprised four or fewer copies. The coefficient of variation of MLPA was lower than that of real-time PCR. In addition, the consistency between MLPA and PPRT was higher than either the PCR/MLPA or the PCR/PPRT consistency. In summary, these results suggest that MLPA is superior to real-time PCR for beta-defensin copy number quantification.
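The method comparison above rests on the coefficient of variation, i.e. the relative scatter of replicate measurements. A minimal sketch, using hypothetical replicate copy-number estimates rather than the study's data:

```python
import statistics

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean (often quoted as a percentage)."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate copy-number estimates for one DNA sample
mlpa_replicates = [4.0, 4.1, 3.9, 4.0]
pcr_replicates = [4.0, 4.6, 3.4, 4.2]

# A lower CV means tighter replicates, i.e. a more precise assay
print(coefficient_of_variation(mlpa_replicates)
      < coefficient_of_variation(pcr_replicates))  # → True
```

A lower CV for MLPA than for real-time PCR is what the abstract reports as grounds for preferring MLPA.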

Relevance: 80.00%

Abstract:

OBJECTIVES The aim of the current Valve Academic Research Consortium (VARC)-2 initiative was to revisit the selection and definitions of transcatheter aortic valve implantation (TAVI) clinical endpoints to make them more suitable to the present and future needs of clinical trials. In addition, this document is intended to expand the understanding of patient risk stratification and case selection. BACKGROUND A recent study confirmed that VARC definitions have already been incorporated into clinical and research practice and represent a new standard for consistency in reporting clinical outcomes of patients with symptomatic severe aortic stenosis (AS) undergoing TAVI. However, as the clinical experience with this technology has matured and expanded, certain definitions have become unsuitable or ambiguous. METHODS AND RESULTS Two in-person meetings (held in September 2011 in Washington, DC, and in February 2012 in Rotterdam, The Netherlands) involving VARC study group members, independent experts (including surgeons, interventional and noninterventional cardiologists, imaging specialists, neurologists, geriatric specialists, and clinical trialists), the US Food and Drug Administration (FDA), and industry representatives, provided much of the substantive discussion from which this VARC-2 consensus manuscript was derived. This document provides an overview of risk assessment and patient stratification that need to be considered for accurate patient inclusion in studies. Working groups were assigned to define the following clinical endpoints: mortality, stroke, myocardial infarction, bleeding complications, acute kidney injury, vascular complications, conduction disturbances and arrhythmias, and a miscellaneous category including relevant complications not previously categorized. Furthermore, comprehensive echocardiographic recommendations are provided for the evaluation of prosthetic valve (dys)function. Definitions for the quality of life assessments are also reported. 
These endpoints formed the basis for several recommended composite endpoints. CONCLUSIONS This VARC-2 document has provided further standardization of endpoint definitions for studies evaluating the use of TAVI, which will improve the comparability and interpretability of study results, adding to a growing body of evidence on TAVI and/or surgical aortic valve replacement. This initiative and document can furthermore serve as a model for current endeavors to apply definitions to other transcatheter valve therapies (for example, mitral valve repair).

Relevance: 80.00%

Abstract:

Recent findings on childhood leukaemia incidence near nuclear installations have raised questions that can be answered neither by current knowledge of radiation risk nor by other established risk factors. In 2012, a workshop was organised on this topic with two objectives: (a) to review results and discuss the methodological limitations of studies near nuclear installations; and (b) to identify directions for future research into the causes and pathogenesis of childhood leukaemia. The workshop gathered 42 participants from different disciplines, extending well beyond the radiation protection field. Regarding proximity to nuclear installations, the need for continuous surveillance of childhood leukaemia incidence was highlighted, including better characterisation of the local population. The creation of collaborative working groups was recommended to ensure consistency in methodologies and the possibility of combining data for future analyses. Regarding the causes of childhood leukaemia, major fields of research were discussed (environmental risk factors, genetics, infections, immunity, stem cells, experimental research). The need for multidisciplinary collaboration in developing research activities was underlined, including estimating the prevalence of potential predisposition markers and further investigating the infectious aetiology hypothesis. Animal studies and genetic/epigenetic approaches appear to be of great interest. Routes for future research were pointed out.

Relevance: 80.00%

Abstract:

Hybrid zones are regions where individuals from genetically differentiated populations meet and mate, resulting in at least some offspring of mixed ancestry. Patterns of gene flow (introgression) in hybrid zones vary across the genome, allowing assessment of the role of individual genes or genome regions in reproductive isolation. Here, we document patterns of introgression between two recently diverged species of field crickets. We sampled at a very fine spatial scale and genotyped crickets for 110 highly differentiated single nucleotide polymorphisms (SNPs) identified through transcriptome scans. Using both genomic and geographic cline analysis, we document remarkably abrupt transitions (<100 m) in allele frequencies for 50 loci, despite high levels of gene flow at other loci. These are among the steepest clines documented for any hybridizing taxa. Furthermore, the cricket hybrid zone provides one of the clearest examples of the semi-permeability of species boundaries. Comparisons between data from the fine-scale transect and data (for the same set of markers) from sampling a much larger area in a different region of the cricket hybrid zone reveal consistent patterns of introgression for individual loci. The consistency in patterns of introgression between these two distant and distinct regions of the hybrid zone suggests that strong selection is acting to maintain abrupt discontinuities within the hybrid zone and that genomic regions with restricted introgression likely include genes that contribute to non-ecological prezygotic barriers.
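Geographic cline analysis of the kind described above is commonly based on a sigmoid allele-frequency curve whose "width" is the inverse of the maximum slope. A minimal sketch under that assumption (the center and width values below are hypothetical, not estimates from the cricket data):

```python
import math

def cline(x, center, width):
    """Sigmoid geographic cline: allele frequency as a function of distance x.
    Frequency rises from ~0 to ~1 across the zone; 'width' is 1/(max slope)."""
    return 1.0 / (1.0 + math.exp(-4.0 * (x - center) / width))

# A steep cline: width 100 m centered at x = 0
print(round(cline(0.0, center=0.0, width=100.0), 2))    # → 0.5
print(round(cline(200.0, center=0.0, width=100.0), 3))  # → 1.0 (approx)
```

Clines narrower than ~100 m, as reported for 50 of the loci, mean allele frequencies flip almost completely within a few tens of meters.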

Relevance: 80.00%

Abstract:

Time series of satellite measurements are used to describe patterns of surface temperature and chlorophyll associated with the 1996 cold La Nina phase and the 1997-1998 warm El Nino phase of the El Nino-Southern Oscillation cycle in the upwelling region off northern Chile. Surface temperature data are available through the entire study period. Sea-viewing Wide Field-of-view Sensor (SeaWiFS) data first became available in September 1997, during a relaxation in El Nino conditions identified by in situ hydrographic data. Over the period of coincident satellite data, chlorophyll patterns closely track surface temperature patterns. Increases both in nearshore chlorophyll concentration and in the cross-shelf extension of elevated concentrations are associated with decreased coastal temperatures during both the relaxation in El Nino conditions in September-November 1997 and the recovery from El Nino conditions after March 1998. Between these two periods, during austral summer (December 1997 to March 1998) and maximum El Nino temperature anomalies, temperature patterns normally associated with upwelling were absent and chlorophyll concentrations were minimal. Cross-shelf chlorophyll distributions appear to be modulated by surface temperature frontal zones and are positively correlated with a satellite-derived upwelling index. Frontal zone patterns and the upwelling index in 1996 imply an austral summer nearshore chlorophyll maximum, consistent with SeaWiFS data from 1998-1999, after the El Nino. SeaWiFS retrievals in the data set used here are higher than in situ measurements by a factor of 2-4; however, the consistency of the offset suggests relative patterns are valid.

Relevance: 80.00%

Abstract:

Background. Cardiovascular disease (CVD) has striking public health significance due to its high prevalence and mortality, as well as its huge economic burden all over the world, especially in industrialized countries. Major risk factors of CVDs have been the targets of population-wide prevention in the United States. Economic evaluations provide structured information about the efficiency of resource utilization, which can inform decisions on resource allocation. The main purpose of this review is to investigate the pattern of study design of economic evaluations for interventions of CVDs.
Methods. Primary journal articles published during 2003-2008 were systematically retrieved via relevant keywords from Medline, the NHS Economic Evaluation Database (NHS EED), and EBSCO Academic Search Complete. Only full economic evaluations for narrowly defined CVD interventions were included in this review. The methodological data of interest were extracted from the eligible articles and reorganized in a Microsoft Access database. Chi-square tests in SPSS were used to analyze the associations between pairs of categorical variables.
Results. One hundred and twenty eligible articles were reviewed after two steps of literature selection with explicit inclusion and exclusion criteria. Descriptive statistics were reported regarding the evaluated interventions, outcome measures, unit costing, and cost reports. The chi-square test of the association between prevention level of intervention and category of time horizon showed no statistical significance. The chi-square test showed that sponsor type was significantly associated with whether the new or the standard intervention was concluded to be more cost-effective.
Conclusions. Tertiary prevention and medication interventions are the major interests of economic evaluators. The majority of the evaluations were presented from either a provider's or a payer's perspective.
Almost all evaluations adopted a gross costing strategy for unit cost data rather than micro costing. EQ-5D is the most commonly used instrument for subjective outcome measurement. More than half of the evaluations used decision analytic modeling techniques. The lack of consistency in study design standards in published evaluations appears in several respects. Prevention level of intervention is not likely to be a factor in evaluators' decisions on whether to design an evaluation over a lifetime horizon. Published evaluations sponsored by industry are more likely to conclude that the new intervention is more cost-effective than the standard intervention.
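The chi-square tests of association used in the review can be illustrated with a small, self-contained sketch. The 2x2 table below is hypothetical, not data from the review; the Pearson statistic is compared with the 3.841 critical value for 1 degree of freedom at α = 0.05.

```python
def chi2_statistic(table):
    """Pearson chi-square statistic for a 2-D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: sponsor type vs. "new intervention favored"
table = [[30, 10],   # industry-sponsored: favored / not favored
         [20, 20]]   # other sponsors:     favored / not favored
stat = chi2_statistic(table)
print(stat > 3.841)  # → True (significant at alpha = 0.05, 1 df)
```

A statistic above the critical value is what the review describes as a significant association between sponsor type and the study conclusion.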

Relevance: 80.00%

Abstract:

Measurements of atmospheric radioactivity attached to aerosols are described. Fallout was collected in a vessel of large area, with emphasis on the separation of "wet" and "dry" samples. For strontium-90, a ratio of "wet" to "dry" fallout of 5:1 was found, independent of latitude. The total fallout was smaller than comparable values from continents because of the very small amounts of rainfall in the equatorial zone. In order to achieve consistency in the global balance, better knowledge not only of radioactivity but also of precipitation over the ocean is required. Fallout of Ra-D clearly shows the ITC acting as a barrier to the latitudinal movement of near-sea-surface air masses. The concentration of short-lived emanation daughters shows large variations according to varying geographic conditions. A variation with time could not be explained. The specific activity of long-lived radioactive substances shows the expected effect of the ITC as well as a seasonal diminution of average concentration, similar to that measured at Heidelberg.

Relevance: 80.00%

Abstract:

Ocean Drilling Program Legs 127 and 128 in the Japan Sea have revealed the existence of numerous dark-light rhythms of remarkable consistency in sediments of late Miocene, latest Pliocene, and especially Pleistocene age. Light-colored units within these rhythms are massive or bioturbated, consist of diatomaceous clays, silty clays, or nannofossil-rich clays, and are generally poor in organic matter. Dark-colored units are homogeneous, laminated, or thinly bedded and include substantial amounts of biogenic material such as well-preserved diatoms, planktonic foraminifers, calcareous nannofossils, and organic matter (maximum 7.4 wt%). The dark-light rhythms show a similar geometrical pattern on three different scales. First-order rhythms consist of a cluster dominated by dark-colored units followed by a cluster dominated by light-colored units (3-5 m). Spectral analysis of a gray-value time series suggests that the frequencies of the first-order rhythms in sediments of latest Pliocene and Pleistocene age correlate with the obliquity and eccentricity cycles. Second-order dark-light rhythms comprise a light- and a dark-colored unit (10-160 cm). They formed over time spans of several hundred to several tens of thousands of years, with variance centering around 10,500 yr; this frequency may correspond to half the precessional cycle. Third-order rhythms appear as laminated or thinly bedded dark-light couplets (2-15 mm) within the dark-colored units of the second-order rhythms and may represent annual frequencies. In interpreting the rhythms, we have to take into account that (1) the occurrence of the first- and second-order rhythms is not necessarily restricted to glacial or interglacial periods, as shown by preliminary stable-isotope analysis and comparison with the published δ18O record; (2) they appear to be Milankovitch-controlled; and (3) a significant number of the rhythms are sharply bounded.
The origin of the dark-light rhythms is probably related to variations in monsoonal activity in the Japan Sea, which occur at annual frequencies but also operate in phase with the orbital cycles.
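Spectral analysis of a gray-value record like the one described above can be sketched with a simple FFT periodogram. The series below is a synthetic 41 kyr (obliquity-like) sine wave, not the Leg 127/128 data:

```python
import numpy as np

def dominant_period(series, dt):
    """Return the period (in the units of dt) of the strongest spectral peak."""
    series = np.asarray(series, dtype=float) - np.mean(series)
    power = np.abs(np.fft.rfft(series)) ** 2
    freqs = np.fft.rfftfreq(len(series), d=dt)
    peak = np.argmax(power[1:]) + 1   # skip the zero-frequency (mean) bin
    return 1.0 / freqs[peak]

# Synthetic "gray value" record sampled every 1 kyr with a 41 kyr cycle
t = np.arange(0, 820, 1.0)
gray = np.sin(2 * np.pi * t / 41.0)
print(round(dominant_period(gray, dt=1.0)))  # → 41
```

Recovering a ~41 kyr peak from such a record is the kind of evidence used to link the first-order rhythms to the obliquity cycle.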

Relevance: 80.00%

Abstract:

Geometric design consistency can be defined as the relationship between the geometric characteristics of a road and what the driver expects to find when driving on it. If these two aspects correspond, driving is smoother and unexpected events are minimized, which improves traffic safety. Although several geometric recommendations exist to ensure consistent designs, they are not always applied successfully, and the study of methods to evaluate design consistency on existing and planned routes has only begun in recent years. Most existing methods consider only the horizontal alignment of the road, overlooking both the vertical alignment and the coordination that must exist between the two. This doctoral thesis proposes a method to evaluate the geometric design consistency of two-lane rural highways that considers all three aspects: the horizontal alignment, the vertical alignment, and the coordination between them. To achieve this, several alignment indices that evaluate horizontal and vertical geometric characteristics were thoroughly analyzed to determine their correlation with traffic accidents. The Vertical Curvature Change Rate (VCCR) showed the highest correlation, and rating thresholds for this index were established. To complement the evaluation, the operating speed profile was selected, a procedure that has been extensively tested by several researchers, and an operating speed prediction model adapted to Colombia was developed. To study the coordination between the horizontal and vertical alignments, several geometric combinations of the two were examined. Some of these combinations generate undesirable losses of visibility, so a new index (Irt) was defined to detect such cases numerically, as they are undesirable from the point of view of traffic safety. The combination of these three elements allows a comprehensive evaluation of the different aspects that affect the geometric design consistency of a highway. The methodology was applied to the study of design consistency on several Spanish and Colombian roads located in different types of terrain.
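One common formulation of a vertical curvature change rate sums the absolute grade changes at a segment's vertical curves and divides by the segment length. The sketch below uses that assumed definition with hypothetical numbers; the thesis's exact VCCR formula may differ.

```python
def vccr(grade_changes_percent, length_km):
    """Vertical Curvature Change Rate (illustrative definition): sum of the
    absolute grade changes at the vertical curves of a segment, divided by
    the segment length, in %/km."""
    return sum(abs(g) for g in grade_changes_percent) / length_km

# Hypothetical 2 km segment with three vertical curves
print(vccr([3.0, -4.0, 2.0], length_km=2.0))  # → 4.5
```

Higher values indicate a more broken vertical profile, which is why such an index can be correlated with accident rates and given rating thresholds.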

Relevance: 80.00%

Abstract:

One of the most used methods in rapid prototyping is Fused Deposition Modeling (FDM), which provides components with reasonable strength in plastic materials such as ABS and has a low environmental impact. However, the FDM process exhibits low levels of surface finish, difficulty in producing complex and/or small geometries, and low consistency in "slim" elements of the parts. Furthermore, "cantilever" elements need large material structures to support them. Addressing these deficiencies requires a comprehensive review of three-dimensional part design to enhance the advantages and performance of FDM and reduce its constraints. As a key feature of this redesign, a novel method of construction by assembling parts with structural adhesive joints is proposed. These adhesive joints should be designed specifically to suit the plastic substrate and the FDM manufacturing technology. To achieve this, the most suitable structural adhesive must first be selected. Therefore, the present work analyzes five different families of adhesives (cyanoacrylate, polyurethane, epoxy, acrylic, and silicone) and applies multi-criteria decision analysis based on the analytic hierarchy process (AHP) to select the structural adhesive that best combines mechanical performance and adaptation to the FDM manufacturing process.
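The AHP step can be sketched with the row geometric-mean approximation of the priority vector. The 3x3 pairwise comparison matrix below is hypothetical (made-up criteria weights), not the paper's actual judgments:

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                          # normalize to sum to 1

# Hypothetical 3x3 pairwise comparison of adhesive-selection criteria
# (e.g. strength vs. cure time vs. substrate compatibility)
matrix = [[1.0,   3.0, 5.0],
          [1/3.0, 1.0, 2.0],
          [1/5.0, 1/2.0, 1.0]]
weights = ahp_priorities(matrix)
print([round(w, 2) for w in weights])  # → [0.65, 0.23, 0.12]
```

In a full AHP, adhesive families would then be scored against each criterion and ranked by the weighted sum of those scores.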

Relevance: 80.00%

Abstract:

This thesis studies several numerical procedures for solving the dynamics of a multibody system subjected to constraints and impact; the system may be composed of rigid and deformable bodies connected by different types of joints. Special attention is paid to energy-consistent methods, which preserve the theoretical behavior of the energy at each time step: a consistent method keeps the total energy constant in a conservative problem and yields a positive decrease in the total energy when dissipative forces are present. Along these lines, a numerical algorithm consistent with the total energy is developed for solving the dynamical equations of multibody systems. As part of this algorithm, constraints and contact are formulated in an energetically consistent manner using Lagrange multipliers, penalty, and augmented Lagrange methods. A contact methodology is also proposed for rigid bodies whose boundaries are represented by implicit surfaces, based on a suitably regularized constraint formulation adapted both to fulfill the contact constraint exactly and to be consistent with conservation of the total energy. In this context, two approaches are studied: the first, applied to pure elastic contact (without deformation), is formulated with penalty and augmented Lagrange methods; the second is based on a constitutive model for contact with penetration. In this second approach, a penalty potential is used in the constitutive model that restores the energy stored in the contact when no dissipative effects are present and dissipates energy consistently with the continuous model when friction and damping are considered.

Relevance: 80.00%

Abstract:

El objetivo principal de este trabajo de investigación es estudiar las posibilidades de utilización del árido reciclado mixto para un hormigón reciclado en aplicaciones no estructurales, justificando mediante la experimentación la validez para esta aplicación, tanto del árido reciclado como del hormigón reciclado. Esta tesis se centró en los aspectos más restrictivos y limitativos en la utilización de los áridos mixtos en hormigón reciclado, basándose tanto en la normativa internacional existente como en los resultados obtenidos en los estudios bibliográficos consultados. La primera tarea realizada fue la caracterización completa de las propiedades del árido reciclado mixto, recogiendo especialmente los siguientes aspectos: granulometría, contenido de finos, absorción y densidades, composición del árido reciclado, índice de lajas, coeficiente de Los Ángeles, partículas ligeras y contenido de sulfatos. De este estudio de los áridos reciclados, se han destacado relaciones entre las propiedades. Las diferentes correlaciones permiten proponer criterios de calidad de un árido reciclado mixto para un hormigón reciclado. Se ha elegido un árido reciclado mixto entre los estudiados, de características límite admisibles, para obtener resultados conservadores sobre el hormigón reciclado fabricado con él. En una segunda etapa, se ha realizado un estudio de dosificación completo del hormigón reciclado, evaluando la consistencia del hormigón en estado fresco y la resistencia a compresión del hormigón en estado endurecido y se ha comparado con las mismas propiedades de un hormigón convencional. Se ha analizado la capacidad de absorción del árido conseguida con los métodos de presaturación empleados y en función de su estado de humedad, para poder evaluar las relaciones agua/cemento totales y efectivas del hormigón. Se ha estudiado el efecto de estos dos parámetros tanto en la consistencia como en la resistencia del hormigón reciclado. 
Finalmente, se ha estudiado el hormigón fabricado con un 50% y 100% de una partida de árido reciclado mixto de calidad admisible y se han ensayado las siguientes propiedades: consistencia, resistencia a compresión, resistencia a tracción indirecta, módulo de elasticidad dinámico, cambios de longitud, porosidad abierta y microscopía. Para analizar el efecto de los sulfatos, se han añadido artificialmente cantidades de yeso controladas en el hormigón reciclado. Se fabricaron hormigones con dos tipos de cemento, un cemento CEM I 42,5 R con elevado contenido de C3A, que debería dar lugar a expansiones mayores y un cemento con adiciones puzolánicas CEM II A-P 42,5 R, que atenuaría el comportamiento expansivo en el hormigón. Los resultados finales indican que la utilización del árido reciclado mixto en proporciones de hasta un 50%, permiten cubrir la gama de resistencias más exigentes dentro del hormigón no estructural. El contenido de sulfatos puede variar desde un 0,8% hasta un 1,9%, según el tipo de cemento y la proporción de sustitución del árido natural por árido reciclado mixto. Tanto en el caso del árido reciclado como en el hormigón, se ha realizado un estudio comparativo entre el conjunto de datos recopilados en la bibliografía y los obtenidos en este estudio experimental. En varias propiedades del hormigón reciclado, se han comparado los resultados con las fórmulas de la Instrucción EHE-08, para establecer unos coeficientes de corrección a aplicar a un hormigón reciclado con fines no estructurales. The main objective of this investigation work is to study the possibilities of using recycled mixed aggregate for a recycled concrete in non structural applications, justifying by means of experimentation both the validity of the recycled aggregate and recycled concrete. 
This thesis focused on the most restrictive and limiting aspects in the mixed aggregate use in recycled concrete, on the basis of the international standards as well on the results obtained in the bibliographic studies consulted. The first task achieved was the complete charcaterization of the mixed recycled aggregate properties, specially the following aspects: grain size analysis, fines content, absorption and densities, recycled aggregate composition, flakiness index, Los Angeles coefficient, lightweight particles and sulphate content. From this study, correlations between the properties were highlighted. The different correlations make possible to propose quality criterions for recycled mixed aggregate in concrete. Among the recycled aggregates studied, one of acceptable characteristics but near the limits established, was chosen to obtain conservative results in the recycled concrete made with it. In a second step, a complete recycled concrete mix design was made, to evaluate concrete consistency in the fresh state and concrete compressive strength in the hardened state and its properties were compared to those of a control concrete. The aggregate absorption capacity was analized with the presaturation methods achieved and in function of its state of humidity, to evaluate the total and effective water/cement ratios. The effect of these two parameters, both in consistency and compressive strength of recycled concrete, was studied. Finally, the concrete made with 50% and 100% of the elected recycled mixed aggregate was studied and the following concrete properties were tested: consistency, compressive strength, tensile strength, dynamic modulus of elasticity, length changes, water absorption under vacuum and microscopy. To analize the effect of sulphate content, some controlled quantities of gypsum were artificially added to the recycled concrete. 
Concretes were made with two types of cement: a CEM I 42,5 R cement with a high C3A content, which would lead to larger expansions, and a cement with pozzolanic additions, CEM II A-P 42,5 R, which would lower the expansive behaviour of the concrete. The final results indicate that the use of mixed recycled aggregate in proportions of up to 50% makes it possible to cover the most demanding strength range within non-structural concrete. The sulphate content can range between 0.8% and 1.9%, depending on the type of cement and the proportion of natural aggregate replaced by mixed recycled aggregate. For both the recycled aggregate and the concrete, a comparative study was made between the data from the literature and those obtained in the experimental study. For several recycled concrete properties, the results were compared with the formulas of the Spanish structural concrete code (Instruction EHE-08), in order to establish correction coefficients to be applied to recycled concrete for non-structural purposes.
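As an illustration of how such a correction coefficient can be derived, the sketch below compares a measured modulus with the EHE-08 secant-modulus expression E = 8500·fcm^(1/3) (E in MPa, fcm in MPa) for conventional concrete; the measured values are hypothetical, not results from the thesis:

```python
# Illustrative derivation of a correction coefficient for recycled concrete
# against an EHE-08 formula. The expression E = 8500 * fcm**(1/3) is the
# EHE-08 secant modulus for conventional concrete; the "measured" modulus
# below is invented for the example.

def ehe08_modulus(fcm_mpa):
    """EHE-08 secant modulus of elasticity for conventional concrete, MPa."""
    return 8500.0 * fcm_mpa ** (1.0 / 3.0)

def correction_coefficient(measured_e_mpa, fcm_mpa):
    """Ratio of the measured modulus to the EHE-08 prediction."""
    return measured_e_mpa / ehe08_modulus(fcm_mpa)

# Hypothetical recycled concrete: fcm = 27 MPa, measured E = 22 000 MPa.
k = correction_coefficient(22000.0, 27.0)
print(round(k, 2))  # EHE-08 predicts 8500 * 3 = 25500 MPa, so 0.86
```

A coefficient below 1, as in this invented example, reflects the usual finding that recycled-aggregate concrete is less stiff than the code formula predicts for the same strength.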

Relevância:

80.00% 80.00%

Publicador:

Resumo:

The first rigorous processing carried out with the Bernese scientific software, following the strictest internationally recommended computation standards, yielded a high-accuracy point field based on the integration and standardization of the data of a GPS network located in Costa Rica. This processing covered a total of 119 weeks of daily data, about 2.3 years, from January 2009 to April 2011, for a total of 30 GPS stations, of which 22 are located in the national territory of Costa Rica and 8 are international stations belonging to the network of the Geocentric System for the Americas (SIRGAS). The so-called semi-free solutions generated, week by week, a GPS network with high internal accuracy defined by the vectors between the stations and the final coordinates of the satellite constellation. The weekly evaluation given by the repeatability of the solutions yielded average errors of 1.7 mm, 1.4 mm and 5.1 mm in the [n e u] components, confirming the high consistency of these solutions. Although the semi-free solutions have high internal accuracy, they cannot be used for kinematic analysis, because they lack a reference frame. In Latin America, the densification of the International Terrestrial Reference Frame (ITRF) is represented by the SIRGAS network of continuously operating GNSS stations, known as SIRGAS-CON. By means of the so-called weekly final coordinates of the 8 stations used as ties, each of the 119 solutions was referred to the SIRGAS frame. Introducing the SIRGAS reference frame into the semi-free solutions produces deformations in these solutions. These deformations are the product of the kinematics of each of the plates on which the tie stations are located.
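The repeatability measure quoted above (the scatter of each station's weekly [n e u] coordinates about their mean) can be sketched as below; the numbers are invented, and in the project itself this evaluation was carried out within the Bernese processing:

```python
# Minimal sketch of the weekly repeatability measure: the RMS scatter of a
# station coordinate component about its mean over the weekly solutions.
# The weekly values below are invented for illustration.
import math

def repeatability(series_mm):
    """RMS of a coordinate component about its mean (mm)."""
    mean = sum(series_mm) / len(series_mm)
    return math.sqrt(sum((x - mean) ** 2 for x in series_mm) / len(series_mm))

# Weekly north-component values (mm) for one hypothetical station.
north = [2.1, 0.4, -1.5, 1.0, -2.0]
print(round(repeatability(north), 2))  # 1.54
```

Averaging this statistic over all stations, per component, gives network-level figures of the kind quoted above (1.7 mm, 1.4 mm and 5.1 mm in n, e and u).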
After the weekly tie to the SIRGAS coordinates, the velocity vectors of each of the stations were estimated, including the tie stations, whose velocity values are known with high accuracy. To determine the velocities of the Costa Rican stations, a routine based on a least-squares adjustment was programmed in the MatLab environment. The values obtained in this project, compared with the official values, showed average differences on the order of 0.06 cm/yr, -0.08 cm/yr and -0.10 cm/yr respectively for the [X Y Z] coordinates. In this way it was possible to determine the geocentric coordinates [X Y Z]T and their temporal variations [vX vY vZ]T for the set of 22 GPS stations of Costa Rica, within the IGS05 datum, reference epoch 2010.5. Although high accuracy was achieved in the geocentric coordinate vectors of the 22 stations, for some of the stations the computed velocities were not representative due to the relatively short span (less than one year) of data files. Under this premise, the eight stations located in the south of the country were excluded. This meant estimating the local velocity field with only twenty national stations plus three stations in Panama and one in Nicaragua. The algorithm used was Least Squares Collocation, which allows the estimation or interpolation of values from effectively known data, and which was programmed as a routine in the MatLab environment. The resulting field was estimated with a resolution of 30' x 30' and is highly constant, with an average resulting velocity of 2.58 cm/yr in a direction of 40.8° to the northeast. This field was validated against the data of the VEMOS2009 model recommended by SIRGAS.
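The least-squares velocity estimation described above amounts to fitting a straight line x(t) = x0 + v·t to each station's weekly coordinate series. The thesis used a MatLab routine; the NumPy sketch below, with synthetic data, shows the same fit:

```python
# Hedged sketch of station velocity estimation by least squares: fit
# x(t) = x0 + v * t to a weekly coordinate time series. The data are
# synthetic; the thesis implemented this as a MatLab routine.
import numpy as np

def fit_velocity(epochs_yr, coords_m):
    """Return (intercept_m, velocity_m_per_yr) from a least-squares fit."""
    A = np.column_stack([np.ones_like(epochs_yr), epochs_yr])
    (x0, v), *_ = np.linalg.lstsq(A, coords_m, rcond=None)
    return x0, v

# Synthetic series drifting at 2.58 cm/yr, sampled every half year.
t = np.array([2009.0, 2009.5, 2010.0, 2010.5, 2011.0])
x = 1000.0 + 0.0258 * (t - 2009.0)
x0, v = fit_velocity(t - 2009.0, x)
print(round(v * 100, 2))  # velocity in cm/yr: 2.58
```

The same fit, run once per coordinate component, yields the [vX vY vZ]T vector for each station.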
The average velocity differences for the stations used as input for the computation of the field were on the order of +0.63 cm/yr and +0.22 cm/yr for the velocity values in latitude and longitude respectively, which indicates a good determination of the velocity values and of the empirical covariance function required to apply the collocation method. In addition, the grid used as the basis for the interpolation showed differences on the order of -0.62 cm/yr and -0.12 cm/yr for latitude and longitude. Additionally, the results of this work were used as input for an approximation to the definition of the boundary of the so-called Panama Block within the national territory of Costa Rica. The computation of the Euler pole components, by means of a routine programmed in the MatLab environment and applied to different combinations of points, did not contribute much to the physical definition of this boundary. This strategy simply confirmed the difference in the direction of all the velocity vectors and did not reveal in greater detail the location of this zone within the national territory of Costa Rica. ABSTRACT The first rigorous processing performed with the Bernese scientific software, following the strictest internationally recommended computation standards, yielded a high-accuracy point field, based on the integration and standardization of data from a GPS network located in Costa Rica. This processing covered a total of 119 weeks of daily data, about 2.3 years, from January 2009 to April 2011, for a total of 30 GPS stations, of which 22 are located in the territory of Costa Rica and 8 are international stations belonging to the network of the Geocentric System for the Americas (SIRGAS). The semi-free solutions generated, week by week, a GPS network with high internal accuracy defined by the vectors between stations and the final coordinates of the satellite constellation.
The weekly evaluation given by the repeatability of the solutions yielded average errors of 1.7 mm, 1.4 mm and 5.1 mm in the [n e u] components, confirming the high consistency of these solutions. Although the semi-free solutions have high internal accuracy, they cannot be used for kinematic analysis, because they lack a reference frame. In Latin America, the densification of the International Terrestrial Reference Frame (ITRF) is represented by the SIRGAS network of continuously operating GNSS stations, known as SIRGAS-CON. By means of the weekly final coordinates of the 8 stations used as ties, each of the 119 solutions was referred to the SIRGAS frame. The introduction of the SIRGAS frame into the semi-free solutions generates deformations. These deformations are the product of the kinematics of each of the plates on which the tie stations are located. After the weekly tie to the SIRGAS frame, the velocity vectors of each of the stations were estimated; the velocities of the SIRGAS tie stations are known with high accuracy. For this computation, a routine based on a least-squares fit was programmed in the MatLab environment. The values obtained, compared with the official values, gave average differences on the order of 0.06 cm/yr, -0.08 cm/yr and -0.10 cm/yr respectively for the [X Y Z] coordinates. It was thus possible to determine the geocentric coordinates [X Y Z]T and their temporal variations [vX vY vZ]T for the set of 22 GPS stations of Costa Rica, within the IGS05 datum, reference epoch 2010.5. Although high accuracy was achieved in the geocentric coordinate vectors of the 22 stations, for some stations the velocities were not representative because of the relatively short span (less than one year) of data. Under this premise, the eight stations located in the south of the country were excluded. This meant estimating the local velocity field with only twenty national stations plus three stations in Panama and one in Nicaragua.
The algorithm used was Least Squares Collocation, which allows the estimation or interpolation of values from effectively known data; it was programmed in MatLab. The resulting field was estimated with a resolution of 30' x 30' and is highly constant, with an average resulting velocity of 2.58 cm/yr in a direction of 40.8° to the northeast. This field was validated against the data of the VEMOS2009 model recommended by SIRGAS. The average velocity differences for the stations used as input for the computation of the field were on the order of +0.63 cm/yr and +0.22 cm/yr for the velocity values in latitude and longitude, which indicates a good determination of the velocity values and of the empirical covariance function required to apply the collocation method. Furthermore, the grid used as the basis for the interpolation showed differences of about -0.62 cm/yr and -0.12 cm/yr for latitude and longitude. Additionally, the results of this work were used as input for an approximation to the definition of the boundary of the so-called Panama Block within the national territory of Costa Rica. The computation of the Euler pole components, using a routine programmed in MatLab and applied to different combinations of points, did not contribute much to the physical definition of this boundary. The strategy simply confirmed the difference in the direction of all the velocity vectors and did not reveal in greater detail the location of this zone within the national territory of Costa Rica.