949 results for Gossip columns


Relevance: 10.00%

Abstract:

This project involves carrying out load tests in the cloister of Girona Cathedral in order to determine the load carried by the cloister's stone columns. Once this load is known, it will be possible to decide whether the columns are suitable for the planned change of use of the upper floor, and to give a definitive answer to one of the two hypotheses put forward: whether or not a relieving arch forms over the columns of the cloister. In addition, a treatment proposal for the cloister columns will be prepared, which will consist of choosing, on the basis of a series of numerical calculations, which columns should be repaired and which should be replaced.

Relevance: 10.00%

Abstract:

Structural concrete is one of the most commonly used construction materials in the United States. However, due to changes in design specifications, aging, vehicle impact, and other factors, there is a need for new procedures for repairing concrete (reinforced or prestressed) superstructures and substructures. Thus, the overall objective of this investigation was to develop innovative, cost-effective repair methods for various concrete elements. In consultation with the project advisory committee, it was decided to evaluate the following three repair methods:
• Carbon fiber reinforced polymers (CFRPs) for use in repairing damaged prestressed concrete bridges
• Fiber reinforced polymers (FRPs) for preventing chloride penetration of bridge columns
• Various patch materials
The initial results of these evaluations are presented in this three-volume final report. Each evaluation is briefly described in the following paragraphs. A more detailed abstract of each evaluation accompanies the volume on that particular investigation.

Relevance: 10.00%

Abstract:

Stability berms are commonly constructed where roadway embankments cross soft or unstable ground. Under certain circumstances, the construction of stability berms can cause unfavorable environmental impacts, either directly or indirectly, through their effect on wetlands, endangered species habitat, stream channelization, longer culvert lengths, larger right-of-way purchases, and construction access limits. In an ever more restrictive regulatory environment, these impacts are problematic. The result is the loss of valuable natural resources to the public, lengthy permitting review processes for the department of transportation and permitting agencies, and additional expenditures of time and money for all parties. The purpose of this project was to review existing stability berm alternatives for potential use in environmentally sensitive areas. The project also evaluates how stabilization technologies are made feasible, desirable, and cost-effective for transportation projects, and determines which alternatives afford practical solutions for avoiding and minimizing impacts to environmentally sensitive areas. An online survey of engineers at state departments of transportation was also conducted to assess the frequency and cost-effectiveness of the various stabilization technologies. Geotechnical engineers who responded to the survey overwhelmingly use geosynthetic reinforcement as a suitable and cost-effective solution for stabilizing embankments and cut slopes. In contrast, chemical stabilization and the installation of lime/cement columns are rarely employed as remediation measures by state departments of transportation.

Relevance: 10.00%

Abstract:

We report four patients who presented with a severe form of metaphyseal chondromatosis in association with D-2-hydroxyglutaric aciduria (D-2-HGA). All patients showed splaying columns of irregular ossification defects with bulbous metaphyses of the long tubular bones, as well as remarkable involvement of the short tubular and flat bones. The vertebral bodies revealed platyspondyly with irregular, stippled endplates. D-2-HGA has been described as a neurometabolic disorder manifesting a broad range of impairment in mental and motor development. Although hydroxyglutaric acid was excreted in high amounts in the urine of all four patients described herein, no significant neurologic abnormalities were evident. This unusual combination of characteristic skeletal and metabolic abnormalities has rarely been reported. Thus, our report will facilitate the recognition of this distinctive entity, and we suggest that a urine organic acid screening be obtained in patients who present with generalized enchondromatosis.

Relevance: 10.00%

Abstract:

Nearly full-length circumsporozoite protein (CSP) from Plasmodium falciparum, the C-terminal fragments of both P. falciparum and P. yoelii CSP, and a fragment comprising 351 amino acids of P. vivax MSP1 were expressed in the slime mold Dictyostelium discoideum. Discoidin-tag expression vectors allowed both high yields of these proteins and their purification by a nearly single-step procedure. We exploited the galactose-binding activity of Discoidin Ia to separate the fusion proteins by affinity chromatography on Sepharose-4B columns. Inclusion of a thrombin recognition site allowed cleavage of the Discoidin-tag from the fusion protein. Partial secretion of the protein was obtained via an ER-independent pathway, whereas routing the recombinant proteins to the ER resulted in glycosylation and retention. Yields of proteins ranged from 0.08 to 3 mg/l depending on the protein sequence and the purification conditions. The recognition of purified MSP1 by sera from P. vivax malaria patients was used to confirm the native conformation of the protein expressed in Dictyostelium. The simple purification procedure described here, based on Sepharose-4B, should facilitate the expression and large-scale purification of various Plasmodium polypeptides.

Relevance: 10.00%

Abstract:

Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represent the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. 
The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
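The scaling ambiguity this abstract refers to can be seen in a minimal sketch of the common starting point of all such displays (not the paper's standard-biplot scaling itself): every SVD-based biplot splits the singular values of the centred data matrix between row and column coordinates via an exponent alpha, and different software packages choose alpha differently. The small data matrix below is made up for illustration.

```python
import numpy as np

# Hypothetical 4 cases x 3 variables data matrix.
X = np.array([[4.0, 2.0, 1.0],
              [2.0, 5.0, 3.0],
              [1.0, 3.0, 6.0],
              [5.0, 1.0, 2.0]])
Xc = X - X.mean(axis=0)                # centre each column

# SVD: Xc = U diag(s) V^T
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The non-unique part: split the singular values with an exponent alpha.
alpha = 1.0                            # alpha=1: rows in principal coordinates
F = U * s**alpha                       # row (case) coordinates
G = Vt.T * s**(1 - alpha)              # column (variable) coordinates

# Scalar products of the first two coordinate columns approximate Xc;
# using all columns reproduces it exactly.
approx2d = F[:, :2] @ G[:, :2].T
```

Whatever alpha is chosen, the product F G^T is the same centred matrix; only the relative spread of row versus column points changes, which is exactly why a common scale for both sets of points is not automatic.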

Relevance: 10.00%

Abstract:

Identification of post-translational modifications of proteins in biological samples often requires access to preanalytical purification and concentration methods. In the purification step, high or low molecular weight substances can be removed by size exclusion filters, and highly abundant proteins can be removed, or low-abundance proteins enriched, by specific capturing tools. This paper describes the experience and results obtained with a recently emerged, easy-to-use affinity purification kit for enrichment of the low amounts of EPO found in urine and plasma specimens. The kit can be used as a pre-step in the EPO doping control procedure, as an alternative to the commonly used ultrafiltration, for detecting aberrantly glycosylated isoforms. The commercially available affinity purification kit contains small disposable anti-EPO monolith columns (6 µL volume, Ø7 mm, length 0.15 mm) together with all required buffers. A 24-channel vacuum manifold was used for simultaneous processing of samples. The column concentrated EPO from 20 mL of urine down to a 55 µL eluate, with a concentration factor of 240, while roughly 99.7% of non-relevant urine proteins were removed. The recoveries of Neorecormon (epoetin beta) and the EPO analogues Aranesp and Mircera applied to buffer were high: 76%, 67% and 57%, respectively. The recovery of endogenous EPO from human urine was 65%. High recoveries were also obtained when purifying human, mouse and equine EPO from serum, and human EPO from cerebrospinal fluid. Evaluation with the accredited EPO doping control method based on isoelectric focusing (IEF) showed that the affinity purification procedure did not change the isoform distribution of rhEPO, Aranesp, Mircera or endogenous EPO. The kit should be particularly useful for applications in which it is essential to avoid carry-over effects, a problem commonly encountered with conventional particle-based affinity columns.
The encouraging results with EPO suggest that similar affinity monoliths, with the appropriate antibodies, should constitute useful tools for general applications in sample preparation, not only for doping control of EPO and other hormones such as growth hormone and insulin, but also for the study of post-translational modifications of other low-abundance proteins in biological and clinical research, and for sample preparation prior to in vitro diagnostics.

Relevance: 10.00%

Abstract:

Models incorporating more realistic models of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach, called SDCP, to solving CDLP based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but coincides with CDLP for the case of non-overlapping segments. If the number of elements in a consideration set for a segment is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound by (i) simulations, called the randomized concave programming (RCP) method, and (ii) adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially obtaining the CDLP value, and gives excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets in the literature.
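As background to the choice models these formulations are built on, the following is a minimal sketch of the standard multinomial logit (MNL) choice probabilities, the demand model the abstract mentions: a customer offered set S buys product j with probability v_j / (v_0 + sum of v_i over S), where v_0 is the no-purchase weight. The preference weights below are hypothetical, and this is only the demand side, not the CDLP/SDCP optimization itself.

```python
# Sketch of MNL choice probabilities for a customer segment.
# All numeric weights here are made-up illustrations.
def mnl_choice_probs(v, offer_set, v0=1.0):
    """Return {product: purchase probability} for products in offer_set.

    v         : dict mapping product -> MNL preference weight
    offer_set : products offered to the customer
    v0        : no-purchase weight (the remaining probability mass
                is the probability of buying nothing)
    """
    denom = v0 + sum(v[j] for j in offer_set)
    return {j: v[j] / denom for j in offer_set}

v = {0: 2.0, 1: 1.0, 2: 0.5}           # hypothetical preference weights
probs = mnl_choice_probs(v, {0, 1})    # offer only products 0 and 1
# probs[0] = 2/4 = 0.5, probs[1] = 1/4 = 0.25; no-purchase prob = 0.25
```

Note the substitution effect that makes the firm's offer-set decision non-trivial: removing product 1 from the offer set shifts some (but not all) of its demand onto product 0 and onto the no-purchase option.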

Relevance: 10.00%

Abstract:

Models incorporating more realistic models of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. When there are products that are being considered for purchase by more than one customer segment, CDLP is difficult to solve, since column generation is known to be NP-hard. However, recent research indicates that a formulation based on segments with cuts imposing consistency (SDCP+) is tractable and approximates the CDLP value very closely. In this paper we investigate the structure of the consideration sets that make the two formulations exactly equal. We show that if the segment consideration sets follow a tree structure, CDLP = SDCP+. We give a counterexample to show that cycles can induce a gap between the CDLP and the SDCP+ relaxation. We derive two classes of valid inequalities, called flow and synchronization inequalities, to further improve SDCP+, based on cycles in the consideration set structure. We give a numerical study showing the performance of these cycle-based cuts.

Relevance: 10.00%

Abstract:

In order to interpret a biplot it is necessary to know which points, usually the variables, are the important contributors to the solution, and this information is available separately as part of the biplot's numerical results. We propose a new scaling of the display, called the contribution biplot, which incorporates this diagnostic directly into the graphical display, showing visually the important contributors and thus facilitating the biplot interpretation and often simplifying the graphical representation considerably. The contribution biplot can be applied to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. In the contribution biplot one set of points, usually the rows of the data matrix, optimally represents the spatial positions of the cases or sample units, according to some distance measure that usually incorporates some form of standardization unless all data are comparable in scale. The other set of points, usually the columns, is represented by vectors that are related to their contributions to the low-dimensional solution. A fringe benefit is that usually only one common scale for row and column points is needed on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot legible. Furthermore, this version of the biplot also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important when they are in fact contributing minimally to the solution.

Relevance: 10.00%

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to "spectral mapping", a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
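A rough sketch of the kind of computation described above, with the row and column weights taken as the table margins as in correspondence analysis (the paper's exact weighting may differ): the log of the positive table is double-centred with these weights and then decomposed, so that only ratios of the data values can influence the result. The data matrix is invented.

```python
import numpy as np

# Hypothetical positive two-way table (e.g. compositional data).
X = np.array([[10.0, 20.0, 70.0],
              [30.0, 30.0, 40.0],
              [50.0, 25.0, 25.0]])

# Assumed weighting: row and column masses from the table margins,
# as in correspondence analysis.
r = X.sum(axis=1) / X.sum()            # row weights, sum to 1
c = X.sum(axis=0) / X.sum()            # column weights, sum to 1

L = np.log(X)
L = L - (L @ c)[:, None]               # weighted row centring
L = L - (r @ L)[None, :]               # weighted column centring

# Weighted SVD: decompose diag(sqrt(r)) L diag(sqrt(c)).
S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
U, s, Vt = np.linalg.svd(S, full_matrices=False)
# First two singular values/vectors give the planar display.
```

The double centring is what makes the analysis depend only on log-ratios: multiplying any row of X by a constant adds a constant to that row of log X, which the row centring removes.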

Relevance: 10.00%

Abstract:

This paper establishes a general framework for metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables as well as preserving all the good properties of principal axis methods such as principal components and correspondence analysis, based on the singular-value decomposition, including the decomposition of variance into components along principal axes which provide the numerical diagnostics known as contributions. The idea is inspired from the chi-square distance in correspondence analysis which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters which are estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path in decomposing a matrix and displaying its rows and columns in biplots.
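For orientation, here is a minimal sketch of the unweighted special case on which the weighted method builds, namely classical metric scaling (principal coordinates): square the distances, double-centre, and eigendecompose. The paper's extra weight-estimation step is not reproduced here, and the data matrix is invented.

```python
import numpy as np

# Hypothetical individuals-by-variables matrix (4 individuals, 2 variables).
X = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 2.0],
              [3.0, 1.0]])
n = X.shape[0]

# Squared Euclidean distances between individuals.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)

# Double-centring: B = -0.5 * J D2 J with J = I - (1/n) 11^T.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D2 @ J

# Eigendecomposition of B gives principal coordinates; the eigenvalues
# decompose the variance along principal axes (the "contributions").
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1]              # sort eigenvalues descending
coords = V[:, idx[:2]] * np.sqrt(np.maximum(w[idx[:2]], 0.0))
```

Because the example data are two-dimensional, the two recovered coordinates reproduce the original inter-individual distances exactly; with higher-dimensional data the same construction gives the best low-dimensional fit.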

Relevance: 10.00%

Abstract:

We consider the joint visualization of two matrices which have common rows and columns, for example multivariate data observed at two time points or split according to a dichotomous variable. Methods of interest include principal components analysis for interval-scaled data, or correspondence analysis for frequency data or ratio-scaled variables on commensurate scales. A simple result in matrix algebra shows that by setting up the matrices in a particular block format, matrix sum and difference components can be visualized. The case when we have more than two matrices is also discussed, and the methodology is applied to data from the International Social Survey Program.
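The matrix-algebra result alluded to above can be illustrated with a known identity for matched matrices (this is only a sketch of the algebra, not the paper's full method): stacking A and B in the block format [[A, B], [B, A]] separates the sum and difference components, since conjugating by the orthogonal matrix Q = (1/sqrt(2)) [[I, I], [I, -I]] block-diagonalizes it into diag(A+B, A-B). The matrices below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))        # e.g. data at time point 1
B = rng.standard_normal((3, 3))        # same rows/columns at time point 2

# Block format with the two matched matrices on and off the diagonal.
M = np.block([[A, B], [B, A]])

# Orthogonal symmetric matrix mixing the two copies of the row/column space.
I = np.eye(3)
Q = np.block([[I, I], [I, -I]]) / np.sqrt(2)

# Q M Q is block-diagonal: diag(A + B, A - B), so an SVD of M analyses
# the sum and difference components simultaneously.
D = Q @ M @ Q
```

In particular, the singular values of M are the union of those of A+B and A-B, which is why one joint decomposition displays both components.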

Relevance: 10.00%

Abstract:

The network choice revenue management problem models customers as choosing from an offer-set, and the firm decides the best subset to offer at any given moment to maximize expected revenue. The resulting dynamic program for the firm is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, under the choice-set paradigm, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper, starting with a concave program formulation based on segment-level consideration sets called SDCP, we add a class of constraints called product constraints that project onto subsets of intersections. In addition we propose a natural direct tightening of the SDCP called ?SDCP, and compare the performance of both methods on the benchmark data sets in the literature. Both the product constraints and the ?SDCP method are very simple and easy to implement, and are applicable to the case of overlapping segment consideration sets. In our computational testing on the benchmark data sets in the literature, SDCP with product constraints achieves the CDLP value at a fraction of the CPU time taken by column generation, and we believe it is a very promising approach for quickly approximating CDLP when segment consideration sets overlap and the consideration sets themselves are relatively small.