993 results for Continuous Operating Reference Stations
Abstract:
The clinical demand for a device to monitor Blood Pressure (BP) in ambulatory scenarios with minimal use of inflation cuffs is increasing. Based on the so-called Pulse Wave Velocity (PWV) principle, this paper introduces and evaluates a novel concept of BP monitor that can be fully integrated within a chest sensor. After a preliminary calibration, the sensor provides non-occlusive beat-by-beat estimations of Mean Arterial Pressure (MAP) by measuring the Pulse Transit Time (PTT) of arterial pressure pulses travelling from the ascending aorta towards the subcutaneous vasculature of the chest. In a cohort of 15 healthy male subjects, a total of 462 simultaneous readings consisting of reference MAP and chest PTT were acquired. Each subject was recorded on three different days: D, D+3 and D+14. Overall, the implemented protocol induced MAP values ranging from 80 ± 6 mmHg at baseline to 107 ± 9 mmHg during isometric handgrip maneuvers. Agreement between reference and chest-sensor MAP values was tested using the intraclass correlation coefficient (ICC = 0.78) and Bland-Altman analysis (mean error = 0.7 mmHg, standard deviation = 5.1 mmHg). The cumulative percentage of MAP values provided by the chest sensor falling within ±5 mmHg of the reference MAP readings was 70%, within ±10 mmHg was 91%, and within ±15 mmHg was 98%. These results indicate that the chest sensor complies with the British Hypertension Society (BHS) requirements for Grade A BP monitors when applied to MAP readings. Grade A performance was maintained even two weeks after the initial subject-dependent calibration. In conclusion, this paper introduces a sensor and a calibration strategy to perform MAP measurements at the chest. The encouraging performance of the presented technique paves the way towards an ambulatory-compliant, continuous and non-occlusive BP monitoring system.
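The agreement metrics reported above (Bland-Altman mean error and standard deviation, plus the cumulative percentages within ±5/±10/±15 mmHg used for BHS grading) can be computed directly from paired readings. A minimal sketch in Python; function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def map_agreement(reference, estimate):
    """Bland-Altman bias and SD of the error, plus the cumulative
    percentage of estimates within +/-5, 10 and 15 mmHg of the
    reference MAP readings (the bands used for BHS grading)."""
    err = np.asarray(estimate, float) - np.asarray(reference, float)
    within = {t: 100.0 * np.mean(np.abs(err) <= t) for t in (5, 10, 15)}
    return err.mean(), err.std(ddof=1), within
```

For reference, the commonly cited BHS Grade A thresholds are at least 60%, 85% and 95% of readings within the three bands, which the reported 70/91/98% satisfies.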
Abstract:
Background: The GENCODE consortium was formed to identify and map all protein-coding genes within the ENCODE regions. This was achieved by a combination of initial manual annotation by the HAVANA team, experimental validation by the GENCODE consortium and a refinement of the annotation based on these experimental results. Results: The GENCODE gene features are divided into eight different categories of which only the first two (known and novel coding sequence) are confidently predicted to be protein-coding genes. 5' rapid amplification of cDNA ends (RACE) and RT-PCR were used to experimentally verify the initial annotation. Of the 420 coding loci tested, 229 RACE products have been sequenced. They supported 5' extensions of 30 loci and new splice variants in 50 loci. In addition, 46 loci without evidence for a coding sequence were validated, consisting of 31 novel and 15 putative transcripts. We assessed the comprehensiveness of the GENCODE annotation by attempting to validate all the predicted exon boundaries outside the GENCODE annotation. Out of 1,215 tested in a subset of the ENCODE regions, 14 novel exon pairs were validated, only two of them in intergenic regions. Conclusions: In total, 487 loci, of which 434 are coding, have been annotated as part of the GENCODE reference set available from the UCSC browser. Comparison of GENCODE annotation with RefSeq and ENSEMBL shows that only 40% of GENCODE exons are contained within the two sets, which is a reflection of the high number of alternative splice forms with unique exons annotated. Over 50% of coding loci have been experimentally verified by 5' RACE for EGASP, and the GENCODE collaboration is continuing to refine its annotation of 1% of the human genome with the aid of experimental validation.
Abstract:
Reliable estimates of heavy-truck volumes are important in a number of transportation applications. Estimates of truck volumes are necessary for pavement design and pavement management. Truck volumes are important in traffic safety. The number of trucks on the road also influences roadway capacity and traffic operations. Additionally, heavy vehicles pollute at higher rates than passenger vehicles. Consequently, reliable estimates of heavy-truck vehicle miles traveled (VMT) are important in creating accurate inventories of on-road emissions. This research evaluated three different methods to calculate heavy-truck annual average daily traffic (AADT), which can subsequently be used to estimate VMT. Traffic data from continuous count stations provided by the Iowa DOT were used to estimate AADT for two different truck groups (single-unit and multi-unit) using the three methods. The first method developed monthly and daily expansion factors for each truck group. The second and third methods created general expansion factors for all vehicles. Accuracy of the three methods was compared using n-fold cross-validation. In n-fold cross-validation, data are split into n partitions; each partition in turn is held out for validation while the remaining data are used for estimation. A comparison of the accuracy of the three methods was made using the estimates of prediction error obtained from cross-validation. The prediction error was determined by averaging the squared error between the estimated AADT and the actual AADT. Overall, the prediction error was lowest for the method that developed expansion factors separately for the different truck groups, for both single- and multi-unit trucks. This indicates that use of expansion factors specific to heavy trucks results in better estimates of AADT, and, subsequently, VMT, than using aggregate expansion factors and applying a percentage of trucks. Monthly, daily, and weekly traffic patterns were also evaluated.
Significant variation exists in the temporal and seasonal patterns of heavy trucks as compared to passenger vehicles. This suggests that the use of aggregate expansion factors fails to adequately describe truck travel patterns.
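The cross-validation scheme described above can be sketched generically. In this hedged sketch, all names are illustrative and `fit_expansion_factors` stands in for any of the three methods; each partition is held out in turn and scored by the squared error between estimated and actual AADT:

```python
import numpy as np

def nfold_prediction_error(daily_counts, actual_aadt, fit_expansion_factors, n=5):
    """n-fold cross-validation: hold out each partition in turn, fit the
    expansion-factor model on the remaining stations, and average the
    squared error between estimated and actual AADT over all folds."""
    folds = np.array_split(np.arange(len(daily_counts)), n)
    errors = []
    for k in range(n):
        train = np.concatenate([folds[j] for j in range(n) if j != k])
        model = fit_expansion_factors(daily_counts[train], actual_aadt[train])
        estimated = model(daily_counts[folds[k]])
        errors.append(np.mean((estimated - actual_aadt[folds[k]]) ** 2))
    return float(np.mean(errors))
```

Comparing this prediction error across the three methods is then a matter of passing each fitting routine in turn and ranking the returned errors.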
Abstract:
ABSTRACT: Continuous Positive Airway Pressure (CPAP) ventilation, first used in preterm infants in 1971, is now a technique very widely used in neonatal units. The use of CPAP offers numerous short-term advantages: reduction of the maximal inspired oxygen fraction, of the duration of oxygen therapy, and of the intubation rate and hence of the recourse to mechanical ventilation; reduced use of vasoactive amines, muscle relaxants and morphine; possible prevention of the development of bronchopulmonary dysplasia; and possible reductions in the number of postnatal infections and of necrotizing enterocolitis. However, few studies exist on the long-term effects of CPAP on neurodevelopment and growth, which are the objective of the present study. The systematic use of CPAP as an alternative to mechanical ventilation was introduced in Lausanne in 1998. The target population of this study consists of preterm infants born at less than 32 weeks of gestation or weighing less than 1500 g at birth; they were systematically followed up to preschool age within the cohort study "Unité de Développement, CHUV". The originality of this work lies in evaluating the long-term neurodevelopment and growth of preterm children treated preferentially with CPAP, in comparison with a historical control group treated with other ventilation modes, while taking many other neonatal parameters into account. Regarding neurodevelopment, the use of CPAP shows a tendency to reduce the incidence of intraventricular haemorrhage and the risk of poor neurodevelopment at 6 months. These positive effects on neurodevelopment fade at 18 months of age, where a higher developmental quotient nevertheless persists, and disappear completely at preschool age.
Regarding growth, the use of CPAP shows no effect on weight, but does show a positive effect on height at 6 and 18 months and on head circumference at 6 months, 18 months and at preschool age. Despite the non-randomized nature of this study, the positive results obtained here establish that CPAP can be used without negative effects on neurodevelopment and growth, and thus provide an additional argument for the systematic use of CPAP in preterm infants.
Abstract:
The optimization of the pilot overhead in single-user wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
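The square-root dependence stated above can be made explicit. As a sketch (notation assumed here, not taken verbatim from the paper): let $f_{\mathrm{D}}$ be the normalized Doppler frequency, $\alpha^{*}$ the optimized pilot overhead, and $\Delta C$ the spectral-efficiency penalty relative to a receiver with genie-aided CSI. The leading-order expansions about the no-fading point $f_{\mathrm{D}} \to 0$ take the form

```latex
% Leading-order behaviour for small normalized Doppler f_D (sketch):
\alpha^{*} \;=\; c_{1}\,\sqrt{f_{\mathrm{D}}} \;+\; O(f_{\mathrm{D}}),
\qquad
\Delta C \;=\; c_{2}\,\sqrt{f_{\mathrm{D}}} \;+\; O(f_{\mathrm{D}}),
% with constants c_1, c_2 > 0 depending on the operating SNR.
```

Per the multiantenna result quoted above, the same expressions would apply with $f_{\mathrm{D}}$ replaced by $N_{\mathrm{t}} f_{\mathrm{D}}$ for $N_{\mathrm{t}}$ transmit antennas.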
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected.
On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
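The local strain computation mentioned above (spatial derivatives of the reconstructed displacement field) can be sketched numerically. The following is a generic Green-Lagrange strain on a voxel grid, an assumed standard formulation rather than the authors' exact implementation:

```python
import numpy as np

def green_lagrange_strain(displacement, spacing=1.0):
    """Local strain tensor from a dense 3D displacement field u with
    shape [3, X, Y, Z]: deformation gradient F = I + grad(u), then
    E = 0.5 * (F^T F - I), evaluated per voxel by finite differences."""
    grad = np.stack([np.stack(np.gradient(displacement[i], spacing), 0)
                     for i in range(3)], 0)           # grad[i, j] = du_i/dx_j
    eye = np.eye(3)[:, :, None, None, None]
    F = grad + eye                                    # deformation gradient
    E = 0.5 * (np.einsum('ki...,kj...->ij...', F, F) - eye)
    return E                                          # shape [3, 3, X, Y, Z]
```

A zero displacement field yields zero strain, and a uniform 10% stretch along x yields the expected E_xx = 0.5 * (1.1^2 - 1) = 0.105 at every voxel.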
Abstract:
The GENCODE Consortium aims to identify all gene features in the human genome using a combination of computational analysis, manual annotation, and experimental validation. Since the first public release of this annotation data set, few new protein-coding loci have been added, yet the number of alternative splicing transcripts annotated has steadily increased. The GENCODE 7 release contains 20,687 protein-coding and 9640 long noncoding RNA loci and has 33,977 coding transcripts not represented in UCSC genes and RefSeq. It also has the most comprehensive annotation of long noncoding RNA (lncRNA) loci publicly available with the predominant transcript form consisting of two exons. We have examined the completeness of the transcript annotation and found that 35% of transcriptional start sites are supported by CAGE clusters and 62% of protein-coding genes have annotated polyA sites. Over one-third of GENCODE protein-coding genes are supported by peptide hits derived from mass spectrometry spectra submitted to Peptide Atlas. New models derived from the Illumina Body Map 2.0 RNA-seq data identify 3689 new loci not currently in GENCODE, of which 3127 consist of two exon models indicating that they are possibly unannotated long noncoding loci. GENCODE 7 is publicly available from gencodegenes.org and via the Ensembl and UCSC Genome Browsers.
Abstract:
In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene look very similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There still remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first one is solved here by a piecewise-affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact this theory allows us to reduce the number of parameters to be adjusted, and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition we propose here an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
Abstract:
Biological reference points are important tools for fisheries management. Reference points are not static, but may change when a population's environment or the population itself changes. Fisheries-induced evolution is one mechanism that can alter population characteristics, leading to "shifting" reference points by modifying the underlying biological processes or by changing the perception of a fishery system. The former causes changes in "true" reference points, whereas the latter is caused by changes in the yardsticks used to quantify a system's status. Unaccounted shifts of either kind imply that reference points gradually lose their intended meaning. This can lead to increased precaution, which is safe, but potentially costly. Shifts can also occur in more perilous directions, such that actual risks are greater than anticipated. Our qualitative analysis suggests that all commonly used reference points are susceptible to shifting through fisheries-induced evolution, including the limit and "precautionary" reference points for spawning-stock biomass, Blim and Bpa, and the target reference point for fishing mortality, F0.1. Our findings call for increased awareness of fisheries-induced changes and highlight the value of always basing reference points on adequately updated information, to capture all changes in the biological processes that drive fish population dynamics.
Abstract:
AIMS: Common carotid artery intima-media thickness (CCIMT) is widely used as a surrogate marker of atherosclerosis, given its predictive association with cardiovascular disease (CVD). The interpretation of CCIMT values has been hampered by the absence of reference values, however. We therefore aimed to establish reference intervals of CCIMT, obtained using what is probably the most accurate method at present (i.e. echotracking), to help interpretation of these measures. METHODS AND RESULTS: We combined CCIMT data obtained by echotracking on 24 871 individuals (53% men; age range 15-101 years) from 24 research centres worldwide. Individuals without CVD, cardiovascular risk factors (CV-RFs), and BP-, lipid-, and/or glucose-lowering medication constituted a healthy sub-population (n = 4234) used to establish sex-specific equations for percentiles of CCIMT across age. With these equations, we generated CCIMT Z-scores in different reference sub-populations, thereby allowing for a standardized comparison between observed and predicted ('normal') values from individuals of the same age and sex. In the sub-population without CVD and treatment (n = 14 609), and in men and women, respectively, CCIMT Z-scores were independently associated with systolic blood pressure [standardized βs 0.19 (95% CI: 0.16-0.22) and 0.18 (0.15-0.21)], smoking [0.25 (0.19-0.31) and 0.11 (0.04-0.18)], diabetes [0.19 (0.05-0.33) and 0.19 (0.02-0.36)], total-to-HDL cholesterol ratio [0.07 (0.04-0.10) and 0.05 (0.02-0.09)], and body mass index [0.14 (0.12-0.17) and 0.07 (0.04-0.10)]. CONCLUSION: We estimated age- and sex-specific percentiles of CCIMT in a healthy population and assessed the association of CV-RFs with CCIMT Z-scores, which enables comparison of IMT values for (patient) groups with different cardiovascular risk profiles, helping interpretation of such measures obtained both in research and clinical settings.
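The Z-score construction described above can be sketched as follows. The published percentile equations are not reproduced here; `predict_mean` and `predict_sd` are hypothetical stand-ins for the sex-specific equations in age:

```python
def ccimt_z_score(observed_imt, age, sex, predict_mean, predict_sd):
    """Standardized comparison of an observed CCIMT (mm) with the value
    predicted for a healthy individual of the same age and sex:
    Z = (observed - predicted mean) / predicted SD."""
    return (observed_imt - predict_mean(age, sex)) / predict_sd(age, sex)
```

A Z-score of 0 then means the observed IMT equals the healthy-population prediction, and positive values indicate thicker-than-expected walls in SD units.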
Abstract:
Continuous respiratory exchange measurements were performed on five women and five men for 1 h before and 6 h after the administration of a milkshake (53% carbohydrate, 30% lipid, and 17% protein energy) given either as a single bolus dose or continuously over 3 h using a nasogastric tube. The energy administered corresponded to 2.3 times the postabsorptive resting energy expenditure. Resting energy expenditure, respiratory quotient, plasma glucose, and insulin concentrations increased sooner and more steeply, and plasma free fatty acid levels decreased earlier, with the meal ingested as a single dose than with continuous administration. The magnitude of nutrient-induced thermogenesis was greater (P < 0.01) with the single dose (mean ± SE, 10.0 ± 0.6%) than with the continuous administration (8.1 ± 0.5%). The overall (6 h) substrate balances were not significantly different between the two modes of administration. It is concluded that the mode of enteral nutrient administration influences the immediate thermogenic response as well as changes in respiratory quotient, glycemia, and insulinemia; however, the overall nutrient balance was not affected by the mode of administration.
Abstract:
Financial markets play an important role in an economy, performing various functions like mobilizing and pooling savings, producing information about investment opportunities, screening and monitoring investments, implementing corporate governance, and diversifying and managing risk. These functions influence saving rates, investment decisions and technological innovation, and therefore have important implications for welfare. In my PhD dissertation I examine the interplay of financial and product markets by looking at different channels through which financial markets may influence an economy. My dissertation consists of four chapters. The first chapter is a co-authored work with Martin Strieborny, a PhD student from the University of Lausanne. The second chapter is a co-authored work with Melise Jaud, a PhD student from the Paris School of Economics. The third chapter is co-authored with both Melise Jaud and Martin Strieborny. The last chapter of my PhD dissertation is a single-author paper. Chapter 1 of my PhD thesis analyzes the effect of financial development on the growth of contract intensive industries. These industries intensively use intermediate inputs that can neither be sold on an organized exchange nor are reference-priced (Levchenko, 2007; Nunn, 2007). A typical example of a contract intensive industry would be one where an upstream supplier has to make investments in order to customize a product for the needs of a downstream buyer. After the investment is made and the product is adjusted, the buyer may refuse to meet a commitment and trigger ex post renegotiation. Since the product is customized to the buyer's needs, the supplier cannot sell the product to a different buyer at the original price. This is referred to in the literature as the holdup problem. As a consequence, individually rational suppliers will underinvest in relationship-specific assets, hurting the downstream firms with negative consequences for aggregate growth.
The standard way to mitigate the holdup problem is to write a binding contract and to rely on legal enforcement by the state. However, even the most effective contract enforcement might fail to protect the supplier in tough times when the buyer lacks a reliable source of external financing. This suggests a potential role for financial intermediaries, banks in particular, in mitigating the incomplete-contract problem. First, financial products like letters of credit and letters of guarantee can substantially decrease the risk and transaction costs of the parties. Second, a bank loan can serve as a signal about a buyer's true financial situation: an upstream firm will be more willing to undertake relationship-specific investment knowing that the business partner is creditworthy and will abstain from myopic behavior (Fama, 1985; von Thadden, 1995). Therefore, a well-developed financial (especially banking) system should disproportionately benefit contract intensive industries. The empirical test confirms this hypothesis. Indeed, contract intensive industries seem to grow faster in countries with a well-developed financial system. Furthermore, this effect comes from a more developed banking sector rather than from a deeper stock market. These results are reaffirmed by examining the effect of US bank deregulation on the growth of contract intensive industries in different states. Beyond an overall pro-growth effect, the bank deregulation seems to disproportionately benefit the industries requiring relationship-specific investments from their suppliers. Chapter 2 of my PhD focuses on the role of the financial sector in promoting exports of developing countries.
In particular, it investigates how credit constraints affect the ability of firms operating in the agri-food sectors of developing countries to keep exporting to foreign markets. Trade in high-value agri-food products from developing countries has expanded enormously over the last two decades, offering opportunities for development. However, trade in agri-food is governed by a growing array of standards. Sanitary and Phytosanitary standards (SPS) and technical regulations impose additional sunk, fixed and operating costs along the firms' export life. Such costs may be detrimental to firms' survival, "pricing out" producers that cannot comply. The existence of these costs suggests a potential role of credit constraints in shaping the duration of trade relationships in foreign markets. A well-developed financial system provides exporters with the funds necessary to adjust production processes in order to meet quality and quantity requirements in foreign markets and to maintain long-standing trade relationships. The products with higher needs for financing should benefit the most from a well-functioning financial system. This differential effect calls for a difference-in-differences approach initially proposed by Rajan and Zingales (1998). As a proxy for the demand for financing of agri-food products, the sanitary risk index developed by Jaud et al. (2009) is used. The empirical literature on standards and norms shows high costs of compliance, both variable and fixed, for high-value food products (Garcia-Martinez and Poole, 2004; Maskus et al., 2005). The sanitary risk index reflects the propensity of products to fail health and safety controls on the European Union (EU) market. Given the high costs of compliance, the sanitary risk index captures the demand for external financing to comply with such regulations. The prediction is empirically tested by examining the export survival of different agri-food products from firms operating in Ghana, Mali, Malawi, Senegal and Tanzania.
The results suggest that agri-food products that require more financing to keep up with the food safety regulations of the destination market indeed survive longer in foreign markets when they are exported from countries with better-developed financial markets. Chapter 3 analyzes the link between financial markets and the efficiency of resource allocation in an economy. Producing and exporting products inconsistent with a country's factor endowments constitutes a serious misallocation of funds, which undermines the competitiveness of the economy and inhibits its long-term growth. In this chapter, inefficient exporting patterns are analyzed through the lens of agency theories from the corporate finance literature. Managers may pursue projects with negative net present values because their perquisites or even their jobs might depend on them. Exporting activities are particularly prone to this problem. Business related to foreign markets involves both high levels of additional spending and strong incentives for managers to overinvest. Rational managers might have incentives to push for exports that use the country's scarce factors, which is suboptimal from a social point of view. Export subsidies might further skew the incentives towards inefficient exporting; management can divert the export subsidies into investments promoting inefficient exporting. The corporate finance literature stresses the disciplining role of outside debt in counteracting the internal pressures to divert such "free cash flow" into unprofitable investments. Managers can lose both their reputation and the control of "their" firm if unpaid external debt triggers a bankruptcy procedure. The threat of possible failure to meet debt service payments pushes managers toward an efficient use of available resources (Jensen, 1986; Stulz, 1990; Hart and Moore, 1995). In most countries, banks are the main source of debt financing.
The disciplining role of banks might be especially important in countries suffering from insufficient judicial quality. Banks, in pursuing their rights, rely on comparatively simple legal interventions that can be implemented even by mediocre courts. In addition to their disciplining role, banks can promote efficient exporting patterns in a more direct way by relaxing the credit constraints of producers: by screening, identifying and investing in the most profitable investment projects. Therefore, a well-developed domestic financial system, and in particular a well-developed banking system, would help to push a country's exports towards products congruent with its comparative advantage. This prediction is tested by looking at the survival of different product categories exported to the US market. Products are identified according to the Euclidean distance between their revealed factor intensity and the country's factor endowments. The results suggest that products suffering from a comparative disadvantage (labour-intensive products from capital-abundant countries) survive less long on the competitive US market. This pattern is stronger if the exporting country has a well-developed banking system. Thus, a strong banking sector promotes exports consistent with a country's comparative advantage. Chapter 4 of my PhD thesis further examines the role of financial markets in fostering efficient resource allocation in an economy. In particular, the allocative efficiency hypothesis is investigated in the context of equity market liberalization. Many empirical studies document a positive and significant effect of financial liberalization on growth (Levchenko et al., 2009; Quinn and Toyoda, 2009; Bekaert et al., 2005). However, the decrease in the cost of capital and the associated growth in investment appear rather modest in comparison to the large GDP growth effect (Bekaert and Harvey, 2005; Henry, 2000, 2003).
Therefore, financial liberalization may have a positive impact on growth through its effect on the allocation of funds across firms and sectors. Free access to international capital markets allows the largest and most profitable domestic firms to borrow funds in foreign markets (Rajan and Zingales, 2003). As domestic banks lose some of their best clients, they reoptimize their lending practices, seeking new clients among smaller and younger industrial firms. These firms are likely to be more risky than large and established companies. Screening of customers becomes prevalent as the return to screening rises. Banks, ceteris paribus, tend to focus on firms operating in comparative-advantage sectors because they are better risks. Firms in comparative-disadvantage sectors, finding it harder to finance their entry into or survival in export markets, either exit or refrain from entering export markets. On aggregate, one should therefore expect to see less entry, more exit, and shorter survival in export markets in those sectors after financial liberalization. The paper investigates the effect of financial liberalization on a country's export pattern by comparing the dynamics of entry and exit of different products in a country's export portfolio before and after financial liberalization. The results suggest that products that lie far from the country's comparative-advantage set tend to disappear relatively faster from the country's export portfolio following the liberalization of financial markets. In other words, financial liberalization tends to rebalance the composition of a country's export portfolio towards products that intensively use the economy's abundant factors.
Abstract:
In a series of three experiments, participants made inferences about which of a pair of objects scored higher on a criterion. The first experiment was designed to contrast the prediction of Probabilistic Mental Model theory (Gigerenzer, Hoffrage, & Kleinbölting, 1991) concerning sampling procedure with the hard-easy effect. The experiment failed to support the theory's prediction that a particular pair of randomly sampled item sets would differ in percentage correct; but the observation that German participants performed practically as well on comparisons between U.S. cities (many of which they did not even recognize) as on comparisons between German cities (about which they knew much more) ultimately led to the formulation of the recognition heuristic. Experiment 2 was a second, this time successful, attempt to unconfound item difficulty and sampling procedure. In Experiment 3, participants' knowledge and recognition of each city was elicited, and how often this could be used to make an inference was manipulated. Choices were consistent with the recognition heuristic in about 80% of the cases when it discriminated and people had no additional knowledge about the recognized city (and in about 90% when they had such knowledge). The frequency with which the heuristic could be used affected the percentage correct, mean confidence, and overconfidence as predicted. The size of the reference class, which was also manipulated, modified these effects in meaningful and theoretically important ways.
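The decision rule of the recognition heuristic, as described above, is simple enough to state in code. A sketch (names are illustrative):

```python
def recognition_heuristic(object_a, object_b, recognized):
    """Infer which object scores higher on the criterion: if exactly one
    of the two is recognized, choose it; if both or neither are
    recognized, the heuristic does not discriminate (returns None)."""
    a_known, b_known = object_a in recognized, object_b in recognized
    if a_known and not b_known:
        return object_a
    if b_known and not a_known:
        return object_b
    return None  # fall back on further knowledge or guessing
```

The "when it discriminated" condition in the experiments corresponds exactly to the cases where this function returns a non-None answer.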
Abstract:
To ensure that high-quality materials are used in concrete mixing, all materials delivered to the site should be inspected to ensure that they meet specification requirements. All materials should be delivered with the proper certifications, invoices, or bill of lading. These records should indicate when the shipment arrived, the amount and identification of material delivered, and the laboratory report certification number, invoice number, and ticket number.