937 results for STEPS
Abstract:
Ribosomal RNA (rRNA) contains a number of modified nucleosides in functionally important regions, including the intersubunit bridge regions. As the activity of ribosome recycling factor (RRF) in separating the large and small subunits of the ribosome involves disruption of intersubunit bridges, we investigated the impact of rRNA methylations on ribosome recycling. We show that deficiency of rRNA methylations, especially at positions 1518 and 1519 of 16S rRNA near the interface with the 50S subunit and in the vicinity of the IF3 binding site, adversely affects the efficiency of RRF-mediated ribosome recycling. In addition, we show that a compromise in RRF activity affords increased initiation with a mutant tRNA(fMet) wherein the three consecutive G-C base pairs ((29)GGG(31):(39)CCC(41)), a highly conserved feature of the initiator tRNAs, were mutated to those found in the elongator tRNA(Met) ((29)UCA(31):(39)ΨGA(41)). This observation has allowed us to uncover a new role for RRF as a factor that contributes to the fidelity of initiator tRNA selection on the ribosome. We discuss these and earlier findings to propose that RRF plays a crucial role during all steps of protein synthesis.
Abstract:
The National Road Safety Partnership Program (NRSPP) is an industry-led collaborative network which aims to support Australian businesses in developing a positive road safety culture. It aims to help businesses protect their employees and the public, not only during work hours but also when their staff are 'off-duty'. How do we engage and help an organisation minimise work-related vehicle crashes and their consequences, both internally and within the broader community? The first step is helping an organisation understand the true cost of its road incidents. Larger organisations often wear the costs without knowing the true impact on their bottom line; all they perceive is the change in insurance or vehicle repairs. Understanding the true cost should help mobilise a business's leadership to do more. The next step is ensuring the business undertakes an informed, structured, evidence-based pathway which will guide it around the costly pitfalls: a pathway based around the safe system approach, with buy-in at the top, which brings the workforce along. The final step, benchmarking, allows the organisation to measure and track its change. This symposium will explore the pathway steps for organisations using NRSPP resources to become engaged in road safety. The 'Total Cost of Risk' calculator, developed by Zurich, tested in Europe by Nestlé and modified by NRSPP for Australia, provides the first crucial step. The next step is a structured approach through the Workplace Road Safety Guide, using experts and industry to discuss the preferred safe system approach, which can then link into the national Benchmarking Project. The outputs from the symposium can help frame a pathway for organisations to follow through the NRSPP website.
Abstract:
Studies on the low temperature oxidation of polyolefins have been the subject of several investigations because of interest in understanding the aging and weathering of polymers. One of the key steps in such an oxidation is the formation of hydroperoxide. Estimation of the hydroperoxide in oxidized samples, which is conventionally done by iodometric titration, is quite important for gaining knowledge about the kinetics and mechanism of the process. The present investigation is the first report of the thermal analysis of polypropylene hydroperoxide samples from two angles: (1) the thermal behavior of its decomposition and (2) whether such an analysis leads to knowledge of the concentration of hydroperoxide in the sample.
Abstract:
We consider estimating the total load from frequent flow data but less frequent concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and the estimation of trends or determination of optimal sampling regimes impossible to assess. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Adding this information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. This method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One dataset is from the Burdekin River and consists of total suspended sediment (TSS), nitrogen oxide (NOx) and gauged flow for 1997. The other dataset is from the Tully River, for the period July 2000 to June 2008.
For NOx in the Burdekin, the new estimates are very similar to the ratio estimates, even when there is no relationship between concentration and flow. However, for the Tully dataset, incorporating the additional predictive variables, namely the discounted flow and the flow phase (rising or receding), substantially improved the model fit, and thus the certainty with which the load is estimated.
Abstract:
There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and the estimation of trends or determination of optimal sampling regimes impossible to assess. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates by minimizing the biases and making use of possible predictive variables. The load estimation procedure can be summarized by the following four steps:
(i) output the flow rates at regular time intervals (e.g. 10 minutes) using a time series model that captures all the peak flows;
(ii) output the predicted flow rates as in (i) at the concentration sampling times, if the corresponding flow rates are not collected;
(iii) establish a predictive model for the concentration data, which incorporates all possible predictor variables, and output the predicted concentrations at the regular time intervals as in (i); and
(iv) obtain the sum of all the products of the predicted flow and the predicted concentration over the regular time intervals to represent an estimate of the load.
The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized regression (rating-curve) approach with additional predictors that capture unique features in the flow data, namely the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and cumulative discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. The model also has the capacity to accommodate autocorrelation in model errors resulting from intensive sampling during floods.
Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. This method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach using concentrations of total suspended sediment (TSS) and nitrogen oxide (NOx), together with gauged flow data, from the Burdekin River, a catchment delivering to the Great Barrier Reef. The sampling biases for NOx concentrations range from a factor of 2 to 10, indicating severe bias. As expected, the traditional average and extrapolation methods produce much higher estimates than those obtained when the sampling bias is taken into account.
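As a rough illustration of steps (i)-(iv) and of the rating-curve idea shared by both abstracts, the sketch below fits a simple log-log rating curve to synthetic flow and concentration data and integrates the load. The data, model form and coefficients are hypothetical stand-ins, not the authors' model, which adds predictors such as discounted flow and hydrograph phase:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step (i): flow at regular intervals (a hypothetical series standing in
# for the time-series model output; units are illustrative).
t = np.arange(1000)
flow = 5 + 4 * np.abs(np.sin(t / 50.0)) + rng.gamma(1.0, 0.5, t.size)

# Step (ii): predicted flow at the (sparser) concentration sampling times.
sample_idx = np.sort(rng.choice(t.size, 60, replace=False))
flow_s = flow[sample_idx]

# Synthetic "observed" concentrations following a rating curve with noise.
conc_s = np.exp(0.5 + 0.7 * np.log(flow_s) + rng.normal(0, 0.2, flow_s.size))

# Step (iii): fit a rating curve log(C) = a + b*log(Q) by least squares.
# (Extra predictors such as discounted flow or a rising/falling-limb
# indicator would enter here as additional columns.)
X = np.column_stack([np.ones(flow_s.size), np.log(flow_s)])
coef, *_ = np.linalg.lstsq(X, np.log(conc_s), rcond=None)
conc_pred = np.exp(coef[0] + coef[1] * np.log(flow))

# Step (iv): load = sum of flow x predicted concentration over all intervals.
load = np.sum(flow * conc_pred)
print(f"fitted slope: {coef[1]:.2f}, estimated load: {load:.1f}")
```

In practice the regression would also carry a bias correction for back-transforming from the log scale, and the load would be scaled by the interval length and unit conversions; both are omitted here for brevity.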
Abstract:
The feasibility of different modern analytical techniques for the mass spectrometric detection of anabolic androgenic steroids (AAS) in human urine was examined in order to enhance the prevalent analytics and to find reasonable strategies for effective sports drug testing. A comparative study of the sensitivity and specificity of gas chromatography (GC) combined with low (LRMS) and high resolution mass spectrometry (HRMS) in screening of AAS was carried out with four metabolites of methandienone. Measurements were done in selected ion monitoring mode with HRMS using a mass resolution of 5000. With HRMS the detection limits were considerably lower than with LRMS, enabling detection of steroids at levels as low as 0.2-0.5 ng/ml. Even with HRMS, however, the biological background hampered the detection of some steroids. The applicability of liquid-phase microextraction (LPME) was studied with metabolites of fluoxymesterone, 4-chlorodehydromethyltestosterone, stanozolol and danazol. Factors affecting the extraction process were studied, and a novel LPME method with in-fiber silylation was developed and validated for GC/MS analysis of the danazol metabolite. The method allowed precise, selective and sensitive analysis of the metabolite and enabled simultaneous filtration, extraction, enrichment and derivatization of the analyte from urine without any other sample preparation steps. Liquid chromatographic/tandem mass spectrometric (LC/MS/MS) methods utilizing electrospray ionization (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) were developed and applied for detection of oxandrolone and metabolites of stanozolol and 4-chlorodehydromethyltestosterone in urine. All methods exhibited high sensitivity and specificity.
ESI showed the best applicability, however, and an LC/ESI-MS/MS method for routine screening of nine 17-alkyl-substituted AAS was thus developed, enabling fast and precise measurement of all analytes with detection limits below 2 ng/ml. The potential of chemometrics to resolve complex GC/MS data was demonstrated with samples prepared for AAS screening. Acquired full-scan spectral data (m/z 40-700) were processed by the OSCAR algorithm (Optimization by Stepwise Constraints of Alternating Regression). The deconvolution process was able to extract more than double the number of components from a GC/MS run compared with the number of visible chromatographic peaks. Severely overlapping components, as well as components hidden in the chromatographic background, could be isolated successfully. All the studied techniques proved to be useful analytical tools for improving detection of AAS in urine. The superiority of any one procedure is, however, compound-dependent, and the different techniques complement each other.
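The alternating-regression principle underlying OSCAR-style deconvolution can be sketched on synthetic bilinear GC/MS data, where the data matrix is approximately the product of elution profiles and spectra (D ≈ C·Sᵀ). This is a generic alternating-least-squares sketch with non-negativity clipping, not the OSCAR implementation itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: two severely overlapping chromatographic components.
t = np.linspace(0, 1, 80)
C_true = np.column_stack([np.exp(-((t - 0.45) / 0.08) ** 2),
                          np.exp(-((t - 0.55) / 0.08) ** 2)])  # elution profiles
S_true = rng.uniform(0, 1, (40, 2))                            # spectra (m/z x comp.)
D = C_true @ S_true.T + rng.normal(0, 0.01, (80, 40))          # data matrix

# Alternating regression: fix C and solve for S, then fix S and solve
# for C; clip negatives each pass (profiles/intensities are non-negative).
C = rng.uniform(0.1, 1.0, (80, 2))
for _ in range(200):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative residual: {residual:.3f}")
```

With a good fit, the recovered columns of C resolve the two overlapping peaks that appear as a single chromatographic peak in the raw total ion current.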
Abstract:
The paper presents two new algorithms for the direct parallel solution of systems of linear equations. The algorithms employ a novel recursive doubling technique to obtain solutions to an nth-order system in n steps with no more than 2n(n − 1) processors. Comparing their performance with the Gaussian elimination algorithm (GE), we show that they are almost 100% faster than the latter. This speedup is achieved by dispensing with all the computation involved in the back-substitution phase of GE. It is also shown that the new algorithms exhibit error characteristics superior to those of GE. An n(n + 1) systolic array structure is proposed for the implementation of the new algorithms. We show that complete solutions can be obtained, through these single-phase solution methods, in 5n − log₂n − 4 computational steps, without the need for intermediate I/O operations.
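Recursive doubling is easiest to see on a first-order linear recurrence, i.e. a lower-bidiagonal special case of the systems above: n sequential back-substitution steps collapse into about log₂n vectorized passes. The sketch below illustrates the technique only; it is not the paper's algorithm:

```python
import numpy as np

def solve_recurrence(a, b):
    """Solve the lower-bidiagonal system x[0] = b[0],
    x[i] = a[i]*x[i-1] + b[i], by recursive doubling:
    ceil(log2 n) vectorized passes instead of n sequential steps.
    Element i carries the affine map x -> A[i]*x + B[i]."""
    A = np.asarray(a, dtype=float).copy()
    B = np.asarray(b, dtype=float).copy()
    A[0] = 0.0                      # x[0] has no predecessor
    n, shift = len(A), 1
    while shift < n:
        A_old, B_old = A.copy(), B.copy()
        # Compose each map with the map `shift` positions earlier:
        # (A2 after A1)(x) = A2*(A1*x + B1) + B2.
        A[shift:] = A_old[shift:] * A_old[:-shift]
        B[shift:] = A_old[shift:] * B_old[:-shift] + B_old[shift:]
        shift *= 2
    return B                        # A is now all zeros; B holds x

# Check against the obvious sequential solve.
rng = np.random.default_rng(2)
a, b = rng.normal(size=100), rng.normal(size=100)
x_seq = np.empty(100)
x_seq[0] = b[0]
for i in range(1, 100):
    x_seq[i] = a[i] * x_seq[i - 1] + b[i]
print(np.allclose(solve_recurrence(a, b), x_seq))  # prints True
```

On a parallel machine each doubling pass is one step across all processors, which is where the logarithmic step counts in results like 5n − log₂n − 4 come from.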
Abstract:
The aims of this investigation were to enumerate coliforms in fresh mangoes, puree, cheeks, and cheeks-in-puree in order to determine the source of these organisms in the processed products, to determine methods for their control, and to identify coliforms isolated from cheeks-in-puree to determine whether they have any public health significance. Product from four processors was tested on two occasions. The retail packs of cheeks-in-puree having the highest coliform counts were those in which raw puree was added to the cheeks. Coliform counts in these samples ranged between 1.4 × 10³ and 5.4 × 10⁴ cfu/g. Pasteurisation reduced the coliform count of raw puree to < 5 cfu/g. Forty-seven percent of the 73 colonies, isolated as coliforms on the basis of their colony morphology on violet red bile agar, were identified as Klebsiella pneumoniae using the ATB 32E Identification System. Klebsiella strains were tested for growth at 10 °C, faecal coliform response, and fermentation of melezitose, to differentiate the three phenotypically similar species K. pneumoniae, K. terrigena and K. planticola. Results indicated that 41% of K. pneumoniae isolates gave reactions typical of K. pneumoniae. A further 44% of strains gave an atypical reaction pattern for these tests and were designated 'psychrotrophic' K. pneumoniae. Klebsiella pneumoniae counts of between 2.1 × 10³ and 4.9 × 10⁴ cfu/g were predicted to occur in the retail packs of mango cheeks-in-puree produced by the processors who constituted this product with raw puree. In view of the opportunistic pathogenic nature of K. pneumoniae, its presence in these products is considered undesirable, and steps, such as pasteurisation of the puree, should be taken to inactivate it.
Abstract:
The rapid development of recombinant DNA technology has brought forth a revolution in biology; it aids us to have a closer look at the way genes are organized, especially in the complex eucaryotic genomes. Although many animal and yeast genes have been studied in detail using recombinant DNA technology, plant genes have seldom been targets for such studies. Germination is an ideal process in which to study gene expression because it effects a shift in the metabolic status of seeds from a state of dormancy to an active one. An understanding of gene organization and regulation during germination can be accomplished by molecular cloning of DNA from seeds like rice. To study the status of histone, rRNA, tRNA and other genes in the rice genome, a general method was developed to clone eucaryotic DNA in the plasmid vector pBR322. This essentially involves the following steps. The rice embryo and plasmid pBR322 DNAs were cut with the restriction endonuclease BamHI to generate sticky ends. The plasmid DNA was phosphatased, and the DNAs were annealed and joined by T4 phage DNA ligase. The recombinant DNA molecules thus produced were transferred into E. coli, and colonies containing them were selected by their sensitivity to tetracycline and resistance to ampicillin. Two clones were identified as having tRNA genes by hybridization of the DNA in the clones with ³²P-labelled rice tRNAs.
Abstract:
Background: From conservative estimates of registrants with the National Diabetes Services Scheme, the number of Australians affected by all types of diabetes will soon pass 1.1 million. The diabetes complications of foot ulceration and amputation are costly to all. These costs can be reduced with appropriate prevention strategies, starting with identifying people at risk through primary care diabetic foot screening. Yet levels of diabetic foot screening in Australia are difficult to quantify. This presentation aims to report on foot screening rates as recorded in existing academic literature, national health surveys and national database reports. Methods: Literature searches covered diabetic foot screening in the primary care setting for populations over 2,000 people from 2002 to 2014. Searches were performed using Medline and CINAHL, as well as internet searches of the health databases of Organisation for Economic Co-operation and Development (OECD) countries. The focus is on type 1 and type 2 diabetes in adults, not gestational diabetes or children. The two primary outcome measures were foot screening rates as a percentage of the adult diabetic population and major lower limb amputation incidence rates from standardised OECD data. Results: The most recent and accurate Australian population-level review was the AusDiab (Australian Diabetes and Lifestyle) survey from 2004, which reported screening in primary care to be as low as 50%. Countries such as the United Kingdom and the United States of America have much higher reported rates of foot screening (67-86%), recorded using national databases and web-based initiatives that involve patients and clinicians. By comparison, the major amputation rate for Australia was similar to that of the United Kingdom (6.5 versus 5.1 per 100,000 population) but dissimilar to that of the United States of America (17 per 100,000 population). Conclusions: Australian rates of diabetic foot screening in primary care centres are ambiguous.
There is no direct relationship between foot screening levels in a primary care environment and major lower limb amputation, based on national health surveys and OECD data. Uptake of national registers, incentives and web-based systems improves levels of diabetic foot assessment; these are the first steps to a healthier diabetic population.
Abstract:
The intent of this study was to design, document and implement a Quality Management System (QMS) in a laboratory that incorporated both research and development (R&D) and routine analytical activities. In addition, it was necessary for the QMS to be easily and efficiently maintained to: (a) provide documented evidence that would validate the system's compliance with a certifiable standard, (b) fit the purpose of the laboratory, (c) accommodate prevailing government policies and standards, and (d) promote positive outcomes for the laboratory through documentation and verification of the procedures and methodologies implemented. Initially, a matrix was developed that documented the standard's requirements and the steps necessary to meet those requirements. The matrix provided a check mechanism on the progression of the system's development. It was later also utilised in the Quality Manual as a reference tool for locating the full procedures documented elsewhere in the system. The documentation needed to build and monitor the system consisted of a series of manuals, along with forms that provided auditable evidence of the workings of the QMS. Quality Management (QM), in one form or another, has been in existence since the early 1900s. However, the question still remains: is it a good thing or just a bugbear? Many of the older style systems failed because they were designed by non-users and were fiercely regulatory, restrictive and generally deemed an imposition. It is now considered important to foster a sense of ownership of the system among the people who use it. The system's design must be tailored to best fit the purpose of the facility's operations if maximum benefits to the organisation are to be gained.
Abstract:
Weighing lysimeters are the standard method for directly measuring evapotranspiration (ET). This paper discusses the construction, installation, and performance of two repacked weighing lysimeters (1.52 m × 1.52 m × 2.13 m deep) for measuring ET of corn and soybean in West Central Nebraska. The cost of constructing and installing each lysimeter was approximately US $12,500, which could vary depending on the availability and cost of equipment and labor. The resolution of the lysimeters was 0.0001 mV V⁻¹, which was limited by the data processing and storage resolution of the datalogger. This resolution was equivalent to 0.064 and 0.078 mm of ET for the north and south lysimeters, respectively. Since the percent measurement error decreases with the magnitude of the ET measured, this resolution is adequate for measuring ET over daily and longer periods, but not for shorter time steps. It would result in measurement errors of less than 5% for ET values of ≥ 3 mm, but the percent error increases rapidly for lower ET values. The resolution of the lysimeters could potentially be improved by choosing a datalogger that processes and stores data at a higher resolution than the one used in this study.
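The resolution-to-error relationship described above can be checked directly: treating one datalogger count as the worst-case uncertainty, the percent error is simply resolution divided by the ET depth. This is a simplified error model; the paper's actual error budget may differ:

```python
# Per-count ET resolution of each lysimeter, in mm, from the text.
resolutions = {"north": 0.064, "south": 0.078}

def percent_error(resolution_mm, et_mm):
    """Worst-case percent error when ET is resolved to one count."""
    return 100.0 * resolution_mm / et_mm

for name, res in resolutions.items():
    for et in (1.0, 3.0, 6.0):
        print(f"{name} lysimeter, ET = {et} mm: "
              f"{percent_error(res, et):.1f}% error")
```

At ET = 3 mm both lysimeters stay under 5% error (0.078/3 ≈ 2.6%), consistent with the abstract's claim, while at 1 mm the south lysimeter's error rises to about 7.8%.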
Abstract:
Background: Previous studies (mostly questionnaire-based, in children) suggest that outdoor activity is protective against myopia. There are few studies on young adults investigating the impact of simply being outdoors versus performing physical activity. The aim was to study the relationship between the refractive error of young adults and their physical activity patterns. Methods: Twenty-seven university students, aged 18 to 25 years, wore a pedometer (Omron HJ720ITE) for seven days during both the semester and holiday periods. They simultaneously recorded in a logbook the type of activity performed, its duration, the number of steps taken (from the pedometer) and their location (indoors/outdoors). Mean spherical refractive error was used to divide participants into three groups (emmetropes: +1.00 to -0.50 D; low myopes: -0.62 to -3.00 D; higher myopes: -3.12 D or greater myopia). Results: There were no significant differences between the refractive groups during the semester or holiday periods; the average daily times spent outdoors, the duration of physical activity, the ratio of physical activity performed outdoors to indoors and the amount of near work performed were similar. The peak exercise intensity was similar across all groups: approximately 100 steps per minute, a brisk walk. Up to one-third of all physical activity was performed outdoors. There were some significant differences between activities performed during semester and holiday times. For example, low myopes spent significantly less time outside (49 ± 47 versus 74 ± 41 minutes, p = 0.005) and performed less physical activity (6,388 ± 1,747 versus 6,779 ± 2,746 steps per day; p = 0.03) during the holidays compared to during semester. Conclusions: The fact that all groups had similar low exercise intensity but many were not myopic suggests that physical activity levels are not critical. There were differences in the activity patterns of low myopes during semester and holiday periods.
This study highlights the need for a larger longitudinal-based study with particular emphasis on how discretionary time is spent.
Abstract:
In an effort to develop a fully computerized approach to the structural synthesis of kinematic chains, the steps involved in the method of structural synthesis based on transformation of binary chains [38] have been recast in a format suitable for implementation on a digital computer. The methodology thus evolved has been combined with the algebraic procedures for structural analysis [44] to develop a unified computer program for structural synthesis and analysis of simple jointed kinematic chains with zero degrees of freedom. Applications of this program are presented in the succeeding parts of the paper.
Abstract:
Maximum intensity contrast has been used as a measure of lens defocus. A photodiode array under the control of an 8085 microprocessor is used to measure the maximum intensity contrast and to position the lens for best focus. The lens is moved by a stepper motor under processor control at a speed of 350 to 500 steps/s. At this speed, the focusing time was found to be between 5 and 8 s. Under coherent illumination, an accuracy of ±50 μm has been achieved.
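The focusing loop described, stepping the lens while monitoring contrast from the diode array, can be sketched as a simple hill climb. The hardware interface and the contrast profile are simulated, and all names are hypothetical stand-ins for the 8085-controlled system:

```python
def max_contrast(samples):
    """(Imax - Imin) / (Imax + Imin) over one photodiode-array scan."""
    hi, lo = max(samples), min(samples)
    return (hi - lo) / (hi + lo) if (hi + lo) else 0.0

class SimulatedLens:
    """Toy stand-in for the stepper-driven lens: contrast peaks at
    position 0 and falls off with defocus."""
    def __init__(self, position):
        self.position = position
    def step(self, n):
        self.position += n            # n steps; sign gives direction
    def read_array(self):
        peak = 1.0 / (1.0 + 0.001 * self.position ** 2)
        return [100 + 50 * peak, 100 - 50 * peak]  # two representative diodes

def autofocus(lens, max_steps=2000):
    """Hill-climb on contrast: keep stepping while contrast improves,
    reverse when it drops, stop after the second reversal."""
    direction, reversals = 1, 0
    best = max_contrast(lens.read_array())
    for _ in range(max_steps):
        lens.step(direction)
        c = max_contrast(lens.read_array())
        if c >= best:
            best = c
        else:
            lens.step(-direction)     # undo the worsening step
            direction = -direction
            reversals += 1
            if reversals == 2:
                break
    return lens.position

lens = SimulatedLens(position=120)
print(autofocus(lens))  # prints 0, the simulated best-focus position
```

A real implementation would debounce the contrast reading and use a coarse-then-fine step size, but the stop-on-peak logic is the same.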