998 results for data standardization
Abstract:
Genome-scale metabolic models are valuable tools in the metabolic engineering process because of their ability to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
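As a hedged illustration of what such a reconstructed model enables, the sketch below uses the open-source COBRApy library to run flux balance analysis and a naive gene-essentiality scan. The SBML file name and the 5% growth threshold are assumptions for illustration; this is not the chapter's ten-step protocol itself.

```python
# A minimal sketch (not the authors' pipeline): flux balance analysis and a
# naive gene-essentiality scan with COBRApy on an already-reconstructed model.
from cobra.io import read_sbml_model

model = read_sbml_model("draft_reconstruction.xml")  # hypothetical draft model

# Predict growth under the default medium with flux balance analysis.
wild_type = model.optimize().objective_value

# Flag genes whose knockout drops predicted growth below 5% of wild type.
essential = []
for gene in model.genes:
    with model:                      # changes are reverted when the block exits
        gene.knock_out()
        growth = model.slim_optimize(error_value=0.0)
    if growth < 0.05 * wild_type:
        essential.append(gene.id)

print(f"{len(essential)} candidate essential genes")
```

Predictions of this kind are typically compared against experimental gene-essentiality and culture-condition data during the model curation step.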
Abstract:
Trees are a great bank of data, sometimes called for this reason the "silent witnesses" of the past. Because ring formation is annual and is normally influenced directly by climate parameters (generally changes in temperature and moisture or precipitation) and other environmental factors, past changes are "written" in the tree "archives" and can be "decoded" to interpret what happened before, mainly for past climate reconstruction. Using dendrochronological methods to obtain samples of Pinus nigra from the Catalan Pre-Pyrenees region, cores of 15 trees spanning about 100-250 years were analyzed for tree-ring width (TRW) patterns; the series showed quite high correlation among themselves (0.71-0.84), indicating a common response of annual growth to environmental changes. After different trials at standardizing the raw TRW data to remove the negative exponential growth-curve dependency, double detrending (a power transformation followed by a 32-year smoothing line) was selected as the best method for obtaining the indexes used in further analysis. Analyzing the cross-correlations between the resulting tree-ring width indexes and climate data, significant correlations (p<0.05) were observed at some lags; for example, annual precipitation at lag -1 (previous year) was negatively correlated with TRW growth in the Pallars region. Significant correlation coefficients were between 0.27 and 0.51 (with positive or negative signs) in many cases; for the recent (but very short period) climate data of the Seu d'Urgell meteorological station, some significant correlation coefficients of the order of 0.9 were observed. These results support the use of dendrochronological data as a climate signal for further analysis, such as reconstruction of past climate or prediction for the future at the same locality.
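The detrending and lag-correlation steps described above can be sketched in a few lines. The following is only an illustration on synthetic data: the power-transformation exponent is an assumption, and a simple moving average stands in for the 32-year smoothing line.

```python
# Illustrative double detrending of a tree-ring width (TRW) series and a
# lag -1 cross-correlation with a climate series; all data are synthetic.
import numpy as np
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
trw = np.exp(-0.01 * (years - 1900)) * (1.0 + 0.2 * rng.standard_normal(years.size)) + 0.1

# 1) Power transformation to stabilize variance (exponent is an assumption).
trw_pt = trw ** 0.5

# 2) Remove the low-frequency growth trend with a ~32-year moving average
#    (a stand-in for the 32-year smoothing curve mentioned above).
trend = uniform_filter1d(trw_pt, size=32, mode="nearest")
index = trw_pt / trend                 # dimensionless ring-width index

# Cross-correlation with a climate series at lag -1 (previous year's value).
precip = rng.standard_normal(years.size)
r = np.corrcoef(index[1:], precip[:-1])[0, 1]
print(f"correlation with previous-year precipitation: {r:.2f}")
```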
Abstract:
This report describes the results of a research project investigating the use of advanced field data acquisition technologies for Iowa transportation agencies. The objectives of the research project were to (1) research and evaluate current data acquisition technologies for field data collection, manipulation, and reporting; (2) identify the current field data collection approach and the interest level in applying current technologies within Iowa transportation agencies; and (3) summarize findings, prioritize technology needs, and provide recommendations regarding suitable applications for future development. A steering committee consisting of state, city, and county transportation officials provided guidance during this project. Technologies considered in this study included (1) data storage (bar coding, radio frequency identification, touch buttons, magnetic stripes, and video logging); (2) data recognition (voice recognition and optical character recognition); (3) field referencing systems (global positioning systems [GPS] and geographic information systems [GIS]); (4) data transmission (radio frequency data communications and electronic data interchange); and (5) portable computers (pen-based computers). The literature review revealed that many of these technologies could have useful applications in the transportation industry. A survey was developed to determine current data collection methods and identify the interest in using advanced field data collection technologies. Surveys were sent to county and city engineers and to state representatives responsible for certain programs (e.g., maintenance management and construction management). Results showed that almost all field data are collected using manual approaches and are hand-carried to the office, where they are either entered into a computer or manually stored. A lack of standardization was apparent in the type of software applications used by each agency; even the types of forms used to manually collect data differed by agency. Furthermore, interest in using advanced field data collection technologies depended upon the technology, the program (e.g., pavement or sign management), and the agency type (e.g., state, city, or county). The state and larger cities and counties seemed to be interested in using several of the technologies, whereas smaller agencies appeared to have very little interest in using advanced techniques to capture data. A more thorough analysis of the survey results is provided in the report. Recommendations are made to enhance the use of advanced field data acquisition technologies in Iowa transportation agencies: (1) Appoint a statewide task group to coordinate the effort to automate field data collection and reporting within the Iowa transportation agencies. Subgroups representing the cities, counties, and state should be formed, with oversight provided by the statewide task group. (2) Educate employees so that they become familiar with the various field data acquisition technologies.
Abstract:
This thesis describes the creation of a pipework data structure for design system integration. The work was completed at a pulp and paper plant delivery company with global engineering network operations in mind. A use case covering the path from process design to 3D pipework design is introduced, including the influence of subcontracting engineering offices. A company data element list was gathered through key-person interviews, and the results were processed into a pipework data element list. Inter-company co-operation took place within a standardization association, and a common standard for pipework data elements was found. As a result, an inter-company pipework data element list is introduced. Further usage of the list, its development, and its relations to design software vendors are evaluated.
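As a purely hypothetical sketch of what one standardized pipework data element record might look like when exchanged between design systems, consider the following; the field names are invented for illustration and are not the element list produced in the thesis or by the standardization association.

```python
# Hypothetical pipework data element record for exchange between a process
# design tool and a 3D pipework design tool; field names are illustrative.
from dataclasses import dataclass, asdict

@dataclass
class PipeworkDataElement:
    line_id: str            # pipeline identification code
    nominal_size_dn: int    # nominal diameter, DN
    pressure_class_pn: int  # pressure class, PN
    material: str
    insulation: str
    design_temperature_c: float
    source_system: str      # e.g. "process design" or "3D pipework design"

element = PipeworkDataElement("P-1001", 100, 16, "EN 1.4404", "mineral wool",
                              180.0, "process design")
print(asdict(element))      # flat dict ready for transfer between systems
```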
Abstract:
This study aims to standardize the pre-incubation and incubation pH and temperature used in the metachromatic staining method of myofibrillar ATPase activity of myosin (mATPase) for donkeys and mules. Twenty-four donkeys and 10 mules, seven females and three males, were used in the study. From each animal, fragments of the Gluteus medius muscle were collected by percutaneous muscle biopsy using a 6.0-mm Bergström-type needle. In addition to the metachromatic staining method of mATPase, the nicotinamide adenine dinucleotide tetrazolium reductase (NADH-TR) technique was also performed to confirm the histochemical data. The histochemical result of mATPase for acidic pre-incubation (pH=4.50) and alkaline incubation (pH=10.50) at a temperature of 37°C yielded the best differentiation of fibers stained with toluidine blue. Muscle fibers were identified according to the following colors: type I (oxidative, light blue), type IIA (oxidative-glycolytic, intermediate blue) and type IIX (glycolytic, dark blue). There are no reports in the literature regarding the characterization and distribution of the different types of muscle fibers used by donkeys and mules when performing traction work, cargo transportation, endurance sports (horseback riding) and marching competitions. Therefore, this study is the first report on the standardization of the mATPase technique for donkeys and mules.
Abstract:
The objectives of the present study were 1) to compare results obtained by the traditional manual method of measuring heart rate (HR) and heart rate response (HRR) to the Valsalva maneuver, standing and deep breathing, with those obtained using a computerized data analysis system attached to a standard electrocardiograph machine; 2) to standardize the responses of healthy subjects to cardiovascular tests; and 3) to evaluate the response to these tests in a group of patients with diabetes mellitus (DM). In all subjects (97 healthy and 143 with DM), we evaluated HRR to deep breathing, HRR to standing, HRR to the Valsalva maneuver, and blood pressure response (BPR) to standing up and to a sustained handgrip. Since there was a strong positive correlation between the results obtained with the computerized method and the traditional method, we conclude that the new method can replace the traditional manual method for evaluating cardiovascular responses, with the advantages of speed and objectivity. HRR and BPR of men and women did not differ. A correlation between age and HRR was observed for standing (r = -0.48, P<0.001) and deep breathing (r = -0.41, P<0.002). Abnormal BPR to standing was usually observed only in diabetic patients with definite and severe degrees of autonomic neuropathy.
Abstract:
Affiliation: Faculté de médecine, Université de Montréal & CANVAC
Abstract:
Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters, and the potential for their use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into disaster management processes. The type and value of information shared should be assessed, determining the benefits and issues, with credibility and reliability as known concerns. Mapping the tweets in relation to the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of tweets' content, language and tone during the UK Storms and Floods 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classified into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when observing the analysed tweets against the timeline, which illustrates a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted in each month with the timeline suggests that users tweet more as an event heightens and persists; furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that the thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management. It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just the response phase, to potentially improve future policies and activities.
Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy. This demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, which is exacerbated by the difficulties of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS and the coverage of forty recommendations made by the UK Information Commissioner's Office. Analysis showed that whilst similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses but convey similar information. The analysis also showed that the low similarity in clauses was largely due to differences in semantics, elaboration and functionality between SNS. This paper therefore proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and makes five recommendations to begin facilitating it, based on the findings of the investigation.
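Since the comparison rests on the Jaccard similarity coefficient, a minimal sketch of that measure applied to sets of policy attributes may help; the clause labels below are invented for illustration and are not the attributes actually coded in the study.

```python
# Jaccard similarity between two sets of policy attributes: |A ∩ B| / |A ∪ B|.
def jaccard(a: set, b: set) -> float:
    """Return the Jaccard coefficient; 0.0 if both sets are empty."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

policy_a = {"data collection", "third-party sharing", "cookies", "retention"}
policy_b = {"data collection", "cookies", "advertising", "retention", "children"}

print(f"Jaccard similarity: {jaccard(policy_a, policy_b):.2f}")  # 3 shared / 6 total = 0.50
```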
Abstract:
Research question – The research question that this study attempts to answer is what grocery retailers that specifically pursue a standardization strategy adapt in their marketing mix to the host market, and why. The main adaptations are analyzed with regard to psychic distance in terms of consumer characteristics. Methodology – This study presents a qualitative research design. Secondary data, in-depth interviews and personal observations were used in order to identify adaptations made by a grocery retailer in Germany, its home market, and in Sweden, which is considered a host market. Findings – The main findings of this research indicate that grocery retailers that specifically pursue a standardization strategy apply their core strategy in the host market in order to keep their economies of scale. However, the standardization strategy may cause negative financial results, which is why adaptations aimed at attracting new customers are required. In conclusion, a mix of both the adaptation and the standardization marketing strategies has to be utilized.
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces, but this work has not considered procedures and methods for designing river geospatial data. As a result, river geospatial data has its own model, and when the data are shared through open interfaces among heterogeneous GIS databases, differences between models result in the loss of information. In this study, a plan was suggested both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and by improving and redesigning the database. Primary and foreign keys, which distinguish attribute information and entity linkages, were redefined to increase usability. The construction of attribute-information tables and the entity-relationship diagram were redefined to redesign the linkages among tables from the perspective of a river standard database. In addition, this study was undertaken to expand the current supplier-oriented operating system into a demand-oriented operating system by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
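To make the role of the redefined primary and foreign keys concrete, the following is a hypothetical sketch of linked river tables in SQLite; the table and column names, and the sample values, are illustrative and are not the redesigned standard schema itself.

```python
# Hypothetical river tables linked by primary/foreign keys, illustrating how
# entity linkages can be preserved when data are exchanged between databases.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
CREATE TABLE river (
    river_id   TEXT PRIMARY KEY,      -- stable identifier shared across GIS DBs
    name       TEXT NOT NULL
);
CREATE TABLE river_reach (
    reach_id   TEXT PRIMARY KEY,
    river_id   TEXT NOT NULL REFERENCES river(river_id),
    length_m   REAL
);
CREATE TABLE gauge_station (
    station_id TEXT PRIMARY KEY,
    reach_id   TEXT NOT NULL REFERENCES river_reach(reach_id),
    chainage_m REAL                    -- position along the reach
);
""")
conn.execute("INSERT INTO river VALUES ('R001', 'Sample River')")
conn.execute("INSERT INTO river_reach VALUES ('R001-01', 'R001', 12500.0)")
conn.execute("INSERT INTO gauge_station VALUES ('G-42', 'R001-01', 3100.0)")
print(conn.execute("SELECT COUNT(*) FROM gauge_station").fetchone()[0])
```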
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Absolute quantitation of clinical ¹H-MR spectra is virtually always incomplete for single subjects because the separate determination of spectrum, baseline, and transverse and longitudinal relaxation times in single subjects is prohibitively long. Integrated Processing and Acquisition of Data (IPAD) based on a combined 2-dimensional experimental and fitting strategy is suggested to substantially improve the information content from a given measurement time. A series of localized saturation-recovery spectra was recorded and combined with 2-dimensional prior-knowledge fitting to simultaneously determine metabolite T1 (from analysis of the saturation-recovery time course), metabolite T2 (from lineshape analysis based on metabolite and water peak shapes), macromolecular baseline (based on T1 differences and analysis of the saturation-recovery time course), and metabolite concentrations (using prior knowledge fitting and conventional procedures of absolute standardization). The procedure was tested on metabolite solutions and applied in 25 subjects (15-78 years old). Metabolite content was comparable to previously found values. Interindividual variation was larger than intraindividual variation in repeated spectra for metabolite content as well as for some relaxation times. Relaxation times were different for various metabolite groups. Parts of the interindividual variation could be explained by significant age dependence of relaxation times.
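As a rough sketch of the saturation-recovery analysis underlying the T1 estimate, the code below fits a mono-exponential recovery model to synthetic data. It is not the IPAD implementation, and the recovery delays, true T1 and noise level are assumptions made only for illustration.

```python
# Fit T1 from a saturation-recovery series: S(t) = S0 * (1 - exp(-t / T1)).
import numpy as np
from scipy.optimize import curve_fit

def sat_recovery(t, s0, t1):
    """Signal after saturation followed by a recovery delay t."""
    return s0 * (1.0 - np.exp(-t / t1))

delays = np.array([0.1, 0.3, 0.6, 1.0, 2.0, 4.0, 8.0])   # seconds, illustrative
rng = np.random.default_rng(1)
signal = sat_recovery(delays, 1.0, 1.4) + 0.01 * rng.standard_normal(delays.size)

popt, _ = curve_fit(sat_recovery, delays, signal, p0=[1.0, 1.0])
print(f"fitted S0 = {popt[0]:.2f}, T1 = {popt[1]:.2f} s")
```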
Abstract:
OBJECTIVES Although the use of an adjudication committee (AC) for outcomes is recommended in randomized controlled trials, there are limited data on the process of adjudication. We therefore aimed to assess whether the reporting of the adjudication process in venous thromboembolism (VTE) trials meets existing quality standards and which characteristics of trials influence the use of an AC. STUDY DESIGN AND SETTING We systematically searched MEDLINE and the Cochrane Library from January 1, 2003, to June 1, 2012, for randomized controlled trials on VTE. We abstracted information about the characteristics and quality of trials and the reporting of adjudication processes. We used a stepwise backward logistic regression model to identify trial characteristics independently associated with the use of an AC. RESULTS We included 161 trials. Of these, 68.9% (111 of 161) reported the use of an AC. Overall, 99.1% (110 of 111) of trials with an AC used independent or blinded ACs, 14.4% (16 of 111) reported how the adjudication decision was reached within the AC, and 4.5% (5 of 111) reported whether the reliability of adjudication was assessed. In multivariate analyses, multicenter trials [odds ratio (OR), 8.6; 95% confidence interval (CI): 2.7, 27.8], use of a data safety monitoring board (OR, 3.7; 95% CI: 1.2, 11.6), and VTE as the primary outcome (OR, 5.7; 95% CI: 1.7, 19.4) were associated with the use of an AC. Trials without random allocation concealment (OR, 0.3; 95% CI: 0.1, 0.8) and open-label trials (OR, 0.3; 95% CI: 0.1, 1.0) were less likely to report an AC. CONCLUSION Recommended processes of adjudication are underreported and lack standardization in VTE-related clinical trials. The use of an AC varies substantially by trial characteristics.
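A minimal sketch of backward stepwise selection on a logistic model, in the spirit of the analysis described, is shown below. The simulated data frame, the predictors, and the 0.10 p-value threshold are invented for illustration; the fitted values carry no clinical meaning.

```python
# Backward stepwise logistic regression on simulated trial characteristics.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 161
df = pd.DataFrame({
    "multicenter": rng.integers(0, 2, n),
    "dsmb":        rng.integers(0, 2, n),
    "vte_primary": rng.integers(0, 2, n),
    "open_label":  rng.integers(0, 2, n),
})
# Simulate whether each trial reports an adjudication committee (AC).
logit = 0.8 * df["multicenter"] + 0.5 * df["dsmb"] - 0.6 * df["open_label"] - 0.2
df["uses_ac"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# Backward elimination: drop the least significant predictor until all
# remaining p-values fall at or below the (arbitrary) 0.10 threshold.
predictors = ["multicenter", "dsmb", "vte_primary", "open_label"]
model = None
while predictors:
    model = sm.Logit(df["uses_ac"], sm.add_constant(df[predictors])).fit(disp=0)
    pvals = model.pvalues.drop("const")
    if pvals.max() <= 0.10:
        break
    predictors.remove(pvals.idxmax())

print("retained predictors:", predictors)
print(np.exp(model.params.drop("const")))  # odds ratios for the final model
```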