849 results for Integration of methods


Relevance: 90.00%

Abstract:

This study investigates the production of alginate microcapsules coated with the polysaccharide chitosan, and evaluates some of their properties with the intention of improving the gastrointestinal viability of a probiotic (Bifidobacterium breve) by encapsulation in this system. The microcapsules were dried by a variety of methods, and the most suitable was chosen. The work described in this article is the first report detailing the effects of drying on the properties of these microcapsules and on the viability of the bacteria within them, relative to wet microcapsules. The pH range over which chitosan and alginate form polyelectrolyte complexes was explored by spectrophotometry, and this was extended by swelling studies on the microcapsules over a range of pHs associated with the gastrointestinal tract. It was shown that chitosan stabilizes the alginate microcapsules at pHs above 3, extending the stability of the capsules under these conditions. The effect of chitosan exposure time on the coating thickness was investigated for the first time by confocal laser scanning microscopy, and penetration of chitosan into the alginate matrix was shown to be particularly slow. Coating with chitosan was found to increase the survival of B. breve in simulated gastric fluid as well as to prolong its release upon exposure to intestinal pH.

Relevance: 90.00%

Abstract:

In this article we describe recent progress on the design, analysis and implementation of hybrid numerical-asymptotic boundary integral methods for boundary value problems for the Helmholtz equation that model time-harmonic acoustic wave scattering in domains exterior to impenetrable obstacles. These hybrid methods combine conventional piecewise polynomial approximations with high-frequency asymptotics to build basis functions suitable for representing the oscillatory solutions. They have the potential to solve scattering problems accurately in a computation time that is (almost) independent of frequency, and this has been realized for many model problems. The design and analysis of this class of methods requires new results on the analysis and numerical analysis of highly oscillatory boundary integral operators and on the high-frequency asymptotics of scattering problems. The implementation requires the development of appropriate quadrature rules for highly oscillatory integrals. This article contains a historical account of the development of this currently very active field, a detailed account of recent progress and, in addition, a number of original research results on the design, analysis and implementation of these methods.
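The core idea can be written schematically; the ansatz below is the generic form of a hybrid numerical-asymptotic approximation (the notation is chosen here for illustration, not quoted from the article):

```latex
% Hybrid numerical-asymptotic ansatz for the unknown boundary data v on
% the obstacle boundary \Gamma: the phase functions \psi_m are supplied
% by high-frequency asymptotics (e.g. geometrical optics), so that only
% the slowly varying amplitudes V_m need be approximated by piecewise
% polynomials, on a space whose dimension is almost independent of the
% wavenumber k.
v(x) \;\approx\; \sum_{m=1}^{M} V_m(x)\, e^{\mathrm{i} k \psi_m(x)},
\qquad x \in \Gamma .
```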

Relevance: 90.00%

Abstract:

Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programs available for interpreting bacterial transcriptomics data, and CGH microarray data for examining genetic stability in oncogenes, there are none specifically designed to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from a test strain against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture modelling technique. We have also shown that methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
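The kernel-density-estimator cut-off step can be sketched as follows. This is a minimal illustration of the general idea only; the function name, bandwidth rule, and valley-finding heuristic are assumptions for illustration, not the authors' code:

```python
import numpy as np

def presence_absence_cutoff(log_ratios, bandwidth=None, grid_size=512):
    """Place the present/absent cut-off at the density minimum between
    the two main modes of a Gaussian kernel density estimate fitted to
    the per-gene log signal ratios."""
    x = np.asarray(log_ratios, dtype=float)
    # Silverman's rule of thumb if no bandwidth is supplied
    h = bandwidth or 1.06 * x.std() * len(x) ** -0.2
    grid = np.linspace(x.min(), x.max(), grid_size)
    # Gaussian KDE evaluated on the grid (vectorised over all data points)
    dens = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    dens /= len(x) * h * np.sqrt(2.0 * np.pi)
    # indices of local maxima of the estimated density
    peaks = [i for i in range(1, grid_size - 1)
             if dens[i] >= dens[i - 1] and dens[i] > dens[i + 1]]
    # the two tallest peaks bracket the valley used as the cut-off
    left, right = sorted(sorted(peaks, key=lambda i: dens[i])[-2:])
    return grid[left + int(np.argmin(dens[left:right + 1]))]
```

Genes whose log-ratio falls below the returned cut-off would be called absent/divergent, those above it present.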

Relevance: 90.00%

Abstract:

The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in the consistency of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.

Relevance: 90.00%

Abstract:

A number of methods of evaluating the validity of interval forecasts of financial data are analysed, and illustrated using intraday FTSE100 index futures returns. Some existing interval forecast evaluation techniques, such as the Markov chain approach of Christoffersen (1998), are shown to be inappropriate in the presence of periodic heteroscedasticity. Instead, we consider a regression-based test, and a modified version of Christoffersen's Markov chain test for independence, and analyse their properties when the financial time series exhibit periodic volatility. These approaches lead to different conclusions when interval forecasts of FTSE100 index futures returns generated by various GARCH(1,1) and periodic GARCH(1,1) models are evaluated.
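As a concrete illustration, the unconditional-coverage component of Christoffersen's (1998) framework reduces to a simple likelihood-ratio statistic on the sequence of "hit" indicators. The sketch below covers only that coverage test, not the independence test or the modifications discussed in the abstract:

```python
import math

def lr_unconditional_coverage(hits, p):
    """Christoffersen-style likelihood-ratio test of unconditional
    coverage: does the empirical hit rate of an interval forecast match
    its nominal coverage p?  `hits` is a sequence of 0/1 indicators
    (1 = the realised return fell inside the forecast interval).  The
    statistic is asymptotically chi-squared with 1 degree of freedom
    under the null of correct coverage."""
    n = len(hits)
    n1 = sum(hits)               # observations inside the interval
    n0 = n - n1                  # observations outside the interval
    pi_hat = n1 / n              # empirical coverage
    if pi_hat in (0.0, 1.0):     # degenerate sample: max log-likelihood is 0
        log_l1 = 0.0
    else:
        log_l1 = n0 * math.log(1.0 - pi_hat) + n1 * math.log(pi_hat)
    log_l0 = n0 * math.log(1.0 - p) + n1 * math.log(p)
    return -2.0 * (log_l0 - log_l1)
```

A value above the 5% chi-squared critical value of 3.84 rejects correct unconditional coverage; the independence and conditional-coverage parts of the framework add a first-order Markov test on the same hit sequence.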

Relevance: 90.00%

Abstract:

Purification of intact enveloped virus particles can be useful as a first step in understanding the structure and function of both viral and host proteins that are incorporated into the virion. Purified preparations of virions can be used to address these questions using techniques such as mass spectrometry proteomics. Recent studies on the proteome of coronavirus virions have shown that in addition to the structural proteins, accessory and non-structural virus proteins and a wide variety of host cell proteins associate with virus particles. To further study the presence of virion proteins, high quality sample preparation is crucial to ensure reproducible analysis by the wide variety of methods available for proteomic analysis.

Relevance: 90.00%

Abstract:

Briefing phase interactions between clients and designers are recognized as social engagements, characterized by communicative sign use, where conceptual ideas are gradually transformed into potential design solutions. A semiotic analysis of briefing communications between client stakeholders and designers provides evidence of the significance and importance of stakeholder interpretation and understanding of design, empirical data being drawn from a qualitative study of NHS hospital construction projects in the UK. It is contended that stakeholders engage with a project through communicative signs and artefacts of design, referencing personal cognitive knowledge in acts of interpretation that may be different from those of designers and externally appointed client advisers. Such interpretations occur in addition to NHS client and design team efforts to ‘engage’ with and ‘understand’ stakeholders using a variety of methods. Social semiotic theorizations indicate how narrative strategies motivate the formulation of signs and artefacts in briefing work, the role of sign authors and sign readers being elucidated as a result. Findings are contextualized against current understandings of briefing communications and stakeholder management practices, a more socially attuned understanding of briefing countering some of the process-led improvement models that have characterized much of the post-Egan report literature. A stakeholder interpretation model is presented as one potential method to safeguard against unforeseen interpretations occurring, the model aligning with the proposal for a more measured recognition of how designs can trigger interpretations among client stakeholders.

Relevance: 90.00%

Abstract:

Ruminant husbandry is a major source of anthropogenic greenhouse gases (GHG). Filling knowledge gaps and providing expert recommendations are important for defining future research priorities, improving methodologies and establishing science-based GHG mitigation solutions for government and non-governmental organisations, advisory/extension networks, and the ruminant livestock sector. The objective of this review is to summarize the published literature to provide a detailed assessment of the methodologies currently in use for measuring enteric methane (CH4) emission from individual animals under specific conditions, and to give recommendations regarding their application. The methods described include respiration chambers and enclosures, the sulphur hexafluoride (SF6) tracer technique, and techniques based on short-term measurements of gas concentrations in samples of exhaled air. This includes automated head chambers (e.g. the GreenFeed system), the use of carbon dioxide (CO2) as a marker, and (handheld) laser CH4 detection. Each technique is compared and assessed on its capabilities and limitations, followed by methodology recommendations. It is concluded that there is no 'one size fits all' method for measuring CH4 emission by individual animals. Ultimately, the decision as to which method to use should be based on the experimental objectives and the resources available. However, the need for high-throughput methodology, e.g. for screening large numbers of animals for genomic studies, does not justify the use of methods that are inaccurate. All CH4 measurement techniques are subject to experimental variation and random errors. Many sources of variation must be considered when measuring CH4 concentration in exhaled air samples without a quantitative or at least regular collection rate, or without use of a marker to indicate (or adjust for) the proportion of exhaled CH4 sampled.
Consideration of the number and timing of measurements relative to diurnal patterns of CH4 emission and respiratory exchange is important, as is consideration of feeding patterns and associated patterns of rumen fermentation rate and other aspects of animal behaviour. Regardless of the method chosen, appropriate calibrations and recovery tests are required for both method establishment and routine operation. Successful and correct use of methods requires careful attention to detail, rigour, and routine self-assessment of the quality of the data they provide.

Relevance: 90.00%

Abstract:

Densities and viscosities of five vegetable oils (Babassu oil, Buriti oil, Brazil nut oil, macadamia oil, and grape seed oil) and of three blends of Buriti oil and soybean oil were measured as a function of temperature and correlated by empirical equations. The estimation capability of two types of predictive methodologies was tested using the measured data. The first group of methods was based on the fatty acid composition of the oils, while the other was based on their triacylglycerol composition, as a multicomponent system. In general, the six models tested presented a good representation of the physical properties considered in this work. A simple method of calculation is also proposed to predict the dynamic viscosity of methyl and ethyl ester biodiesels, based on the fatty acid composition of the original oil. Data presented in this work and the developed model can be valuable for designing processes and equipment for the edible oil industry and for biodiesel production.
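A composition-based estimate in the same spirit can be sketched with a logarithmic (Grunberg-Nissan-type, zero-interaction) mixing rule. The function below is an illustration of that general approach, not the model developed in the paper, and the viscosity values in the usage note are placeholders rather than data from this work:

```python
import math

def blend_viscosity(fractions, viscosities):
    """Zero-interaction Grunberg-Nissan (logarithmic) mixing rule:
    ln(eta_blend) = sum_i x_i * ln(eta_i), where x_i are the composition
    fractions and eta_i the pure-component dynamic viscosities."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return math.exp(sum(x * math.log(eta)
                        for x, eta in zip(fractions, viscosities)))
```

For a 50:50 blend of two components with viscosities of 10 and 40 mPa s, the rule predicts the geometric mean, 20 mPa s; interaction terms, or the fatty-acid and triacylglycerol decompositions compared in the paper, refine this baseline.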

Relevance: 90.00%

Abstract:

Predictive performance evaluation is a fundamental issue in design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these types of methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to appropriately select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
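As a small worked example of one such graphical method, an ROC curve can be traced directly from classifier scores. This is a minimal sketch (score ties are not handled) illustrating the technique in general, not code from the survey:

```python
def roc_points(scores, labels):
    """Trace an ROC curve: sweep the decision threshold down through the
    scores and record (false positive rate, true positive rate) pairs.
    `labels` are 1 (positive) / 0 (negative)."""
    ranked = sorted(zip(scores, labels), key=lambda pair: -pair[0])
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in ranked:
        if label == 1:
            tp += 1        # correctly accepted positive
        else:
            fp += 1        # falsely accepted negative
        points.append((fp / n_neg, tp / n_pos))
    return points

def auc(points):
    """Area under the ROC polygon by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
```

A perfectly separating scorer yields an AUC of 1.0, a perfectly inverted one 0.0; the curve itself, unlike the scalar AUC, shows the trade-off at every operating point.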

Relevance: 90.00%

Abstract:

We focus this work on the theoretical investigation of the block copolymer poly[oxyoctyleneoxy-(2,6-dimethoxy-1,4-phenylene-1,2-ethynylene-phenanthrene-2,4-diyl)], named LaPPS19, recently proposed for optoelectronic applications. We used for that a variety of methods, from molecular mechanics to quantum semiempirical techniques (AM1, ZINDO/S-CIS). Our results show that, as expected, isolated LaPPS19 chains present relevant electron localization over the phenanthrene group. We found, however, that LaPPS19 could assemble in a pi-stacked form, leading to significant interchain interaction; the stacking induces electronic delocalization between neighbouring chains and introduces new states below the phenanthrene-related absorption. These results allowed us to associate the red-shift of the absorption edge seen in the experimental results with spontaneous pi-stack aggregation of the chains. (C) 2009 Wiley Periodicals, Inc. Int J Quantum Chem 110: 885-892, 2010

Relevance: 90.00%

Abstract:

A myriad of methods are available for virtual screening of small organic compound databases. In this study we have successfully applied a quantitative model of consensus measurements, using a combination of 3D similarity searches (ROCS and EON), Hologram Quantitative Structure-Activity Relationships (HQSAR) and docking (FRED, FlexX, Glide and AutoDock Vina), to retrieve cruzain inhibitors from collected databases. All methods were assessed individually and then combined in Ligand-Based Virtual Screening (LBVS) and Target-Based Virtual Screening (TBVS) consensus scoring, using Receiver Operating Characteristic (ROC) curves to evaluate their performance. Three consensus strategies were used: scaled-rank-by-number, rank-by-rank and rank-by-vote, of which scaled-rank-by-number was the most successful, its steep early ROC curve indicating a higher enrichment power in the early retrieval of active compounds from the database. The ligand-based approach yielded a robust and predictive HQSAR model that showed superior discrimination between active and inactive compounds, better than the ROCS and EON procedures. Overall, the integration of fast computational techniques based on ligand and target structures resulted in a more efficient retrieval of cruzain inhibitors with the desired pharmacological profiles, which may be useful to advance the discovery of new trypanocidal agents.
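Of the three consensus strategies, rank-by-vote is the simplest to sketch: each method votes for the compounds in its own top fraction, and votes are summed. The function name, default voting threshold, and data below are assumptions for illustration, not the study's implementation:

```python
def rank_by_vote(rankings, top_fraction=0.5):
    """Rank-by-vote consensus scoring: each individual screening method
    casts one vote for every compound it places within its own top
    fraction of the ranked database; compounds are returned ordered by
    total votes (descending).  `rankings` is a list of lists, each
    ordering compound ids from best to worst according to one method."""
    votes = {}
    for ranking in rankings:
        n_top = max(1, int(len(ranking) * top_fraction))
        for compound in ranking[:n_top]:
            votes[compound] = votes.get(compound, 0) + 1
    return sorted(votes, key=lambda c: -votes[c])
```

With three methods ranking compounds A-D, a compound voted for by all three methods heads the consensus list; rank-by-rank and scaled-rank-by-number replace the vote count with averaged or score-scaled ranks.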

Relevance: 90.00%

Abstract:

Woodworking industries still struggle with wood dust problems, and young workers are especially vulnerable to safety risks. To reduce risks, it is important to change attitudes and increase knowledge about safety. Safety training has been shown to establish positive attitudes towards safety among employees. The aim of the current study is to analyze the effect of QR codes that link to Picture Mix EXposure (PIMEX) videos, by analyzing student responses concerning attitudes to this safety training method and to safety. Safety training videos were used in upper secondary school handicraft programs to demonstrate wood dust risks and methods to decrease exposure to wood dust. A preliminary study was conducted in two schools to investigate improvement of safety training, in preparation for the main study, which investigated a safety training method in three schools. In the preliminary study the PIMEX method was first used, in which students were filmed while wood dust exposure was measured and simultaneously displayed on a computer screen in real time. Before and after the filming, teachers, students, and researchers together analyzed wood dust risks and effective measures to reduce exposure. For the main study, QR codes linked to PIMEX videos were attached to wood processing machines. Subsequent interviews showed that this safety training method enables students, at an early stage of their working lives, to learn about risks and about safety measures to control wood dust exposure. The new combination of methods can create awareness and change attitudes and motivation among students to work more consistently to reduce wood dust.

Relevance: 90.00%

Abstract:

Phenolic compounds are one of the most important quality parameters of wines, since they contribute to wine organoleptic characteristics such as colour, astringency, and bitterness. Furthermore, several studies have pointed out that many of them show biological properties of interest, related to their antioxidant capacity. This antioxidant activity has been thoroughly studied, and a wide variety of methods has been developed to evaluate it. In this study, the antioxidant activity of commercial Terras Madeirenses Portuguese wines (Madeira Island) was measured by three different analytical methods: the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging assay, 2,2′-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS•+) radical cation decolourisation, and the ferric reducing/antioxidant power (FRAP) assay for the evaluation of reducing power; the results were then correlated with the total phenolic content determined with the Folin–Ciocalteu reagent, using gallic acid as a standard. The total polyphenol concentration was found to vary from 252 to 1936 mg/l gallic acid equivalents (GAE). The antiradical activity varied from 0.042 to 0.715 mM Trolox equivalents and the antioxidant capacity varied from 344 to 1105 mg/l gallic acid equivalents (GAE). For the reducing power we obtained 3.45–3.86 mM quercetin equivalents.
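The arithmetic behind these radical-scavenging assays is a simple absorbance comparison: activity is the percentage drop in radical absorbance caused by the sample, converted to equivalents of a standard (e.g. Trolox) via a linear calibration curve. The sketch below illustrates that general calculation; the absorbance values and calibration slope in the usage note are hypothetical, not data from this study:

```python
def percent_inhibition(abs_control, abs_sample):
    """Radical scavenging activity as used in DPPH/ABTS-type assays:
    the percentage drop in radical absorbance caused by the sample."""
    return 100.0 * (abs_control - abs_sample) / abs_control

def equivalents(inhibition, slope, intercept=0.0):
    """Convert % inhibition to an equivalent standard concentration
    (e.g. mM Trolox) via a linear calibration curve fitted to the
    standard: inhibition = slope * concentration + intercept."""
    return (inhibition - intercept) / slope
```

For instance, a sample that halves a control absorbance of 0.80 gives 50% inhibition; with a hypothetical calibration slope of 100 %/mM, that corresponds to 0.5 mM Trolox equivalents.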