101 results for Using an harmonic instrument
Effects of orange juice formulation on prebiotic functionality using an in vitro colonic model system
Abstract:
A three-stage continuous fermentative colonic model system was used to monitor in vitro the effect of different orange juice formulations on prebiotic activity. Three different juices, with and without Bimuno, a galactooligosaccharide mixture (B-GOS), were assessed in terms of their ability to induce a bifidogenic microbiota. The recipe development was based on incorporating 2.75 g B-GOS into a 250 ml serving of juice (65 °Brix juice concentrate). Alongside the B-GOS juice, two further formulations were developed: a control juice (orange juice without any added Bimuno) and a positive control juice containing all the components of Bimuno (glucose, galactose and lactose) in the same relative proportions but without B-GOS. Ion-exchange chromatography was used to confirm that the Bimuno components were maintained through the production process. Data showed that sterilisation had no significant effect on the concentrations of B-GOS and simple sugars. The three juice formulations were digested under conditions resembling the gastric and small intestinal environments. The main bacterial groups of the faecal microbiota were evaluated throughout the colonic model study using 16S rRNA-based fluorescence in situ hybridization (FISH). Potential effects of supplementation of the juices on microbial metabolism were studied by measuring short-chain fatty acids (SCFAs) using gas chromatography. The B-GOS juice showed positive modulation of the microbiota composition and metabolic activity. In particular, numbers of faecal bifidobacteria and lactobacilli were significantly higher when B-GOS juice was fermented compared to controls. Furthermore, fermentation of B-GOS juice resulted in an increase in the Roseburia subcluster and a concomitant increase in butyrate production, which is of potential benefit to the host. In conclusion, this study has shown that B-GOS within orange juice can have a beneficial effect on the faecal microbiota.
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to a coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are then calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four-dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single-column atmosphere-ocean model. The system has the capability to run strongly coupled, weakly coupled, and uncoupled atmosphere-only or ocean-only assimilations, thus allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. When compared to uncoupled initialisation, coupled assimilation is able to produce more balanced initial analysis fields, thus reducing initialisation shock and its impact on the subsequent forecast. Single-observation experiments demonstrate how coupled assimilation systems are able to pass information between the atmosphere and ocean and therefore use near-surface data to greater effect. We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
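For context, the incremental 4D-Var cost function that such systems minimise can be written in standard notation (a generic sketch; the exact formulation used in the idealised column model may differ):

$$
J(\delta\mathbf{x}_0) = \tfrac{1}{2}\,\delta\mathbf{x}_0^{\mathrm{T}}\mathbf{B}^{-1}\,\delta\mathbf{x}_0
+ \tfrac{1}{2}\sum_{i=0}^{N}\big(\mathbf{H}_i\mathbf{M}_{0\to i}\,\delta\mathbf{x}_0-\mathbf{d}_i\big)^{\mathrm{T}}\mathbf{R}_i^{-1}\big(\mathbf{H}_i\mathbf{M}_{0\to i}\,\delta\mathbf{x}_0-\mathbf{d}_i\big),
\qquad \mathbf{d}_i=\mathbf{y}_i-\mathcal{H}_i\big(\mathbf{x}_i^{b}\big),
$$

where δx₀ is the increment to the initial state, B the background-error covariance, M the linearised (coupled) model, H the observation operators and dᵢ the innovations. In a strongly coupled system the state and B span the atmosphere and ocean jointly, including cross-covariances, whereas in a weakly coupled system the coupled model supplies the innovations but the atmosphere and ocean increments are computed with separate, effectively block-diagonal, B matrices.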
Abstract:
The diurnal cycle of tropical convection and its relationship to the atmospheric tides is investigated using an aquaplanet GCM. The diurnal and semidiurnal harmonics of precipitation are both found to contribute significantly to the total diurnal variability of precipitation in the model, which is broadly consistent with observations of the diurnal cycle of convection over the open ocean. The semidiurnal tide is found to be the dominant forcing for the semidiurnal harmonic of precipitation. In contrast, the diurnal tide plays only a small role in forcing the diurnal harmonic of precipitation, which is dominated by the variations in shortwave and longwave heating. In both the diurnal and semidiurnal harmonics, the feedback onto the convection by the humidity tendencies due to the convection is found to be important in determining the phase of the harmonics. Further experiments show that the diurnal cycle of precipitation is sensitive to the choice of closure in the convection scheme. While the surface pressure signal of the simulated atmospheric tides in the model agrees well with both theory and observations in magnitude and phase, sensitivity experiments suggest that the role of the stratospheric ozone in forcing the semidiurnal tide is much reduced compared to theoretical predictions. Furthermore, the influence of the cloud radiative effects seems small. It is suggested that the radiative heating profile in the troposphere, associated primarily with the water vapor distribution, is more important than previously thought for driving the semidiurnal tide. However, this result may be sensitive to the vertical resolution and extent of the model.
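As an illustration of how such harmonics can be isolated, the sketch below fits the diurnal (24 h) and semidiurnal (12 h) components of a precipitation time series by least squares; it is a generic example, not the analysis code of the study, and the variable names are assumptions.

```python
import numpy as np

def fit_harmonics(t_hours, precip):
    """Least-squares fit of the diurnal (24 h) and semidiurnal (12 h)
    harmonics to a precipitation time series sampled at times t_hours."""
    t = np.asarray(t_hours, dtype=float)
    omega1 = 2.0 * np.pi / 24.0          # diurnal angular frequency
    omega2 = 2.0 * np.pi / 12.0          # semidiurnal angular frequency
    # Design matrix: mean plus sine/cosine pairs for each harmonic.
    A = np.column_stack([
        np.ones_like(t),
        np.cos(omega1 * t), np.sin(omega1 * t),
        np.cos(omega2 * t), np.sin(omega2 * t),
    ])
    mean, c1, s1, c2, s2 = np.linalg.lstsq(A, np.asarray(precip), rcond=None)[0]
    diurnal = {"amplitude": np.hypot(c1, s1), "phase_h": np.arctan2(s1, c1) / omega1}
    semidiurnal = {"amplitude": np.hypot(c2, s2), "phase_h": np.arctan2(s2, c2) / omega2}
    return mean, diurnal, semidiurnal
```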
Abstract:
There is great interest in using amplified fragment length polymorphism (AFLP) markers because they are inexpensive and easy to produce. It is, therefore, possible to generate a large number of markers that have a wide coverage of species genomes. Several statistical methods have been proposed to study genetic structure using AFLPs, but they assume Hardy-Weinberg equilibrium and do not estimate the inbreeding coefficient, F_IS. A Bayesian method has been proposed by Holsinger and colleagues that relaxes these simplifying assumptions, but we have identified two sources of bias that can influence estimates based on these markers: (i) the use of a uniform prior on ancestral allele frequencies and (ii) the ascertainment bias of AFLP markers. We present a new Bayesian method that avoids these biases by using an implementation based on the approximate Bayesian computation (ABC) algorithm. This new method estimates population-specific F_IS and F_ST values and offers users the possibility of taking into account the criteria for selecting the markers that are used in the analyses. The software is available at our web site (http://www-leca.ujf-grenoble.fr/logiciels.htm). Finally, we provide advice on how to avoid the effects of ascertainment bias.
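For readers unfamiliar with ABC, the sketch below shows the basic rejection form of the algorithm on which such implementations are built; it is a generic illustration (the simulator, summary statistics and tolerance are placeholders), not the authors' estimator of F_IS and F_ST.

```python
import numpy as np

def abc_rejection(observed_stats, simulate, prior_sample,
                  n_draws=100_000, tolerance=0.05):
    """Generic ABC rejection sampler (illustrative sketch).

    observed_stats : summary statistics of the observed AFLP data
    simulate       : function(params) -> summary statistics of simulated data
    prior_sample   : function() -> one draw of the parameters (e.g. F_IS, F_ST)
    """
    observed = np.asarray(observed_stats, dtype=float)
    accepted = []
    for _ in range(n_draws):
        params = prior_sample()
        sim_stats = np.asarray(simulate(params), dtype=float)
        # Keep the draw if simulated and observed summaries are close enough.
        if np.linalg.norm(sim_stats - observed) < tolerance:
            accepted.append(params)
    return np.array(accepted)   # approximate posterior sample
```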
Abstract:
‘Instructions for an Audio Performance/Scissors Recording’ is a score for a performance using a pair of amplified scissors that can be downloaded and interpreted by a performer anywhere in the world. The file functions as a manual for the performer, with guidance for creating the instrument and preparing the performance space, while it also provides a template for the actual sonic pattern to be followed in the live performance. Created originally for ‘Storageroom’, an online platform featuring complete exhibitions that are available for download, the piece was exhibited and interpreted in a live performance by Ayelet Lerman for the File Transfer Protocol, the contemporary section of the Haifa-Jerusalem-Tel Aviv exhibition at the Museum of Art, Haifa, Israel.
Abstract:
Previously, using an in vitro static batch culture system, it was found that rice bran (RB), inulin, fibersol, mannanoligosaccharides (MOS), larch arabinogalactan and citrus pectin elicited prebiotic effects (in terms of increased numbers of bifidobacteria and lactic acid bacteria) on the faecal microbiota of a dog. The aim of the present study was to confirm the prebiotic potential of each individual substrate using multiple faecal donors, as well as assessing the prebiotic potential of 15 substrate blends made from them. Anaerobic static and stirred, pH-controlled batch culture systems inoculated with faecal samples from healthy dogs were used for this purpose. Fluorescence in situ hybridization (FISH) analysis using seven oligonucleotide probes targeting selected bacterial groups and DAPI (total bacteria) was used to monitor bacterial populations during fermentation runs. High-performance liquid chromatography was used to measure butyrate produced as a result of bacterial fermentation of the substrates. RB and a MOS/RB blend (1:1, w/w) were shown to elicit prebiotic and butyrogenic effects on the canine microbiota in static batch culture fermentations. Further testing of these substrates in stirred, pH-controlled batch culture fermentation systems confirmed the prebiotic and butyrogenic effects of MOS/RB, with no enhancement of Clostridium clusters I and II and Escherichia coli populations.
Abstract:
Time series of global and regional mean Surface Air Temperature (SAT) anomalies are a common metric used to estimate recent climate change. Various techniques can be used to create these time series from meteorological station data. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques relative to the reanalysis reference. Kriging techniques provided the smallest errors in estimates of Arctic anomalies, and Simple Kriging was often the best kriging method in this study, especially over sea ice. A linear interpolation technique had, on average, Root Mean Square Errors (RMSEs) up to 0.55 K larger than the two kriging techniques tested. Non-interpolating techniques provided the least representative anomaly estimates. Nonetheless, they serve as useful checks for confirming whether estimates from interpolating techniques are reasonable. The interaction of meteorological station coverage with estimation techniques between 1850 and 2011 was simulated using an ensemble dataset comprising repeated individual years (1979-2011). All techniques were found to have larger RMSEs for earlier station coverages. This supports calls for increased data sharing and data rescue, especially in sparsely observed regions such as the Arctic.
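As a pointer to what the interpolating techniques involve, the sketch below implements a bare-bones Simple Kriging of station anomalies onto grid points with an assumed exponential covariance model; the covariance parameters and coordinate handling are placeholders, not those used in the study.

```python
import numpy as np

def simple_kriging(obs_xy, obs_anom, grid_xy, length_scale=500.0, sill=1.0):
    """Simple Kriging of (zero-mean) station anomalies onto grid points,
    using an exponential covariance model. Illustrative sketch only."""
    obs_xy, grid_xy = np.asarray(obs_xy), np.asarray(grid_xy)

    def cov(a, b):
        # Pairwise distances between point sets, mapped through the covariance model.
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return sill * np.exp(-d / length_scale)

    K = cov(obs_xy, obs_xy)            # station-station covariance matrix
    k = cov(obs_xy, grid_xy)           # station-grid covariances
    weights = np.linalg.solve(K, k)    # one column of kriging weights per grid point
    return weights.T @ np.asarray(obs_anom)
```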
Abstract:
Inverse methods are widely used in various fields of atmospheric science. However, such methods are not commonly used within the boundary-layer community, where robust observations of surface fluxes are a particular concern. We present a new technique for deriving surface sensible heat fluxes from boundary-layer turbulence observations using an inverse method. Doppler lidar observations of vertical velocity variance are combined with two well-known mixed-layer scaling forward models for a convective boundary layer (CBL). The inverse method is validated using large-eddy simulations of a CBL with increasing wind speed. The majority of the estimated heat fluxes agree within error with the prescribed heat flux across all wind speeds tested. The method is then applied to Doppler lidar data from the Chilbolton Observatory, UK. Heat fluxes are compared with those from a mast-mounted sonic anemometer. Errors in estimated heat fluxes are on average 18%, an improvement on previous techniques. However, a significant negative bias is observed (on average −63%) that is more pronounced in the morning. Results are improved for the fully developed CBL later in the day, which suggests that the bias is largely related to the choice of forward model, which is kept deliberately simple for this study. Overall, the inverse method provided reasonable flux estimates for the simple case of a CBL. Results shown here demonstrate that this method has promise in utilizing ground-based remote sensing to derive surface fluxes. Extension of the method is relatively straightforward, and could include more complex forward models or other measurements.
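A minimal sketch of such an inversion is given below, using a commonly cited mixed-layer scaling profile of vertical velocity variance as the forward model and fitting the convective velocity scale w*; the profile coefficients, constants and error handling are assumptions and do not reproduce the paper's two forward models.

```python
import numpy as np
from scipy.optimize import least_squares

RHO, CP, G = 1.2, 1005.0, 9.81   # air density (kg m^-3), heat capacity, gravity

def forward_sigma_w2(wstar, z, zi):
    """Mixed-layer scaling profile of vertical velocity variance
    (Lenschow-type form), valid for heights z within the CBL depth zi."""
    zeta = z / zi
    return 1.8 * zeta**(2.0 / 3.0) * (1.0 - 0.8 * zeta)**2 * wstar**2

def invert_heat_flux(z, sigma_w2_obs, zi, theta0=290.0):
    """Estimate the surface sensible heat flux (W m^-2) from an observed
    variance profile by fitting w* to the forward model. Illustrative sketch."""
    res = least_squares(
        lambda w: forward_sigma_w2(w[0], z, zi) - sigma_w2_obs,
        x0=[1.0], bounds=(0.0, 10.0))
    wstar = res.x[0]
    kinematic_flux = wstar**3 * theta0 / (G * zi)   # surface w'theta'
    return RHO * CP * kinematic_flux
```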
Abstract:
An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degrees by 0.5 degrees latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in preference to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors using a hierarchical block-matching scheme. Examples are shown of the various stages in the process. Also shown are examples of the usefulness of this type of data in GCM validation.
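To make the kernel-estimator idea concrete, the sketch below grids brightness temperatures with a single-level spherical (great-circle) Gaussian kernel; the operational algorithm is hierarchical and includes limb and zenith-angle corrections that are not reproduced here, and the bandwidth is an assumption.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(lon2 - lon1)
    cosang = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(dlon)
    return EARTH_RADIUS_KM * np.arccos(np.clip(cosang, -1.0, 1.0))

def grid_brightness_temps(lat_obs, lon_obs, tb_obs, lat_grid, lon_grid,
                          bandwidth_km=100.0):
    """Kernel-weighted gridding of brightness temperatures (single-level sketch)."""
    out = np.full(lat_grid.shape, np.nan)
    for idx in np.ndindex(lat_grid.shape):
        d = great_circle_km(lat_obs, lon_obs, lat_grid[idx], lon_grid[idx])
        w = np.exp(-0.5 * (d / bandwidth_km) ** 2)   # Gaussian kernel weights
        if w.sum() > 0:
            out[idx] = np.sum(w * tb_obs) / w.sum()
    return out
```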
Abstract:
Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP-Department of Energy (DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences for the spatial distribution of mean intensities show that the ECMWF reanalysis is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15. Also shown are results based on using the same techniques to diagnose the tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.
Abstract:
Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are not uniformly specified or defined across national, sectoral or organisational borders, leading to an opaque competency description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain, which enables automated reasoning engines to be built which, by utilising the interrelations between entities, can make “intelligent” choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed, which compare and contrast different competency descriptions on the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints of competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made, which other frameworks can reference. This research has shown that competencies can be divided up into “knowledge”, “skills” and what we call “others”. An ontology has been created based on this, with a simple structure of different “kinds” of “knowledge” and “skills”, using semantic interrelations to define the basic semantic structure of the ontology. A prototype tool for performing skill gap analysis has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by using an ontologically based inference engine, which is able to list the closest fit and possible proficiency gaps.
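A toy version of such a skill gap analysis is sketched below: a desired profile is compared against a personal profile, with the ontology's semantic ("broader-than") links allowing a more specific competency to satisfy a more general requirement. The class names, levels and relation handling are illustrative assumptions, not the TRACE prototype itself.

```python
from dataclasses import dataclass, field

@dataclass
class Competency:
    name: str
    kind: str                                    # "knowledge", "skill" or "other"
    broader: list = field(default_factory=list)  # names of more general competencies

def skill_gap(desired, personal, ontology):
    """Return {competency: (current level, required level)} for unmet requirements.

    `desired` and `personal` map competency names to proficiency levels (int);
    `ontology` maps names to Competency records. A requirement is also satisfied
    by a narrower competency whose `broader` list contains the required name.
    """
    gaps = {}
    for name, required in desired.items():
        held = personal.get(name)
        if held is None:
            for have, level in personal.items():
                record = ontology.get(have, Competency(have, "other"))
                if name in record.broader:
                    held = level
                    break
        if held is None or held < required:
            gaps[name] = (held or 0, required)
    return gaps
```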
Abstract:
Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high-resolution TerraSAR-X data to detect flooded regions in urban areas is described. An important application for this would be the calibration and validation of the flood extent predicted by an urban flood inundation model. To date, research on such models has been hampered by lack of suitable distributed validation data. The study uses a 3 m resolution TerraSAR-X image of a 1-in-150-year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with airborne LiDAR data to estimate regions of the TerraSAR-X image in which water would not be visible due to radar shadow or layover caused by buildings and taller vegetation, and these regions were masked out in the flood detection process. A semi-automatic algorithm for the detection of floodwater was developed, based on a hybrid approach. Flooding in rural areas adjacent to the urban areas was detected using an active contour model (snake) region-growing algorithm seeded using the un-flooded river channel network, which was applied to the TerraSAR-X image fused with the LiDAR DTM to ensure the smooth variation of heights along the reach. A simpler region-growing approach was used in the urban areas, which was initialised using knowledge of the flood waterline in the rural areas. Seed pixels having low backscatter were identified in the urban areas using supervised classification based on training areas for water taken from the rural flood, and non-water taken from the higher urban areas. Seed pixels were required to have heights less than a spatially-varying height threshold determined from nearby rural waterline heights. Seed pixels were clustered into urban flood regions based on their close proximity, rather than requiring that all pixels in the region should have low backscatter. This approach was taken because it appeared that urban water backscatter values were corrupted in some pixels, perhaps due to contributions from side-lobes of strong reflectors nearby. The TerraSAR-X urban flood extent was validated using the flood extent visible in the aerial photos. Overall, 76% of the urban water pixels visible to TerraSAR-X were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19%, respectively. These findings indicate that TerraSAR-X is capable of providing useful data for the calibration and validation of urban flood inundation models.
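The seed-selection and clustering step can be pictured with the sketch below, which thresholds backscatter, applies a spatially varying waterline-height threshold and clusters nearby seeds into candidate flood regions; the threshold values and the dilation-based clustering are simplifying assumptions rather than the calibrated procedure of the study.

```python
import numpy as np
from scipy import ndimage

def urban_flood_regions(backscatter_db, dem, waterline_height,
                        backscatter_thresh=-10.0, shadow_layover_mask=None):
    """Select urban flood seed pixels and group them into candidate regions.

    backscatter_db   : SAR backscatter image (dB)
    dem              : LiDAR ground heights on the same grid
    waterline_height : spatially varying height threshold from rural waterlines
    """
    seeds = (backscatter_db < backscatter_thresh) & (dem < waterline_height)
    if shadow_layover_mask is not None:
        seeds &= ~shadow_layover_mask           # exclude shadow/layover pixels
    # Cluster seeds by proximity: dilate, then label connected components.
    clustered = ndimage.binary_dilation(seeds, iterations=3)
    labels, n_regions = ndimage.label(clustered)
    return seeds, labels, n_regions
```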
Abstract:
We investigated diurnal nitrate (NO₃⁻) concentration variability in the San Joaquin River using an in situ optical NO₃⁻ sensor and discrete sampling during a 5-day summer period characterized by high algal productivity. Dual NO₃⁻ isotopes (δ¹⁵N-NO₃⁻ and δ¹⁸O-NO₃⁻) and dissolved oxygen isotopes (δ¹⁸O-DO) were measured over 2 days to assess NO₃⁻ sources and biogeochemical controls over diurnal time-scales. Concerted temporal patterns of dissolved oxygen (DO) concentrations and δ¹⁸O-DO were consistent with photosynthesis, respiration and atmospheric O₂ exchange, providing evidence of diurnal biological processes independent of river discharge. Surface water NO₃⁻ concentrations varied by up to 22% over a single diurnal cycle and up to 31% over the 5-day study, but did not reveal concerted diurnal patterns at a frequency comparable to DO concentrations. The decoupling of δ¹⁵N-NO₃⁻ and δ¹⁸O-NO₃⁻ isotopes suggests that algal assimilation and denitrification are not major processes controlling diurnal NO₃⁻ variability in the San Joaquin River during the study. The lack of a clear explanation for NO₃⁻ variability likely reflects a combination of riverine biological processes and time-varying physical transport of NO₃⁻ from upstream agricultural drains to the mainstem San Joaquin River. The application of an in situ optical NO₃⁻ sensor along with discrete samples provides a view into the fine temporal structure of hydrochemical data and may allow for greater accuracy in pollution assessment.
Abstract:
A method for in situ detection of atmospheric turbulence has been developed using an inexpensive sensor carried within a conventional meteorological radiosonde. The sensor, a Hall effect magnetometer, was used to monitor the terrestrial magnetic field. Rapid time-scale (10 s or less) fluctuations in the magnetic field measurement were related to the motion of the radiosonde, which was strongly influenced by atmospheric turbulence. Comparison with cloud radar measurements showed turbulence in regions where rapid time-scale magnetic fluctuations occurred. Reliable measurements were obtained between the surface and the stratosphere.
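A simple way to turn such a record into a turbulence flag is sketched below: the magnetometer series is high-passed with a running mean and segments with unusually large short-period fluctuation amplitude are flagged. The window length and threshold are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def turbulence_flag(b_field, sample_rate_hz=1.0, window_s=10.0, threshold=None):
    """Flag samples of a radiosonde magnetometer record showing rapid
    (roughly sub-10 s) fluctuations, as a proxy for turbulent motion."""
    b = np.asarray(b_field, dtype=float)
    n = max(int(window_s * sample_rate_hz), 2)
    kernel = np.ones(n) / n
    # High-pass by removing a running mean over the window...
    fluctuations = b - np.convolve(b, kernel, mode="same")
    # ...then estimate the local fluctuation amplitude (rolling RMS).
    rms = np.sqrt(np.convolve(fluctuations**2, kernel, mode="same"))
    if threshold is None:
        threshold = 3.0 * np.median(rms)        # simple adaptive threshold
    return rms > threshold                      # boolean flag per sample
```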