56 results for Algorithms, Properties, the KCube Graphs


Relevance: 100.00%

Abstract:

We present a flexible framework to calculate the optical properties of atmospheric aerosols at a given relative humidity based on their composition and size distribution. The similarity of this framework to climate model parameterisations allows rapid and extensive sensitivity tests of the impact of uncertainties in data or of new measurements on climate relevant aerosol properties. The data collected by the FAAM BAe-146 aircraft during the EUCAARI-LONGREX and VOCALS-REx campaigns have been used in a closure study to analyse the agreement between calculated and measured aerosol optical properties for two very different aerosol types. The agreement achieved for the EUCAARI-LONGREX flights is within the measurement uncertainties for both scattering and absorption. However, there is poor agreement between the calculated and the measured scattering for the VOCALS-REx flights. The high concentration of sulphate, which is a scattering aerosol with no absorption in the visible spectrum, made the absorption measurements during VOCALS-REx unreliable, and thus no closure study was possible for the absorption. The calculated hygroscopic scattering growth factor overestimates the measured values during EUCAARI-LONGREX and VOCALS-REx by ∼30% and ∼20%, respectively. We have also tested the sensitivity of the calculated aerosol optical properties to the uncertainties in the refractive indices, the hygroscopic growth factors and the aerosol size distribution. The largest source of uncertainty in the calculated scattering is the aerosol size distribution (∼35%), followed by the assumed hygroscopic growth factor for organic aerosol (∼15%), while the predominant source of uncertainty in the calculated absorption is the refractive index of organic aerosol (28–60%), although we would expect the refractive index of black carbon to be important for aerosol with a higher black carbon fraction.
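
The scattering enhancement with relative humidity discussed above can be illustrated with a minimal sketch. The single-parameter Hänel-type growth curve used below is a common parameterisation and not necessarily the one used in this framework; the exponent gamma and the reference humidity are illustrative values only.

```python
import numpy as np

def wet_scattering(sigma_dry, rh, gamma=0.5, rh_ref=0.3):
    """Scale a dry scattering coefficient to ambient relative humidity with the
    Hanel-type curve f(RH) = ((1 - rh_ref) / (1 - rh))**gamma.
    gamma and rh_ref are illustrative, not values derived in the study."""
    rh = np.clip(np.asarray(rh, dtype=float), 0.0, 0.95)   # avoid the RH -> 1 singularity
    f_rh = ((1.0 - rh_ref) / (1.0 - rh)) ** gamma
    return sigma_dry * f_rh

# Example: an overestimate of f(RH) translates directly into the same relative
# overestimate of the humidified scattering coefficient.
print(wet_scattering(sigma_dry=20.0, rh=0.8))   # Mm^-1, illustrative input
```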

Relevance: 100.00%

Abstract:

This special issue is focused on the assessment of algorithms for the observation of Earth’s climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth’s climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.

Relevance: 100.00%

Abstract:

Visual exploration of scientific data in the life sciences is a growing research field due to the large amount of available data. Kohonen’s Self-Organizing Map (SOM) is a widely used tool for the visualization of multidimensional data. In this paper we present a fast learning algorithm for SOMs that uses a simulated annealing method to adapt the learning parameters. The algorithm has been adopted in a data analysis framework for the generation of similarity maps. Such maps provide an effective tool for the visual exploration of large and multi-dimensional input spaces. The approach has been applied to data generated during the high-throughput screening of molecular compounds; the generated maps allow a visual exploration of molecules with similar topological properties. The experimental analysis on real-world data from the National Cancer Institute shows the speed-up of the proposed SOM training process in comparison to a traditional approach. The resulting visual landscape groups molecules with similar chemical properties in densely connected regions.
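
As a rough illustration of letting an annealing-style temperature drive the SOM learning parameters, here is a minimal sketch. The cooling schedule, learning rate and neighbourhood function are assumptions for illustration and are not the scheme proposed in the paper.

```python
import numpy as np

def train_som(data, rows, cols, n_iter, t0=1.0, cooling=0.995, seed=0):
    """Toy SOM whose learning rate and neighbourhood width shrink with a
    simulated-annealing-style temperature (illustrative only)."""
    rng = np.random.default_rng(seed)
    weights = rng.random((rows, cols, data.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    temp = t0
    for _ in range(n_iter):
        x = data[rng.integers(len(data))]
        # best-matching unit for this sample
        bmu = np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)), (rows, cols))
        lr = 0.5 * temp                                   # temperature controls the step size
        sigma = max(1.0, 0.5 * max(rows, cols) * temp)    # ...and the neighbourhood radius
        g = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)
        temp *= cooling                                   # geometric cooling schedule
    return weights
```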

Relevance: 100.00%

Abstract:

This note corrects a previous treatment of algorithms for the metric DTR, Depth by the Rule.

Relevance: 100.00%

Abstract:

The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps because the comparatively low spatial resolutions of these EPSs cannot accurately represent the tilted structure essential to cyclone growth and decay. There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC for cyclone intensity, and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed, that is, the cyclones move too slowly on average in all EPSs.
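
For readers unfamiliar with the verification statistics mentioned above, the sketch below computes a cyclone position error (great-circle distance between forecast and analysis positions) for an ensemble-mean track. The naive averaging of member positions is an assumption for illustration; the study itself applies an objective feature-tracking method to each member.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlon = np.radians(np.asarray(lon2) - np.asarray(lon1))
    cosang = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(dlon)
    return EARTH_RADIUS_KM * np.arccos(np.clip(cosang, -1.0, 1.0))

def ensemble_mean_position_error(analysis_track, member_tracks):
    """analysis_track: (T, 2) array of (lat, lon); member_tracks: (M, T, 2).
    Returns the position error of a naively averaged ensemble-mean track at
    each lead time. Longitude averaging ignores the date line for brevity."""
    ens_mean = member_tracks.mean(axis=0)
    return great_circle_km(analysis_track[:, 0], analysis_track[:, 1],
                           ens_mean[:, 0], ens_mean[:, 1])
```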

Relevance: 100.00%

Abstract:

The theory of harmonic force constant refinement calculations is reviewed, and a general-purpose program for force constant and normal coordinate calculations is described. The program, called ASYM20, is available through the Quantum Chemistry Program Exchange. It will work on molecules of any symmetry containing up to 20 atoms and will produce results on a series of isotopomers as desired. The vibrational secular equations are solved in either nonredundant valence internal coordinates or symmetry coordinates. As well as calculating the (harmonic) vibrational wavenumbers and normal coordinates, the program will calculate centrifugal distortion constants, Coriolis zeta constants, harmonic contributions to the α's, root-mean-square amplitudes of vibration, and other quantities related to gas electron-diffraction studies and thermodynamic properties. The program will work in either a predict mode, in which it calculates results from an input force field, or a refine mode, in which it refines an input force field by least squares to fit observed data on the quantities mentioned above. Predicate values of the force constants may be included in the data set for a least-squares refinement. The program is written in FORTRAN for use on a PC or a mainframe computer. Operation is mainly controlled by steering indices in the input data file, but some interactive control is also implemented.
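
The central step of such a predict-mode calculation is the textbook Wilson GF eigenvalue problem. The sketch below is a generic illustration of that step, not the ASYM20 code itself, and the unit-conversion constant assumes F in mdyn Å⁻¹ and G in amu⁻¹.

```python
import numpy as np

NU_CONV = 1302.8   # approx. cm^-1 per sqrt(mdyn A^-1 amu^-1)

def harmonic_wavenumbers(F, G):
    """Solve the vibrational secular problem |GF - lambda*I| = 0 and return the
    harmonic wavenumbers in cm^-1 (F: force constant matrix in internal or
    symmetry coordinates, mdyn/A; G: Wilson kinetic-energy matrix, 1/amu)."""
    lam = np.sort(np.linalg.eigvals(np.asarray(G) @ np.asarray(F)).real)[::-1]
    return NU_CONV * np.sqrt(np.clip(lam, 0.0, None))
```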

Relevance: 100.00%

Abstract:

Surfactin is a bacterial lipopeptide produced by Bacillus subtilis and is a powerful surfactant that also has antiviral, antibacterial and antitumor properties. The recovery and purification of surfactin from complex fermentation broths is a major obstacle to its commercialization; therefore, a two-step membrane filtration process was developed using a lab-scale tangential flow filtration (TFF) unit with 10 kDa MWCO regenerated cellulose (RC) and polyethersulfone (PES) membranes at three different transmembrane pressures (TMP) of 1.5 bar, 2.0 bar and 2.5 bar. Two modes of filtration were studied, with and without cleaning of the membranes prior to UF-2. In the first ultrafiltration step (UF-1), surfactin was retained effectively by the membranes above its critical micelle concentration (CMC); subsequently, in UF-2, the retentate micelles were disrupted by addition of a 50% (v/v) methanol solution to allow recovery of surfactin in the permeate. The main protein contaminants were effectively retained by the membrane in UF-2. The permeate flux and the rejection coefficients (R) of surfactin and protein were measured during the filtrations. Overall, the three different TMPs applied had no significant effect on the filtrations, and PES is the more suitable membrane for selectively separating surfactin from fermentation broth, achieving high recovery and a high level of purity. In addition, this two-step UF process is scalable to larger sample volumes without affecting the original functionality of surfactin, although membrane permeability can be affected by exposure to the methanolic solution used in UF-2.
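
As context for the quantities reported above, the observed rejection coefficient and permeate flux are simple ratios; a minimal sketch with generic definitions (not code from the study) is:

```python
def rejection_coefficient(c_permeate, c_feed):
    """Observed rejection R = 1 - Cp/Cf: R close to 1 means the solute (e.g.
    micellar surfactin in UF-1) is retained; R close to 0 means it permeates."""
    return 1.0 - c_permeate / c_feed

def permeate_flux(volume_l, area_m2, time_h):
    """Permeate flux in L m^-2 h^-1 for a given membrane area and filtration time."""
    return volume_l / (area_m2 * time_h)
```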

Relevance: 100.00%

Abstract:

The best of both worlds: The synthesis of carbon-encapsulated iron-based magnetic nanoparticles is described. With such small catalysts that have macroscopic magnetic properties, the advantages of homogeneous (or colloidal) and heterogeneous catalysts can be combined.

Relevance: 100.00%

Abstract:

Tofu gels were rheologically examined to determine their storage or elastic (G′) and loss or viscous (G″) moduli as a function of frequency within their linear viscoelastic limits. The tofu gels were made using either glucono-δ-lactone (GDL) or calcium sulphate (CaSO4·2H2O), followed by either heat treatment (soymilk heated at ≥97 °C prior to coagulation and subsequently held at 70 °C for 60 min, HT) or high-pressure treatment (400 MPa at 20 °C for 10 min, HP). The overall moduli values of the GDL gels and the CaSO4·2H2O gels were similar for both physical treatments, and each gave the frequency profiles expected for weak viscoelastic materials. However, although both temperature and high-pressure treatments could be used to produce tofu gels, the final products were not the same. Pressure-formed gels, despite having a higher overall “consistency” (higher values of their moduli), had a proportionately higher contribution from the loss modulus (increased tan δ). Differences could also be observed using confocal scanning laser microscopy. While such treatment may give rise to differing systems/structures, with new or modified organoleptic properties, the more “open” structures obtained by pressure treatment may well cause processing difficulties if subsequent reworking or moulding is required.
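
The loss tangent referred to above is simply the ratio of the two measured moduli; a one-line sketch (generic definition, not the study's analysis code):

```python
def loss_tangent(g_storage, g_loss):
    """tan(delta) = G''/G'; a larger value indicates a proportionately larger
    viscous (loss) contribution relative to the elastic (storage) response."""
    return g_loss / g_storage
```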


Relevance: 100.00%

Abstract:

The aim of this review is to illustrate how physical properties are important to food processing and quality. Three food products, flakes, porridge and bread, in addition to oat groats, are used to show the influence of water and heat treatments on the mechanical properties. The hydrothermal history of ingredients is shown to affect product quality. Water acts as a plasticiser and solvent in these foods, whilst heat modifies the conformation and interactions of macromolecular components. Structure, as well as chemical composition, is shown to govern texture.

Relevance: 100.00%

Abstract:

An atoxigenic strain of Penicillium camemberti was superficially inoculated on fermented sausages in an attempt to improve their sensory properties. The growth of this mould on the surface of the sausages resulted in intense proteolysis and lipolysis, which caused an increase in the concentration of free amino acids, free fatty acids (FFA) and volatile compounds. Many of these were derived from amino acid catabolism and were responsible for the “ripened flavour”, i.e. branched aldehydes and the corresponding alcohols, acids and esters. The development of the fungal mycelia on the surface of the sausages also protected lipids from oxidation, resulting in both lower thiobarbituric acid reactive substances (TBARS) values and fewer lipid oxidation-derived compounds, such as aliphatic aldehydes and alcohols. The sensory analysis of the superficially inoculated sausages showed clear improvements in odour and flavour and, as a consequence, in the overall quality of the sausages. Therefore, this strain is proposed as a potential starter culture for dry fermented sausage production.

Relevance: 100.00%

Abstract:

An extensive set of machine learning and pattern classification techniques trained and tested on the KDD dataset failed to detect most of the user-to-root attacks. This paper aims to provide an approach for mitigating the negative aspects of the mentioned dataset, which led to low detection rates. A genetic algorithm is employed to implement rules for detecting various types of attacks. Rules are formed from the features of the dataset identified as the most important ones for each attack type. In this way we introduce a high level of generality and thus achieve high detection rates, while also greatly reducing the system training time. We then re-check the decisions of the user-to-root rules against the rules that detect other types of attacks. In this way we decrease the false-positive rate. The model was verified on KDD 99, demonstrating higher detection rates than those reported by the state-of-the-art while maintaining a low false-positive rate.
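
A minimal sketch of the rule-evolution idea is given below. The feature names, rule encoding, fitness weighting and GA operators are all illustrative assumptions; the paper's actual per-attack feature selection and fitness function are not reproduced here.

```python
import random

FEATURES = ["duration", "src_bytes", "num_failed_logins", "root_shell"]  # hypothetical subset

def random_rule(rng):
    # A rule is a conjunction of (feature >= threshold) conditions.
    return {f: rng.uniform(0.0, 1.0) for f in rng.sample(FEATURES, k=2)}

def matches(rule, record):
    return all(record.get(f, 0.0) >= t for f, t in rule.items())

def fitness(rule, records, labels, target="u2r"):
    tp = sum(matches(rule, r) for r, y in zip(records, labels) if y == target)
    fp = sum(matches(rule, r) for r, y in zip(records, labels) if y != target)
    return tp - 2 * fp                       # penalise false positives more heavily

def evolve(records, labels, pop_size=50, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [random_rule(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: fitness(r, records, labels), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = {**dict(list(a.items())[:1]), **dict(list(b.items())[1:])}  # crossover
            if rng.random() < 0.1:                                              # mutation
                child[rng.choice(FEATURES)] = rng.uniform(0.0, 1.0)
            children.append(child)
        pop = parents + children
    return pop[0]                            # best rule found for the target class
```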

Relevance: 100.00%

Abstract:

In this paper, a fuzzy Markov random field (FMRF) model is used to segment land objects into free, grass, building, and road regions by fusing remotely sensed LIDAR data and co-registered color bands, i.e. a scanned aerial color (RGB) photo and a near-infrared (NIR) photo. An FMRF model is defined as a Markov random field (MRF) model in a fuzzy domain. Three optimization algorithms for the FMRF model, i.e. Lagrange multiplier (LM), iterated conditional modes (ICM), and simulated annealing (SA), are compared with respect to computational cost and segmentation accuracy. The results show that the FMRF model-based ICM algorithm balances computational cost and segmentation accuracy in land-cover segmentation from LIDAR data and co-registered bands.
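
For readers unfamiliar with ICM, the sketch below shows the crisp (non-fuzzy) version of iterated conditional modes on a 4-connected grid with a Potts smoothness term. The fuzzy membership formulation used in the paper is more involved; this is only the baseline idea, and the unary costs stand in for per-pixel evidence from LIDAR height and RGB/NIR features.

```python
import numpy as np

def icm_segment(unary, n_iter=5, beta=1.0):
    """Iterated conditional modes on a grid MRF.
    unary: (H, W, K) per-pixel cost of each of K labels (e.g. -log likelihoods);
    beta: weight of the Potts penalty for neighbours taking different labels."""
    H, W, K = unary.shape
    labels = unary.argmin(axis=-1)                    # initialise from the data term alone
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                costs = unary[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        costs += beta * (np.arange(K) != labels[ni, nj])
                labels[i, j] = costs.argmin()         # local, greedy label update
    return labels
```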

Relevance: 100.00%

Abstract:

An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insight into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms mainly based on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
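
To make the notion of "sets of stochastic rules" concrete, here is a toy sketch of local-rule dendritic growth: each segment samples a length and either terminates or bifurcates, with daughter diameters tapering until a minimum is reached. The distributions and parameter values are illustrative assumptions, not those measured for the motoneuron or Purkinje cell classes.

```python
import random

def grow_dendrite(diameter, rng, branch_prob=0.4, taper=0.8, min_diam=0.3):
    """Recursively sample a binary dendritic tree from simple local stochastic rules."""
    segment = {
        "diam": diameter,
        "len": max(rng.gauss(20.0, 5.0), 1.0),        # segment length in micrometres (toy values)
        "children": [],
    }
    if diameter > min_diam and rng.random() < branch_prob:
        for _ in range(2):                            # bifurcation into two daughters
            segment["children"].append(grow_dendrite(diameter * taper, rng))
    return segment

def total_length(node):
    """One emergent property: summed length over all segments of the tree."""
    return node["len"] + sum(total_length(child) for child in node["children"])

tree = grow_dendrite(diameter=2.0, rng=random.Random(0))
print(total_length(tree))
```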