907 results for distributed combination of classifiers
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. This is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data that is used to populate the second component, and a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories — this is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.
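As a purely conceptual illustration of the "move the analysis program to the data" idea described above, the sketch below copies a script to each repository site and runs it there, returning only the results. It uses plain scp/ssh rather than the actual grid middleware behind P-found, and the hostnames, paths and script name are all hypothetical.

```python
# Conceptual sketch only: ship the analysis program to the site that holds the
# simulation data, instead of centralising the (very large) data itself.
import subprocess

REPOSITORIES = ["repo1.example.org", "repo2.example.org"]   # hypothetical sites

def run_remote_analysis(host, script="analyse_folding.py", data_dir="/pfound/data"):
    # Copy the analysis program to the remote repository site...
    subprocess.run(["scp", script, f"{host}:/tmp/{script}"], check=True)
    # ...and execute it there, returning only the (small) textual results.
    result = subprocess.run(
        ["ssh", host, f"python3 /tmp/{script} --data {data_dir}"],
        check=True, capture_output=True, text=True,
    )
    return result.stdout

if __name__ == "__main__":
    for site in REPOSITORIES:
        print(site, run_remote_analysis(site)[:200])
```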
Abstract:
A novel combination of site-specific isotope labelling, polarised infrared spectroscopy and molecular combing reveals local orientational ordering in the fibril-forming peptide YTIAALLSPYSGGRADS. Use of ¹³C-¹⁸O labelled alanine residues demonstrates that the N-terminal end of the peptide is incorporated into the cross-beta structure, while the C-terminal end shows orientational disorder.
Abstract:
An alternating hexameric water cluster (H₂O)₆ and a chlorine-water cluster [Cl₂(H₂O)₄]²⁻, both in chair forms, combine axially with each other to form a 1D chain [{Cl₂(H₂O)₆}²⁻]ₙ in the complex [FeL₂]Cl·(H₂O)₃ (L = 2-[(2-methylaminoethylimino)-methyl]-phenol). The water molecules display extensive H-bonding interactions with the monomeric iron-organic units to form a hydrogen-bonded 2D supramolecular assembly.
Abstract:
Mobile Network Optimization (MNO) technologies have advanced at a tremendous pace in recent years. The Dynamic Network Optimization (DNO) concept emerged years ago, aiming to continuously optimize the network in response to variations in network traffic and conditions. Yet DNO development is still in its infancy, mainly hindered by the significant bottleneck of lengthy optimization runtimes. This paper identifies parallelism in greedy MNO algorithms and presents an advanced distributed parallel solution. The solution is designed, implemented and applied to real-life projects, whose results yield a significant, highly scalable and nearly linear speedup of up to 6.9 and 14.5 on distributed 8-core and 16-core systems respectively. Meanwhile, the optimization outputs exhibit self-consistency and high precision compared with their sequential counterparts. This is a milestone in realizing DNO. Furthermore, the techniques may be applied to other applications based on similar greedy optimization algorithms.
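The core pattern this kind of work exploits, evaluating a greedy algorithm's candidate moves in parallel while committing them sequentially, can be sketched as follows. This is a generic illustration with placeholder cost and candidate definitions, not the authors' MNO algorithm.

```python
# Illustrative sketch only: a generic greedy loop whose candidate-evaluation
# step is parallelised across worker processes.
from multiprocessing import Pool

def evaluate(args):
    """Score one candidate change against the current state (placeholder cost)."""
    state, candidate = args
    return candidate, sum(abs(s - candidate) for s in state)

def greedy_optimise(state, candidates, steps=10, workers=8):
    with Pool(workers) as pool:
        for _ in range(steps):
            # Parallel part: score all candidate moves independently.
            scored = pool.map(evaluate, [(state, c) for c in candidates])
            best, _cost = min(scored, key=lambda x: x[1])
            # Sequential part: commit only the single best move per iteration,
            # preserving the greedy decision order.
            state = state + [best]
    return state

if __name__ == "__main__":
    print(greedy_optimise([1.0, 2.0, 3.0], [0.5, 1.5, 2.5], steps=3))
```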
Abstract:
Years have passed since the introduction of the Dynamic Network Optimization (DNO) concept, yet DNO development is still at an early stage, largely due to the lack of a breakthrough in reducing the lengthy optimization runtime. Our previous work, a distributed parallel solution, achieved a significant speed gain. To cater for the increased optimization complexity driven by the uptake of smartphones and tablets, however, this paper examines the potential areas for further improvement and presents a novel asynchronous distributed parallel design that minimizes inter-process communications. The new approach is implemented and applied to real-life projects, whose results demonstrate an improved speedup of 7.5 times on a 16-core distributed system, compared with 6.1 times for our previous solution. Moreover, there is no degradation in the optimization outcome. This is a solid stride towards the realization of DNO.
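As a rough illustration of the asynchronous idea described above, the sketch below lets worker processes report results through a shared queue as they finish, instead of synchronizing at a barrier after every iteration. The partitioning and the "local optimisation" are placeholders, not the paper's actual design.

```python
# Illustrative sketch only: workers optimise their own partitions independently
# and post results once, minimising inter-process communication.
from multiprocessing import Process, Queue

def worker(worker_id, chunk, results):
    local_best = min(chunk)          # placeholder for a local optimisation
    results.put((worker_id, local_best))

if __name__ == "__main__":
    data = [[9, 4, 7], [3, 8, 5], [6, 2, 1], [10, 11, 0]]
    results = Queue()
    procs = [Process(target=worker, args=(i, chunk, results))
             for i, chunk in enumerate(data)]
    for p in procs:
        p.start()
    # Asynchronous collection: handle whichever result arrives first.
    collected = [results.get() for _ in data]
    for p in procs:
        p.join()
    print(min(collected, key=lambda r: r[1]))
```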
Abstract:
In this paper, we develop a novel constrained recursive least squares algorithm for adaptively combining a set of given multiple models. With data available in an online fashion, the linear combination coefficients of the submodels are adapted via the proposed algorithm. We propose to minimize the mean square error with a forgetting factor, and apply a sum-to-one constraint to the combination parameters. Moreover, an l1-norm constraint on the combination parameters is also applied, with the aim of achieving sparsity across the multiple models so that only a subset of models may be selected into the final model. A weighted l2-norm is then applied as an approximation to the l1-norm term, so that at each time step a closed-form solution for the model combination parameters is available. The contribution of this paper is the derivation of the proposed constrained recursive least squares algorithm, which is computationally efficient by exploiting matrix theory. The effectiveness of the approach is demonstrated using both simulated and real time series examples.
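A simplified sketch of the general scheme, assuming nothing beyond the abstract: plain RLS with a forgetting factor estimates the combination weights, and each update is followed by a Euclidean projection onto the sum-to-one constraint. The paper's l1-style sparsity term (via a weighted l2 approximation) and its closed-form constrained update are not reproduced here.

```python
import numpy as np

class CombinerRLS:
    """Recursive least squares combiner with forgetting and sum-to-one projection."""
    def __init__(self, n_models, forgetting=0.98, delta=100.0):
        self.w = np.full(n_models, 1.0 / n_models)   # combination weights
        self.P = delta * np.eye(n_models)            # inverse correlation matrix
        self.lam = forgetting

    def update(self, preds, y):
        x = np.asarray(preds, dtype=float)
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)                 # gain vector
        self.w = self.w + k * (y - x @ self.w)       # weight update
        self.P = (self.P - np.outer(k, Px)) / self.lam
        # Project back onto the sum-to-one hyperplane (simplification).
        self.w = self.w + (1.0 - self.w.sum()) / len(self.w)
        return self.w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    comb = CombinerRLS(n_models=3)
    for t in range(200):
        truth = np.sin(0.1 * t)
        preds = truth + rng.normal(0, [0.05, 0.2, 0.5])  # three noisy submodels
        w = comb.update(preds, truth)
    print(np.round(w, 3))   # weights should favour the least noisy submodel
```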
Abstract:
Five new species of Paepalanthus section Diphyomene are described and illustrated: P. brevis, P. flexuosus, P. longiciliatus, P. macer, and P. stellatus. Paepalanthus brevis, similar to P. decussus, is easily distinguished by its short reproductive axis and pilose, mucronate leaves. Paepalanthus flexuosus, morphologically related to P. urbanianus, possesses a distinctive short and tortuous reproductive axis. Paepalanthus longiciliatus, morphologically similar to P. weddellianus, possesses long trichomes on the margins of the reproductive-axis bracts, considered a diagnostic feature. Paepalanthus macer shares similarities with P. amoenus, differing by its sulfurous capitula and adpressed reproductive-axis bracts. Paepalanthus stellatus also has affinity with P. decussus, but possesses unique, membranaceous reproductive-axis bracts and a punctual inner-capitulum arrangement of pistillate flowers. Four of the described species are narrowly distributed in the state of Goiás, whereas P. brevis is endemic to the Distrito Federal. All are considered critically endangered. Detailed comparisons of these species are presented in tables. Comments on phenology, distribution, habitat and etymology, along with an identification key, are provided.
Abstract:
Photodynamic therapy, used mainly for cancer treatment and microorganism inactivation, is based on the production of reactive oxygen species by light irradiation of a sensitizer. Hematoporphyrin derivatives such as Photofrin® (PF), Photogem® (PG) and Photosan® (PS), and chlorin-e6 derivatives such as Photodithazine® (PZ), have suitable sensitizing properties. The present study provides a way to make a fast preliminary evaluation of photosensitizer efficacy by a combination of techniques: a) use of bovine serum albumin and uric acid as chemical dosimeters; b) photo-hemolysis of red blood cells used as a cell membrane interaction model; and c) octanol/phosphate buffer partition to assess the relative lipophilicity of the compounds. The results suggest the photodynamic efficiency ranking PZ > PG ≥ PF > PS. These results agree with the cytotoxicity of the photosensitizers, as well as with the chromatographic separation of the HpDs, both performed in our group, showing that the more lipophilic the dye, the more acute the damage to the RBC membrane and the oxidation of the indole group, which is immersed in the hydrophobic region of albumin.
Abstract:
The fragmentation mechanisms of singlet oxygen [O₂(¹Δg)]-derived oxidation products of tryptophan (W) were analyzed using collision-induced dissociation coupled with ¹⁸O-isotopic labeling experiments and accurate mass measurements. The five identified oxidation products, namely two isomeric alcohols (trans and cis WOH), two isomeric hydroperoxides (trans and cis WOOH), and N-formylkynurenine (FMK), were shown to share some common fragment ions and losses of small neutral molecules. Conversely, each oxidation product has its own fragmentation mechanism and intermediates, which were confirmed by ¹⁸O-labeling studies. The isomeric WOHs lost mainly H₂O + CO, while the WOOHs showed preferential elimination of C₂H₅NO₃ by two distinct mechanisms. Differences in the spatial arrangement of the two isomeric WOHs led to differences in the intensities of the fragment ions; the same behavior was also found for trans and cis WOOH. FMK was shown to dissociate by a diverse range of mechanisms, with the loss of ammonia being the most favored route. MS/MS analyses, ¹⁸O-labeling, and H₂¹⁸O experiments demonstrated the ability of FMK to exchange its oxygen atoms with water. Moreover, this approach also revealed that the carbonyl group has a more pronounced oxygen exchange ability than the formyl group. The understanding of the fragmentation mechanisms involved in O₂(¹Δg)-mediated oxidation of W provides a useful step toward the structural characterization of oxidized peptides and proteins. (J Am Soc Mass Spectrom 2009, 20, 188-197)
Abstract:
Polynorbornene with high molecular weight was obtained via ring-opening metathesis polymerization using catalysts derived from complexes of the type [RuCl₂(PPh₂Bz)₂L] (1 for L = PPh₂Bz; 2 for L = piperidine) in the presence of ethyl diazoacetate in CHCl₃. The polymer precipitated within a few minutes at 50 °C when using 1, with ca. 50% yield ([NBE]/[Ru] = 5000). Regarding 2, yields of more than 90% are obtained after either 30 min at 25 °C or 5 min at 50 °C, and a quantitative yield is obtained at 50 °C for 30 min. The yield and PDI values are sensitive to the [NBE]/[Ru] ratio. The reaction of 1 with either isonicotinamide or nicotinamide produces six-coordinate complexes of the type [RuCl₂(PPh₂Bz)₂(L)₂], which are almost inactive and produce only small amounts of polymer at 50 °C for 30 min. Thus, we conclude that the novel complexes show very distinct reactivities for ROMP of NBE. This has been rationalized in terms of a combination of synergistic effects of the phosphine and amine ancillary ligands.
Abstract:
The pulp and paper industry is a very energy-intensive sector, and both Sweden and the U.S. are major pulp and paper producers. This report examines the energy use and CO2 emissions connected with the pulp and paper industry in the two countries from a lifecycle perspective. New technologies make it possible to increase electricity production in the integrated pulp and paper mill through black liquor gasification and a combined cycle (BLGCC). That way, the mill can produce excess electricity, which can be sold and replace electricity produced in power plants. In this process the by-products formed during pulp making are used as fuel to produce electricity. In today's pulp and paper mills, the technology for generating energy from the by-products in a Tomlinson boiler is not as efficient as the BLGCC technology. Scenarios have been designed to investigate the results of using the BLGCC technique within a life cycle analysis. Two scenarios are represented by a 1994 mill in the U.S. and a 1994 mill in Sweden; these are based on the average energy intensity of pulp and paper mills operating in 1994 in the U.S. and Sweden, respectively. The two other scenarios are constituted by a »reference mill« in the U.S. and Sweden using state-of-the-art technology. We investigate the impact of varying recycling rates on total energy use and CO2 emissions from the production of printing and writing paper. To economize on wood and thereby save trees, the trees that are replaced by recycling can be used in a biomass gasification combined cycle (BIGCC) to produce electricity in a power station. This produces extra electricity with a lower CO2 intensity than electricity generated by, for example, coal-fired power plants. The lifecycle analysis in this thesis also includes waste treatment in the paper lifecycle. Both Sweden and the U.S. recycle paper, yet a lot of paper waste remains; this paper forms part of the countries' municipal solid waste (MSW). Much of the MSW is landfilled, but parts of it are incinerated to generate electricity. The thesis has designed special scenarios for the use of MSW in the lifecycle analysis. The report studies and compares two different countries and two different BLGCC efficiencies in four different scenarios. This gives a wide survey and points to essential parameters to reflect on specifically when making assumptions in a lifecycle analysis. The report shows that there are three key parameters that have to be carefully considered when making a lifecycle analysis of wood from an energy and CO2-emission perspective for pulp and paper mills in the U.S. and Sweden: first, the energy efficiency of the pulp and paper mill; then the efficiency of the BLGCC; and last, the CO2 intensity of the electricity displaced by BIGCC- or BLGCC-generated electricity. It also shows that, with the technology available today, it is possible to produce CO2-free paper with a waste-paper content of up to 30%. The thesis discusses the system boundaries and the assumptions. Further and more detailed research, including among other things the system boundaries and forestry, is recommended for more specific answers.
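To make the interplay of those three key parameters concrete, the toy calculation below combines assumed values for by-product energy, BLGCC efficiency, mill electricity demand and the CO2 intensity of displaced grid electricity into an avoided-emissions figure. All numbers are made up for illustration and do not come from the thesis.

```python
# Minimal illustrative calculation with assumed numbers, only to show how the
# key parameters interact; none of these values are taken from the thesis.
byproduct_energy = 18.0      # GJ black-liquor energy per tonne of paper (assumed)
blgcc_efficiency = 0.30      # electrical efficiency of the BLGCC (assumed)
mill_electricity = 2.5       # GJ electricity the mill itself consumes per tonne (assumed)
grid_intensity = 0.20        # tonne CO2 per GJ of displaced grid electricity (assumed)

exported = byproduct_energy * blgcc_efficiency - mill_electricity   # GJ/tonne sold
avoided = max(exported, 0.0) * grid_intensity                       # t CO2 per tonne paper
print(f"Exported electricity: {exported:.1f} GJ/t, avoided CO2: {avoided:.2f} t/t")
```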
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model used to build extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- object-oriented frameworks are extensible by design, so this should also be true regarding the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation;
- the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings;
- the control of the consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. Such a mechanism is generic enough to also be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view.
Our approach took advantage of such inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency;
- to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows for late synchronization between view and semantics in case of unavailability of a network connection between them;
- the use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal during runtime;
- the implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants the independence between the CAD Framework and the operating system.
All such improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions, design representation primitives and tool building blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integrating multimedia metadata into the design data model. Such a possibility is explored in the frame of an online educational and training platform.
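As a minimal sketch of the inversion-of-control idea described above (in Python for brevity; the thesis's Cave2 framework is Java-based), a view forwards each user event to the semantic model, which validates the state change and refreshes every attached view. The class and method names are illustrative, not Cave2's actual API.

```python
# Minimal sketch: the view never mutates state directly; it only raises an
# event, and the semantic model decides whether the change is consistent.
class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def handle_event(self, event):
        key, value = event
        if self._is_valid(key, value):          # semantics decides, not the view
            self.state[key] = value
            for view in self.views:             # propagate to all attached views
                view.refresh(key, value)

    def _is_valid(self, key, value):
        return value is not None                # placeholder consistency rule

class View:
    def __init__(self, name, model):
        self.name = name
        self.model = model
        model.attach(self)

    def user_input(self, key, value):
        self.model.handle_event((key, value))   # forward the interaction as an event

    def refresh(self, key, value):
        print(f"{self.name}: {key} -> {value}")

model = SemanticModel()
schematic = View("schematic", model)
layout = View("layout", model)
schematic.user_input("net_width", 4)            # valid: both views are refreshed
layout.user_input("net_width", None)            # invalid: rejected by the model
```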
Abstract:
Industrial companies in developing countries are facing rapid growth, and this requires having in place the best organizational processes to cope with market demand. Sales forecasting, as a tool aligned with the general strategy of the company, needs to be as accurate as possible in order to achieve the sales targets, making the right information available for purchasing, planning and control of production, and, finally, meeting the generated demand on time and in full. The present dissertation uses a single case study from the Brazilian subsidiary of an international explosives company, Maxam, which is experiencing high growth in sales and therefore faces the challenge of adapting its structure and processes to the rapid growth expected. Diverse sales forecasting techniques have been analyzed to compare the actual monthly sales forecast, based on the sales representatives' market knowledge, with forecasts based on the analysis of historical sales data. The dissertation findings show how combining qualitative and quantitative forecasts, through a combined forecast that brings together the sales force's knowledge of client demand with time series analysis, improves the accuracy of the company's sales forecast.
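One simple way to realize such a combined forecast, sketched below under assumptions that are not Maxam's actual method, is to weight the judgmental and statistical forecasts by their recent accuracy.

```python
# Illustrative sketch only: blend a sales-force (judgmental) forecast with a
# statistical forecast, weighting each by its recent accuracy. The weighting
# scheme and all numbers are assumptions.
import numpy as np

def combine(judgmental, statistical, past_errors_j, past_errors_s):
    # Inverse mean-absolute-error weights: the more accurate source gets more weight.
    mae_j = np.mean(np.abs(past_errors_j))
    mae_s = np.mean(np.abs(past_errors_s))
    w_j = (1 / mae_j) / (1 / mae_j + 1 / mae_s)
    return w_j * judgmental + (1 - w_j) * statistical

forecast = combine(
    judgmental=1200.0,                      # sales representatives' estimate
    statistical=1050.0,                     # e.g. an exponential-smoothing output
    past_errors_j=[150, -120, 90],          # historical forecast errors (units)
    past_errors_s=[60, -40, 55],
)
print(round(forecast, 1))
```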
Abstract:
In this dissertation, different ways of combining neural predictive models, or neural-based forecasts, are discussed. The proposed approaches mostly consider Gaussian radial basis function networks, which can be efficiently identified and estimated through recursive/adaptive methods. Two different ways of combining are explored to obtain a final estimate, model mixing and model synthesis, with the aim of achieving improvements in both efficiency and effectiveness. In the context of model mixing, the usual framework for linearly combining estimates from different models is extended to deal with the case where the forecast errors from those models are correlated. In the context of model synthesis, and to address the problems raised by heavily nonstationary time series, we propose hybrid dynamic models for more advanced time series forecasting, composed of a dynamic trend regression model (or even a dynamic harmonic regression model) and a Gaussian radial basis function network. Additionally, using the model mixing procedure, two approaches for decision-making from forecasting models are discussed and compared: either inferring decisions from combined predictive estimates, or combining prescriptive solutions derived from different forecasting models. Finally, the application of some of the models and methods proposed previously is illustrated with two case studies, based on time series from finance and from tourism.
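For the model-mixing case with correlated forecast errors, the classical covariance-based weighting gives a flavour of the approach: combination weights are derived from the full error covariance matrix rather than from individual error variances alone. The sketch below illustrates that standard scheme, not necessarily the exact extension derived in the dissertation.

```python
# Covariance-based model mixing in the Bates-Granger style: weights come from
# the (possibly non-diagonal) covariance of past forecast errors, so the
# correlation between models' errors is taken into account.
import numpy as np

def mixing_weights(errors):
    """errors: (T, M) array of past forecast errors from M models."""
    sigma = np.cov(errors, rowvar=False)         # M x M error covariance
    ones = np.ones(sigma.shape[0])
    w = np.linalg.solve(sigma, ones)             # Sigma^{-1} 1
    return w / w.sum()                           # normalise so weights sum to one

rng = np.random.default_rng(1)
common = rng.normal(0, 1.0, 500)                 # shared error component -> correlation
errors = np.column_stack([
    common + rng.normal(0, 0.3, 500),            # model 1
    common + rng.normal(0, 0.6, 500),            # model 2
    rng.normal(0, 1.2, 500),                     # model 3, independent but noisier
])
w = mixing_weights(errors)
combined = errors @ w                            # error of the combined forecast
print(np.round(w, 3), round(combined.var(), 3))
```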