963 results for Wood Preservation
Abstract:
Environmental inputs can improve the level of innovation by interconnecting them with traditional inputs regarding the properties of materials and processes, as a strategic eco-design procedure. Advanced engineered polymer composites are needed to meet the diverse needs of users for high-performance automotive, construction and commodity products while simultaneously maximizing the sustainability of forest resources. In the current work, wood polymer composites (WPC) are studied to promote long-term resource sustainability and to decrease environmental impacts relative to those of existing products. A series of polypropylene wood-fiber composite materials containing 20, 30, 40 and 50 wt.% wood fibers were prepared using a twin-screw extruder and an injection molding machine. Tensile and flexural properties of the composites were determined. The polypropylene (PP) matrix used in this study is a recyclable thermoplastic. The suitability of the prepared composites as sustainable products is discussed.
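The composition dependence studied above can be sketched with a simple rule-of-mixtures estimate. This is not part of the abstract: the moduli below are illustrative, and using weight fraction directly in place of volume fraction is a simplifying assumption (a proper estimate would convert via fiber and matrix densities).

```python
def rule_of_mixtures(E_fiber, E_matrix, v_fiber):
    """Upper-bound (Voigt) estimate of the composite tensile modulus."""
    return v_fiber * E_fiber + (1.0 - v_fiber) * E_matrix

# Illustrative stiffness values (GPa): wood fiber ~ 10, PP matrix ~ 1.5
for wf in (0.2, 0.3, 0.4, 0.5):
    print(f"{wf:.0%} fiber: E ~ {rule_of_mixtures(10.0, 1.5, wf):.2f} GPa")
```

The Voigt bound assumes perfect fiber-matrix bonding and loading parallel to the fibers; measured values for injection-molded WPC typically fall below it.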
Abstract:
Metal-organic frameworks (MOFs) and boron nitride both possess notable properties, the former associated with microporosity and the latter with good mechanical properties. We have synthesized composites of the imidazolate-based MOF ZIF-8 and few-layer BN in order to see whether the properties of both materials can be incorporated in the composites. The composites so prepared between BN nanosheets and ZIF-8 have the compositions ZIF-1BN, ZIF-2BN, ZIF-3BN and ZIF-4BN. The composites have been characterized by PXRD, TGA, XPS, electron microscopy, IR, Raman and solid-state NMR spectroscopy. The composites possess good surface areas, the actual value decreasing only slightly with increasing BN content. The CO2 uptake remains nearly the same in the composites as in the parent ZIF-8. More importantly, the addition of BN markedly improves the mechanical properties of ZIF-8, a feature that is much desired in MOFs. The observation of microporous features along with improved mechanical properties in a MOF is indeed noteworthy. Such manipulation of properties can be profitably exploited in practical applications.
Abstract:
The disclosure of information and its misuse in Privacy Preserving Data Mining (PPDM) systems are concerns to the parties involved. In PPDM systems, data are available amongst multiple parties collaborating to achieve cumulative mining accuracy. The vertically partitioned data available with the individual parties cannot provide mining results as accurate as collaborative mining. To overcome the privacy issue in data disclosure, this paper describes a Key Distribution-Less Privacy Preserving Data Mining (KDLPPDM) system in which the local association rules generated by the parties are published. The association rules are securely combined into a combined rule set using the Commutative RSA algorithm. The combined rule sets are then used to classify or mine the data. The results discussed in this paper compare the accuracy of the rules generated using the C4.5-based KDLPPDM system and the C5.0-based KDLPPDM system using receiver operating characteristic (ROC) curves.
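The key property that lets parties combine encrypted rule sets without distributing keys is that the encryptions commute. The abstract does not give the paper's exact Commutative RSA construction; the following is a toy Pohlig-Hellman/SRA-style sketch of the general principle, with all parameters illustrative (a real deployment needs a large safe prime and proper padding).

```python
# Commutative cipher over a shared prime p: E_a(E_b(m)) == E_b(E_a(m))
# because exponentiation exponents commute modulo p.
p = 2**61 - 1  # shared prime (toy size, chosen so small keys are invertible mod p-1)

def keygen(e):
    # e must be coprime to p-1; d is the matching decryption exponent
    d = pow(e, -1, p - 1)  # modular inverse (Python 3.8+)
    return e, d

def enc(m, e):
    return pow(m, e, p)

def dec(c, d):
    return pow(c, d, p)

ea, da = keygen(65537)   # party A's key pair
eb, db = keygen(257)     # party B's key pair
m = 123456789
# Order of encryption does not matter:
assert enc(enc(m, ea), eb) == enc(enc(m, eb), ea)
# Decrypting with both keys (in either order) recovers the message:
assert dec(dec(enc(enc(m, ea), eb), da), db) == m
```

Because the doubly encrypted values are identical regardless of encryption order, each party can encrypt the union of rule sets with its own key and compare ciphertexts without ever exchanging keys.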
Abstract:
This study addresses propagation front movement in a co-current downdraft gasification system. A detailed single-particle modeling analysis, extended to the packed bed reactor, is compared with experimental measurements as well as those available in the literature. The model for biomass gasification systems considers the pyrolysis process, gas-phase volatile combustion, and heterogeneous char reactions along with gas-phase reactions in the packed bed. The pyrolysis kinetics has a critical influence on the gasification process. The propagation front rate has been shown to increase with air mass flux, attain a peak, and then decrease with further increase in air mass flux, finally approaching a negative propagation rate. This indicates that the front is receding: rather than moving upward, it moves downward towards the char bed. The propagation rate correlates with mass flux as ṁ^0.883 during the increasing regime of the front movement. The study clearly identifies bed movement as an important parameter to consider in a co-current configuration towards establishing effective bed movement. The study also highlights the importance of the surface-area-to-volume ratio of the particles in the packed bed and its influence on volatile generation. Finally, the gas composition for air gasification under various air mass fluxes is compared with experimental results. (C) 2016 Elsevier B.V. All rights reserved.
Abstract:
An elastic-plastic constitutive model for transversely isotropic compressible solids (foams) has been developed. A quadratic yield surface with four parameters and one hardening function is proposed. Associated plastic flow is assumed, and the yield surface evolves in a self-similar manner calibrated by the uniaxial compressive (or tensile) response of the cellular solid in the axial direction. All material constants in the model (elastic and plastic) can be determined from a total of four uniaxial and shear tests. The model is used to predict the indentation response of balsa wood to a conical indenter. For the three cone angles considered in this study, very good agreement is found between the experimental measurements and the finite element (FE) predictions of the transversely isotropic cellular solid model. On the other hand, an isotropic foam model is shown to be inadequate to capture the indentation response. © 2005 Elsevier Ltd. All rights reserved.
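The kind of four-parameter quadratic yield surface described can be sketched as follows. The abstract does not give the paper's actual parameterization or hardening function, so the functional form, parameter names, and numbers below are hypothetical illustrations of a transversely isotropic quadratic surface.

```python
import math

def yield_function(sig, a, sigma_y):
    """Hypothetical quadratic yield surface for a transversely isotropic foam
    (axis of symmetry = direction 3).
    sig     = (s11, s22, s33, s12, s13, s23), Cauchy stress components
    a       = (a1, a2, a3, a4), the four shape parameters
    sigma_y = current flow stress from the hardening function
    Returns f; f <= 0 means the stress state is inside the elastic domain."""
    s11, s22, s33, s12, s13, s23 = sig
    a1, a2, a3, a4 = a
    q = math.sqrt(a1 * s33**2
                  + a2 * (s11**2 + s22**2)
                  + a3 * (s13**2 + s23**2)
                  + a4 * s12**2)
    return q - sigma_y

# Uniaxial stress along the symmetry axis with a1 = 1 reduces to |s33| - sigma_y:
assert abs(yield_function((0, 0, 2.0, 0, 0, 0), (1.0, 2.0, 1.5, 1.5), 3.0) - (-1.0)) < 1e-12
```

Self-similar evolution, as in the abstract, would then amount to growing sigma_y with accumulated plastic strain while the shape parameters a1-a4 stay fixed.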
Abstract:
[EUS] Under the old regime, the forest was the living space of daily life. In that era most activities revolved around the forest: human food, animal feed, hunting, household fires, the charcoal industry, construction, agriculture, household and technical implements, shipbuilding, etc. Consequently, the evolution of the forest was closely tied to these activities. In Gipuzkoa and Bizkaia the conflict between these activities over the forest was extremely fierce. Today things have changed, and the forest no longer has such economic importance. At present, interest in the forest lies in the conservation and destruction of nature.
Abstract:
This paper first reviews methods for treating low-speed rarefied gas flows: the linearised Boltzmann equation, the Lattice Boltzmann method (LBM), the Navier-Stokes equation with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low-speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty of statistical simulation caused by the small information-to-noise ratio in low-speed flows by preserving the average information of the enormous number of molecules that each simulated molecule represents. A validation of the method is given in this paper. The specificities of internal flows in MEMS, i.e. the low speed and the large length-to-width ratio, result in a problem of elliptic nature: the inlet and outlet boundary conditions influence each other and must be regulated together. Through the example of the IP calculation of microchannel flow (thousands of μm long), it is shown that adopting a conservative scheme for the mass conservation equation together with a super-relaxation method resolves this problem successfully. With the same measures, the IP method solves the thin-film air bearing problem in the transitional regime for an authentic hard disc write/read head length (L = 1000 μm) and provides a pressure distribution in full agreement with the generalized Reynolds equation, whereas previously the DSMC check of the validity of the Reynolds equation had been done only for a short (L = 5 μm) drive head. The author suggests using the degenerated Reynolds equation to solve the microchannel flow problem in the transitional regime, thus providing a means, with the merit of strict kinetic theory, for testing various methods intended to treat internal MEMS flows.
Abstract:
Onset and evolution of the Rayleigh-Bénard (R-B) convection are investigated using the Information Preservation (IP) method. The information velocity and temperature are updated using the Octant Flux Splitting (OFS) model developed by Masters & Ye based on the Maxwell transport equation suggested by Sun & Boyd. Statistical noise inherent in particle approaches such as the direct simulation Monte Carlo (DSMC) method is effectively reduced by the IP method, and therefore the evolution from an initial quiescent fluid to a final steady state is shown clearly. An interesting phenomenon is observed: when the Rayleigh number (Ra) exceeds its critical value, there exists an obvious incubation stage. During the incubation stage, the vortex structure clearly appears and evolves, whereas the Nusselt number (Nu) of the lower plate remains close to unity. After the incubation stage, the vortex velocity and Nu rapidly increase, and the flow field quickly reaches a steady, convective state. The relation of Nu to Ra given by IP agrees with those given by DSMC, the classical theory and experimental data.
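The onset criterion referred to above can be illustrated with the classical Rayleigh number. The definition and the critical value of about 1708 (for rigid-rigid boundaries) are standard results, not taken from the abstract, and the fluid properties below are merely illustrative.

```python
def rayleigh(g, beta, dT, L, nu, alpha):
    """Rayleigh number for a fluid layer of depth L heated from below.
    g: gravity, beta: thermal expansion coefficient, dT: temperature
    difference, nu: kinematic viscosity, alpha: thermal diffusivity."""
    return g * beta * dT * L**3 / (nu * alpha)

RA_CRITICAL = 1708.0  # classical linear-stability value, rigid-rigid boundaries

def convects(ra):
    """True when buoyancy overcomes viscous and diffusive damping."""
    return ra > RA_CRITICAL

# Illustrative: air-like properties across a 10 mm gap
ra = rayleigh(g=9.81, beta=3.4e-3, dT=5.0, L=0.01, nu=1.5e-5, alpha=2.1e-5)
print(f"Ra = {ra:.0f}, convection onset: {convects(ra)}")
```

In the rarefied regime studied by the IP/DSMC simulations, the continuum critical value is only an approximate guide, which is why the particle-method comparison in the abstract is of interest.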
Abstract:
In contrast to cost modeling activities, the pricing of services must be simple and transparent. Calculating, and thus knowing, price structures would not only help identify the level of detail required for cost modeling by individual institutions, but also help develop a "public" market for services, as well as clarify the division of tasks and the modeling of funding and revenue streams for data preservation by public institutions. This workshop built on the results of the workshop "The Costs and Benefits of Keeping Knowledge", which took place 11 June 2012 in Copenhagen. This expert workshop aimed at:
• Identifying ways for data repositories to abstract from their complicated cost structures and arrive at one transparent pricing structure which can be aligned with available and plausible funding schemes. Those repositories will probably need a stable institutional funding stream for data management and preservation. Are there any estimates for this, absolute or as a percentage of overall cost? Part of the revenue will probably have to come through data management fees upon ingest. How could that be priced? Per dataset, per GB, or as a percentage of research cost? Will it be necessary to charge access prices, even though they contradict the open science paradigm?
• What are the price components for pricing individual services, and which prices are currently being paid, e.g. to commercial providers? What are the descriptions and conditions of the service(s) delivered and guaranteed?
• What types of risks are inherent in these pricing schemes?
• How can services and prices be defined in an all-inclusive and simple manner, so as to enable researchers to apply for a specific amount when asking for funding of data-intensive projects?
Abstract:
Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasizes the need for robust cost modelling. Such models should enable these institutions to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. To justify the costs, institutions also need to describe the expected benefits of preserving digital information. This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges of preserving research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place in the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.
Abstract:
23 p. -- An extended abstract of this work appears in the proceedings of the 2012 ACM/IEEE Symposium on Logic in Computer Science
Abstract:
The traditional approach to fish handling, preservation and processing technology in inland fisheries is critically examined, using the experience at Kainji Lake as a model. The need to uplift the fishermen's technology is emphasized, with the ultimate expectation of improvement in fish quality.