967 results for Preservation


Relevance:

20.00%

Publisher:

Abstract:

This paper first reviews methods for treating low-speed rarefied gas flows: the linearised Boltzmann equation, the Lattice Boltzmann method (LBM), the Navier-Stokes equation with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low-speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty of statistical simulation caused by the small information-to-noise ratio of low-speed flows by preserving the average information of the enormous number of molecules that each simulated molecule represents. A validation of the method is given in this paper. The specific features of internal flows in MEMS, namely the low speed and the large length-to-width ratio, give rise to a problem of elliptic nature: the inlet and outlet boundary conditions influence each other and must be regulated together. Through the example of an IP calculation of flow in a microchannel thousands of μm long, it is shown that adopting the conservative scheme of the mass conservation equation together with the super-relaxation method resolves this problem successfully. With the same measures, the IP method solves the thin-film air bearing problem in the transitional regime for an authentic hard-disc write/read head length (L = 1000 μm) and yields a pressure distribution in full agreement with the generalized Reynolds equation, whereas previous DSMC checks of the validity of the Reynolds equation had been made only for a short (L = 5 μm) drive head. The author suggests degenerating the Reynolds equation to solve the microchannel flow problem in the transitional regime, thus providing a means, with the merit of strict kinetic theory, for testing various methods intended to treat internal MEMS flows.
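A minimal sketch can illustrate why the information-to-noise ratio is so small for low-speed flows, and why preserving averaged information helps. All numbers below are illustrative assumptions, not values from the paper:

```python
import random
import statistics

random.seed(0)

# Hypothetical illustration (not the authors' code): in a DSMC-style
# simulation of a low-speed gas flow, each simulated molecule has a
# thermal velocity of order hundreds of m/s, while the mean (stream)
# velocity may be only ~0.1 m/s.  Sampling the mean from thermal
# velocities buries the signal in noise; the IP method instead lets each
# simulated molecule carry a preserved "information velocity" equal to
# the average velocity of the many real molecules it represents.

STREAM_VELOCITY = 0.1   # m/s, typical low-speed MEMS flow (assumed)
THERMAL_SPREAD = 340.0  # m/s, order of the thermal speed (assumed)
N_PARTICLES = 10_000

# Conventional sampling: stream velocity plus large thermal fluctuation.
thermal_samples = [STREAM_VELOCITY + random.gauss(0.0, THERMAL_SPREAD)
                   for _ in range(N_PARTICLES)]

# IP-style sampling: each particle's preserved information velocity
# fluctuates only weakly around the macroscopic value.
ip_samples = [STREAM_VELOCITY + random.gauss(0.0, 0.01)
              for _ in range(N_PARTICLES)]

noise_conventional = statistics.stdev(thermal_samples)
noise_ip = statistics.stdev(ip_samples)

print(f"conventional sample noise ~ {noise_conventional:.1f} m/s")
print(f"IP sample noise           ~ {noise_ip:.3f} m/s")
```

With these assumed scales, the 0.1 m/s stream velocity is three to four orders of magnitude below the conventional sample noise but well above the IP sample noise, which is the point of the method.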

Relevance:

20.00%

Publisher:

Abstract:

Onset and evolution of Rayleigh-Benard (R-B) convection are investigated using the Information Preservation (IP) method. The information velocity and temperature are updated using the Octant Flux Splitting (OFS) model developed by Masters & Ye, based on the Maxwell transport equation suggested by Sun & Boyd. Statistical noise inherent in particle approaches such as the direct simulation Monte Carlo (DSMC) method is effectively reduced by the IP method, and therefore the evolution from an initial quiescent fluid to a final steady state is shown clearly. An interesting phenomenon is observed: when the Rayleigh number (Ra) exceeds its critical value, there exists an obvious incubation stage. During the incubation stage, the vortex structure clearly appears and evolves, whereas the Nusselt number (Nu) of the lower plate remains close to unity. After the incubation stage, the vortex velocity and Nu rapidly increase, and the flow field quickly reaches a steady, convective state. The relation of Nu to Ra given by IP agrees with those given by DSMC, classical theory, and experimental data.
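For reference, the onset criterion the abstract invokes can be sketched as a Rayleigh-number check; the fluid properties below are rounded, illustrative assumptions, not parameters from the paper:

```python
# Onset of Rayleigh-Benard convection occurs when the Rayleigh number
#     Ra = g * beta * dT * H**3 / (nu * kappa)
# exceeds its critical value, approximately 1708 for two rigid plates.

G = 9.81            # m/s^2, gravitational acceleration
RA_CRITICAL = 1708  # critical Rayleigh number, rigid-rigid boundaries

def rayleigh_number(beta, delta_t, height, nu, kappa):
    """Ra for a fluid layer of thickness `height` heated from below.

    beta:    thermal expansion coefficient [1/K]
    delta_t: temperature difference between the plates [K]
    nu:      kinematic viscosity [m^2/s]
    kappa:   thermal diffusivity [m^2/s]
    """
    return G * beta * delta_t * height**3 / (nu * kappa)

# Air-like properties at room temperature (rounded, illustrative).
ra = rayleigh_number(beta=3.4e-3, delta_t=10.0, height=0.05,
                     nu=1.5e-5, kappa=2.1e-5)
print(f"Ra = {ra:.0f}, convective: {ra > RA_CRITICAL}")
```

For this assumed 5 cm air layer with a 10 K temperature difference, Ra is well above the critical value, so convection rolls would develop.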

Relevance:

20.00%

Publisher:

Abstract:

In contrast to cost modeling activities, the pricing of services must be simple and transparent. Calculating and thus knowing price structures would not only help identify the level of detail required for cost modeling of individual institutions, but also help develop a "public" market for services, as well as clarify the division of tasks and the modeling of funding and revenue streams for data preservation at public institutions. This workshop built on the results of the workshop "The Costs and Benefits of Keeping Knowledge", which took place 11 June 2012 in Copenhagen. This expert workshop aimed at:

• Identifying ways for data repositories to abstract from their complicated cost structures and arrive at one transparent pricing structure which can be aligned with available and plausible funding schemes. Those repositories will probably need a stable institutional funding stream for data management and preservation. Are there any estimates for this, absolute or as a percentage of overall cost? Part of the revenue will probably have to come through data management fees upon ingest. How could that be priced? Per dataset, per GB, or as a percentage of research cost? Will it be necessary to charge access prices, even though they contradict the open science paradigm?
• What are the price components for pricing individual services, and which prices are currently being paid, e.g. to commercial providers? What are the descriptions and conditions of the service(s) delivered and guaranteed?
• What types of risks are inherent in these pricing schemes?
• How can services and prices be defined in an all-inclusive and simple manner, so as to enable researchers to apply for a specific amount when asking for funding of data-intensive projects?
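As a purely hypothetical illustration of the kind of "all-inclusive and simple" pricing the workshop calls for, a one-off ingest price could combine a per-dataset and a per-GB component; the fee levels below are invented, not figures from the workshop:

```python
# Hypothetical pricing sketch: both fee levels are assumptions chosen
# only to show how a transparent, two-component price could work.
PER_DATASET_FEE_EUR = 50.0  # assumed flat ingest fee per dataset
PER_GB_FEE_EUR = 2.0        # assumed one-off fee per GB preserved

def ingest_price(n_datasets, total_gb):
    """All-inclusive one-off price a researcher could put in a grant."""
    return n_datasets * PER_DATASET_FEE_EUR + total_gb * PER_GB_FEE_EUR

# A project depositing 12 datasets totalling 300 GB:
print(f"EUR {ingest_price(12, 300):.2f}")  # EUR 1200.00
```

A flat schedule like this is easy for researchers to budget in advance, which is exactly the property the workshop questions probe; whether it covers a repository's true long-term costs is the open cost-modeling problem.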

Relevance:

20.00%

Publisher:

Abstract:

Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'.

[Photo: participants in breakout discussion during the workshop on cost models]

The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasizes the need for robust cost modelling. Such models should enable these institutions to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. In order to justify the costs, institutions also need to describe the expected benefits of preserving digital information. This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges with regard to the preservation of research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place in the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.

Relevance:

20.00%

Publisher:

Abstract:

23 p. -- An extended abstract of this work appears in the proceedings of the 2012 ACM/IEEE Symposium on Logic in Computer Science

Relevance:

20.00%

Publisher:

Abstract:

The traditional approach to fish handling, preservation and processing technology in inland fisheries is critically examined, using the experience at Kainji Lake as a model. The need to improve the fishermen's technology is emphasized, with the ultimate expectation of improved fish quality.

Relevance:

20.00%

Publisher:

Abstract:

[EN] This paper is an outcome of the following dissertation:

Relevance:

20.00%

Publisher:

Abstract:

This paper presents, for three aquatic research projects, the type of data that were collected, and the reasons why these data eventually became lost or inaccessible. A strategy for countering such data loss is proposed.

Relevance:

20.00%

Publisher:

Abstract:

Body length measurement is an important part of growth, condition, and mortality analyses of larval and juvenile fish. If the measurements are not accurate (i.e., do not reflect real fish length), results of subsequent analyses may be affected considerably (McGurk, 1985; Fey, 1999; Porter et al., 2001). The primary cause of error in fish length measurement is shrinkage related to collection and preservation (Theilacker, 1980; Hay, 1981; Butler, 1992; Fey, 1999). The magnitude of shrinkage depends on many factors, namely the duration and speed of the collection tow, abundance of other planktonic organisms in the sample (Theilacker, 1980; Hay, 1981; Jennings, 1991), the type and strength of the preservative (Hay, 1982), and the species of fish (Jennings, 1991; Fey, 1999). Further, fish size affects shrinkage (Fowler and Smith, 1983; Fey, 1999, 2001), indicating that live length should be modeled as a function of preserved length (Pepin et al., 1998; Fey, 1999).