971 results for Alkanol preservation index
Resumo:
This collection, generally known as the Grineu, was compiled by Johann Huttich and published by Simon Grineu, who also wrote and signed the preface. The first edition, "very rare and of inestimable value" according to Borba de Moraes, was published in Basel by Jo. Hervagium in 1532. Of that edition, the Biblioteca Nacional do Rio de Janeiro holds a copy that once belonged to the Biblioteca da Ajuda. The work was reprinted in Paris in 1532 and again in Basel in 1537; that edition was enlarged with the letter of Maximilianus Transilvanus, secretary to Charles V, to the Cardinal of Salzburg. Among the various editions, the most complete is that of 1555. It recounts the great navigations and the expeditions of Christopher Columbus, Pedro Alonso, Pinzón, Amerigo Vespucci, among others.
Resumo:
This paper first reviews methods for treating low speed rarefied gas flows: the linearised Boltzmann equation, the lattice Boltzmann method (LBM), the Navier-Stokes equation with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty that statistical simulation faces at the small information-to-noise ratio of low speed flows by preserving the average information of the enormous number of real molecules that each simulated molecule represents. A validation of the method is given in this paper. The specific features of internal flows in MEMS, i.e. the low speed and the large length-to-width ratio, give the problem an elliptic nature: the inlet and outlet boundary conditions influence each other and must be regulated together. Through the example of the IP calculation of a microchannel flow (thousands of μm long) it is shown that adopting a conservative scheme for the mass conservation equation together with the super-relaxation method resolves this problem successfully. With the same measures, the IP method solves the thin film air bearing problem in the transitional regime for an authentic hard disc write/read head length (L = 1000 μm) and yields a pressure distribution in full agreement with the generalized Reynolds equation, whereas previously the DSMC check of the validity of the Reynolds equation had been done only for a short (L = 5 μm) drive head. The author suggests degenerating the Reynolds equation to solve the microchannel flow problem in the transitional regime, which provides a means, with the merit of strict kinetic theory, for testing the various methods intended to treat internal MEMS flows.
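To illustrate the information-to-noise difficulty that the IP method addresses, the following minimal Python sketch (purely hypothetical, not the authors' code) contrasts estimating a 1 m/s stream velocity from molecular velocities with thermal scatter, as a plain particle sample would, with sampling a low-scatter preserved "information velocity"; the IP spread used here is illustrative only.

import numpy as np

rng = np.random.default_rng(0)
n_sim = 10_000          # simulated molecules in one sampling cell (illustrative)
u_stream = 1.0          # low-speed stream velocity, m/s
c_thermal = 340.0       # order of the thermal speed of air molecules, m/s

# Plain particle sample: molecular velocity = stream + thermal scatter,
# so the 1 m/s signal is buried in ~340 m/s noise.
v_dsmc = u_stream + c_thermal * rng.standard_normal(n_sim)

# IP-style sample: each simulated molecule carries the preserved (average)
# information velocity of the real molecules it represents, so the scatter
# around the macroscopic value is small (the spread here is purely illustrative).
v_ip = u_stream + 0.01 * rng.standard_normal(n_sim)

print(f"plain sample estimate of stream velocity: {v_dsmc.mean():+.3f} m/s")
print(f"IP-style  estimate of stream velocity: {v_ip.mean():+.3f} m/s")

With thermal scatter of roughly 340 m/s, a plain sample needs an enormous number of molecules to resolve a 1 m/s signal, which is exactly the ratio the IP method sidesteps by sampling the preserved information.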
Resumo:
Onset and evolution of the Rayleigh-Benard (R-B) convection are investigated using the Information Preservation (IP) method. The information velocity and temperature are updated using the Octant Flux Splitting (OFS) model developed by Masters & Ye based on the Maxwell transport equation suggested by Sun & Boyd. Statistical noise inherent in particle approaches such as the direct simulation Monte Carlo (DSMC) method is effectively reduced by the IP method, and therefore the evolutions from an initial quiescent fluid to a final steady state are shown clearly. An interesting phenomenon is observed: when the Rayleigh number (Ra) exceeds its critical value, there exists an obvious incubation stage. During the incubation stage, the vortex structure clearly appears and evolves, whereas the Nusselt number (Nu) of the lower plate is close to unity. After the incubation stage, the vortex velocity and Nu rapidly increase, and the flow field quickly reaches a steady, convective state. A relation of Nu to Ra given by IP agrees with those given by DSMC, the classical theory and experimental data.
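For reference, the Rayleigh and Nusselt numbers used above have their standard definitions (supplied here for the reader; they are not spelled out in the abstract):

\[
Ra = \frac{g\,\beta\,\Delta T\,L^{3}}{\nu\,\kappa},
\qquad
Nu = \frac{q\,L}{k\,\Delta T},
\]

where \(g\) is the gravitational acceleration, \(\beta\) the thermal expansion coefficient, \(\Delta T\) the temperature difference between the plates, \(L\) the plate spacing, \(\nu\) the kinematic viscosity, \(\kappa\) the thermal diffusivity, \(q\) the heat flux through the lower plate, and \(k\) the thermal conductivity. With this definition, \(Nu \approx 1\) corresponds to pure conduction, which is why a Nusselt number close to unity during the incubation stage indicates that convective heat transport has not yet set in.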
Resumo:
Presentation (p. 9-27). Index Verborum (p. 65-89). Letter "P" (extracted from the CD-ROM, p. 1-184).
Resumo:
In contrast to cost modeling activities, the pricing of services must be simple and transparent. Calculating and thus knowing price structures would not only help identify the level of detail required for cost modeling of individual institutions, but also help develop a "public" market for services, as well as clarify the division of tasks and the modeling of funding and revenue streams for data preservation by public institutions. This workshop built on the results of the workshop "The Costs and Benefits of Keeping Knowledge", which took place on 11 June 2012 in Copenhagen. This expert workshop aimed at:
• Identifying ways for data repositories to abstract from their complicated cost structures and arrive at one transparent pricing structure which can be aligned with available and plausible funding schemes. Those repositories will probably need a stable institutional funding stream for data management and preservation. Are there any estimates for this, absolute or as a percentage of overall cost? Part of the revenue will probably have to come through data management fees upon ingest. How could that be priced? Per dataset, per GB, or as a percentage of research cost? Will it be necessary to charge access prices, given that they contradict the open science paradigm?
• What are the price components for pricing individual services, and which prices are currently being paid, e.g. to commercial providers? What are the descriptions and conditions of the service(s) delivered and guaranteed?
• What types of risks are inherent in these pricing schemes?
• How can services and prices be defined in an all-inclusive and simple manner, so as to enable researchers to apply for a specific amount when asking for funding of data-intensive projects?
Resumo:
Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. [Photo: participants in a break-out discussion during the workshop on cost models.] The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasises the need for robust cost modelling. Such models should enable these institutions to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. In order to justify the costs, institutions also need to describe the expected benefits of preserving digital information. This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges of preserving research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place in the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.
Resumo:
This book elucidates the methods of molecular gas dynamics, or rarefied gas dynamics, which treat gas flow problems in which the discrete molecular effects of the gas prevail under conditions of low density. Emphasis is placed on the basis of the methods, on the direct simulation Monte Carlo method applied to the simulation of non-equilibrium effects, and on frontier subjects related to low speed microscale rarefied gas flows. It provides a solid basis for the study of molecular gas dynamics for senior students and graduates in the aerospace and mechanical engineering departments of universities and colleges. It gives a general acquaintance with modern developments of rarefied gas dynamics in the various regimes and leads to the frontier topics of non-equilibrium rarefied gas dynamics and low speed microscale gas dynamics. It will also be of benefit to scientific and technical researchers engaged in aerospace high altitude aerodynamic force and heating design and in research on gas flow in MEMS.
[1] Molecular structure and energy states (21)
[2] Some basic concepts of kinetic theory (51)
[3] Interaction of molecules with solid surface (131)
[4] Free molecular flow (159)
[5] Continuum models (191)
[6] Transitional regime (231)
[7] Direct simulation Monte Carlo (DSMC) method (275)
[8] Microscale slow gas flows, information preservation method (317)
[App. I] Gas properties (367)
[App. II] Some integrals (369)
[App. III] Sampling from a prescribed distribution (375) — see the sketch below
[App. IV] Program of the Couette flow (383)
Subject Index (399)
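As a flavour of the topic of Appendix III, sampling from a prescribed distribution, here is a minimal acceptance-rejection sketch in Python; it is a generic illustration under our own assumptions, not code from the book.

import numpy as np

def sample_acceptance_rejection(pdf, x_min, x_max, pdf_max, n, rng):
    # Draw n samples from an (unnormalised) pdf on [x_min, x_max]
    # by acceptance-rejection against a uniform envelope of height pdf_max.
    samples = []
    while len(samples) < n:
        x = rng.uniform(x_min, x_max)            # candidate value
        if rng.uniform(0.0, pdf_max) <= pdf(x):  # accept with probability pdf(x)/pdf_max
            samples.append(x)
    return np.array(samples)

# Example: reduced molecular speeds c = v / v_mp drawn from f(c) ~ c^2 exp(-c^2).
rng = np.random.default_rng(0)
f = lambda c: c**2 * np.exp(-c**2)
speeds = sample_acceptance_rejection(f, 0.0, 4.0, f(1.0), 5000, rng)
print("mean reduced speed:", speeds.mean())      # expected ~ 2/sqrt(pi) = 1.128

Acceptance-rejection of this kind is a standard way for particle codes such as DSMC to draw, for example, molecular speeds from a Maxwellian distribution.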
Resumo:
ENGLISH: In the eastern Pacific Ocean nearly all of the commercial catches of yellowfin tuna (Thunnus albacares) and skipjack (Katsuwonus pelamis) are taken by two types of vessels, baitboats, which use pole and line in conjunction with live-bait, and purse-seiners. From its inception until very recently (1959), this fishery was dominated by baitboats. This method of fishing has been described by Godsil (1938) and Shimada and Schaefer (1956). From 1951 through 1958 baitboats caught between 66.4 and 90.8 per cent of the yellowfin and between 87.2 and 95.3 per cent of the skipjack landed by the California-based fleet. These vessels fished for tuna throughout the year and covered virtually all of the area from southern California to northern Chile. The purse-seine fishery for tunas developed out of the round-haul net fisheries for California sardines and other species. Scofield (1951) gives a detailed description of the development of gear and fishing methods. Prior to 1959 many of the seiners engaged in other fisheries during the fall and early winter months and consequently most of the fishing effort for tuna occurred in the period February-August. The vessels were quite small, averaging approximately 120 tons carrying capacity (Broadhead and Marshall, 1960), in comparison to the baitboats, of which the most numerous size-class was 201-300 tons. The seiners were naturally more restricted in range than the baitboats and most of their effort was restricted to the northern grounds. During the period 1959-61 most of the large baitboats were converted for purse-seining and the existing seiner fleet was modernized. These developments increased the range of the seiner fleet and resulted in a wider and more nearly even spatial and temporal distribution of effort. By the early part of 1961, the purse-seine fleet approximated the level of the preconversion baitboat fleet in amount of effort applied and area covered. The changes in the purse-seine fishery and the fishing methods employed in the modernized fleet are described by Orange and Broadhead (1959), Broadhead and Marshall (1960), McNeely (1961) and Broadhead (1962). The change in the relative importance of the two gears is illustrated by the decline in the proportion of the total logged tonnage landed by California-based baitboats, in comparison to the proportion landed by seiners. In 1959 baitboats landed 49.5 per cent of the yellowfin and 87.8 per cent of the skipjack. In 1960 these percentages were 22.9 and 74.7 respectively and in 1961 the decline continued to 12.6 per cent of the yellowfin and 30.0 per cent of the skipjack (Schaefer, 1962). In previous Bulletins of this Commission (Griffiths, 1960; Calkins, 1961) the baitboat catch and effort statistics were used to compute two indices of population density and an index of concentration of fishing effort and the fluctuations of these indices were analyzed in some detail. Due to the change in the relative importance of the two gears it is appropriate to extend this investigation to include the purse-seine data. The objectives of this paper are to compute two indices of population density and an index of concentration of fishing effort and to examine the fluctuations in these indices before and after the changes in the fishery. A further objective is to compare the purse-seine indices with those of the baitboats for the same time periods. 
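As background to the objectives stated above (not quoted from the Bulletin), the indices of apparent population density used in this literature are of the catch-per-unit-of-effort type,

\[
\hat{D}_{i} = \frac{C_{i}}{E_{i}},
\]

where \(C_{i}\) is the logged catch and \(E_{i}\) the standardised fishing effort in area-time stratum \(i\); the index of concentration of fishing effort then describes, broadly, how closely the distribution of effort over the strata follows the distribution of apparent density.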
Resumo:
Under the imprint: "Com as licenças necessarias, e privilegio real" [With the necessary licences, and royal privilege].
Resumo:
This study is concerned with the measurement of total factor productivity in the marine fishing industries in general and in the Pacific coast trawl fishery in particular. The study is divided into two parts. Part I contains the empirical material and the introductory theory needed to examine productivity in the Pacific coast trawl fleet; it is self-contained and presents the basic formulae, empirical results, and discussion. Because the economic theory of index numbers and productivity is constantly evolving and is widely scattered throughout the economics literature, Part II draws the theoretical literature together in one place to allow ready access for readers interested in more detail. The major methodological focus of the study is the type of economic index number that is most appropriate for use by economists with the National Marine Fisheries Service. The study recommends that the following types of economic index numbers be used: chain rather than fixed base; bilateral rather than multilateral; and one of the class of superlative indices, such as the Törnqvist or Fisher Ideal. (PDF file contains 40 pages.)
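For reference, the superlative indices recommended above have standard textbook forms; the following definitions are supplied here for the reader and are not quoted from the report. With prices \(p_{i,t}\) and quantities \(q_{i,t}\) of input or output \(i\) in periods 0 and 1, the Fisher Ideal and Törnqvist quantity indices are

\[
Q_{F} = \sqrt{\frac{\sum_{i} p_{i,0}\,q_{i,1}}{\sum_{i} p_{i,0}\,q_{i,0}}\cdot\frac{\sum_{i} p_{i,1}\,q_{i,1}}{\sum_{i} p_{i,1}\,q_{i,0}}},
\qquad
\ln Q_{T} = \sum_{i} \tfrac{1}{2}\left(s_{i,0}+s_{i,1}\right)\ln\frac{q_{i,1}}{q_{i,0}},
\quad
s_{i,t} = \frac{p_{i,t}\,q_{i,t}}{\sum_{j} p_{j,t}\,q_{j,t}}.
\]

A chain index applies such bilateral comparisons between successive periods and multiplies them together, rather than comparing every period to a fixed base.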
Resumo:
The following series of fishery publications produced in calendar years 1980-85 by the Scientific Publications Office of the National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration (NOAA), are listed numerically and indexed by author and subject: Circular, Fishery Bulletin, Marine Fisheries Review, Special Scientific Report-Fisheries, and Technical Report NMFS. Also included is an alphanumeric listing of the NOAA Technical Memorandum NMFS series published in calendar years 1972-85 by NMFS regional offices and fisheries centers. Authors and subjects for the Memorandum series are indexed with the other publication series. (PDF file contains 156 pages.)