894 results for "consistent and asymptotically normal estimators"
Abstract:
This paper presents an analysis of crack problems in homogeneous piezoelectrics or on the interfaces between two dissimilar piezoelectric materials, based on the continuity of normal electric displacement and electric potential across the crack faces. Explicit analytic solutions are obtained for a single crack in an infinite piezoelectric or on the interface of piezoelectric bimaterials. For homogeneous materials it is found that the normal electric displacement D_2 induced by the crack is constant along the crack faces and depends only on the remotely applied stress fields. Within the crack slit, the perturbed electric fields induced by the crack are also constant and are not affected by the applied electric displacement fields. For bimaterials, generally speaking, an interface crack exhibits oscillatory behavior and the normal electric displacement D_2 is a complex function along the crack faces. However, for bimaterials possessing certain symmetry, in which an interface crack displays no oscillatory behavior, the normal electric displacement D_2 is likewise constant along the crack faces, and the electric field E_2 is singular ahead of the crack tip and jumps across the interface. Energy release rates are established for homogeneous materials and for bimaterials having such symmetry. Crack fronts both parallel and perpendicular to the poling axis are discussed. It is shown that the energy release rates are always positive for stable materials and that the applied electric displacements make no contribution to them.
Abstract:
In this paper, we apply our compressible lattice Boltzmann model to a rotating parabolic coordinate system to simulate Rossby vortices emerging in a layer of shallow water flowing zonally in a rotating paraboloidal vessel. By introducing a scaling factor, the nonuniform curvilinear mesh can be mapped to a flat uniform mesh on which the standard lattice Boltzmann method applies. Since the mass per unit area on the two-dimensional (2D) surface varies with the thickness of the water layer, the 2D flow behaves as if it were "compressible," so our compressible model is applied. The simulation results agree qualitatively with the experimental observations. Building on this work, quantitative solutions and simulations of many natural phenomena in planetary atmospheres, oceans, and magnetized plasmas, such as the famous Jovian Great Red Spot, galactic spiral vortices, the Gulf Stream, and the Kuroshio Current, can be expected.
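The abstract does not give implementation details of the compressible, curvilinear-mapped model; as background only, the sketch below shows the standard collision-and-streaming structure of a D2Q9 BGK lattice Boltzmann solver, with placeholder grid size, relaxation time and initial fields. The paper's scaling factor and rotating-frame effects are not reproduced here.

```python
import numpy as np

# Minimal D2Q9 BGK lattice Boltzmann step (illustrative only; the paper's
# compressible model on a rotating parabolic mesh adds a scaling factor and
# forcing terms that are not included in this sketch).
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])   # lattice velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)             # lattice weights

nx, ny, tau = 64, 64, 0.8          # placeholder grid size and relaxation time
rho = np.ones((nx, ny))            # placeholder initial density (water depth)
u = np.zeros((nx, ny, 2))          # initial velocity field

def equilibrium(rho, u):
    cu = np.einsum('qd,xyd->xyq', c, u)               # c_q . u
    usq = np.einsum('xyd,xyd->xy', u, u)[..., None]   # |u|^2
    return w * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

f = equilibrium(rho, u)
for step in range(100):
    # collision: relax towards the local equilibrium distribution
    f += -(f - equilibrium(rho, u)) / tau
    # streaming: shift each population along its lattice velocity (periodic)
    for q in range(9):
        f[:, :, q] = np.roll(f[:, :, q], shift=c[q], axis=(0, 1))
    # macroscopic moments
    rho = f.sum(axis=-1)
    u = np.einsum('xyq,qd->xyd', f, c) / rho[..., None]
```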
Abstract:
Using a dislocation simulation approach, the basic equation for a crack perpendicular to a bimaterial interface is formulated in this paper. A novel expansion method is proposed for solving the problem. The complete solution, including the T stress ahead of the crack tip and the stress intensity factors, is presented. The characteristics of the stress field are analyzed in detail. It is found that, ahead of the crack tip and near the interface, the normal stress perpendicular to the crack plane, sigma_x, is characterized by the K field, while the normal stress sigma_y is dominated by the K field plus the T stress in the region 0 < r/b < 0.4 for b/a_0 ≤ 0.1, where b is the distance from the crack tip to the interface.
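For readers unfamiliar with the terminology, the "K field plus T stress" characterization refers to the standard two-term asymptotic expansion of the crack-tip stress field, written here in generic form (the paper's own coordinate conventions and angular functions may differ):

```latex
\sigma_{ij}(r,\theta) \;=\; \frac{K}{\sqrt{2\pi r}}\, f_{ij}(\theta)
\;+\; T\,\delta_{1i}\,\delta_{1j} \;+\; O(\sqrt{r})
```

Here K is the stress intensity factor, T is the constant stress acting parallel to the crack plane, f_ij are the universal angular functions, and (r, θ) are polar coordinates centered at the crack tip. The abstract's result is that, for b/a_0 ≤ 0.1 and 0 < r/b < 0.4, the singular K term alone captures the normal stress perpendicular to the crack plane, while the other normal stress component requires the constant T term as well.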
Abstract:
A temperature-controlled pool boiling (TCPB) device has been developed to study bubble behavior and heat transfer in pool boiling, both in normal gravity and in microgravity. A thin platinum wire, 60 μm in diameter and 30 mm in length, is used simultaneously as heater and thermometer. The fluid is R113 at 0.1 MPa, nominally subcooled by 26 °C in all cases. Three modes of heat transfer, namely single-phase natural convection, nucleate boiling, and two-mode transition boiling, are observed both in microgravity aboard the 22nd Chinese recoverable satellite and in normal gravity on the ground before and after the space flight. Dynamic behaviors of the vapor bubbles observed in these experiments are reported and analyzed in the present paper. In the regime of fully developed nucleate boiling, interface oscillation due to the coalescence of adjacent tiny bubbles is the primary cause of bubble departure in microgravity. By contrast, in the discrete bubble regime, three critical bubble diameters are observed in microgravity, dividing the whole range of observed bubbles into four regimes. First, tiny bubbles continually form and grow on the heating surface before departing slowly from the wire when their sizes exceed a value of the order of 10^-1 mm. Bigger bubbles, several millimeters in diameter, stay on the wire, oscillate along it, and coalesce with adjacent bubbles. The biggest bubble, with a diameter of the order of 10 mm, which formed immediately after the onset of boiling, stays continuously
Abstract:
Peel test measurements and simulations of the interfacial mechanical parameters for the Al/Epoxy/Al2O3 system are performed in the present investigation. A series of Al film thicknesses between 20 and 250 microns and three peel angles of 90, 135 and 180 degrees are considered. Two types of epoxy adhesive are adopted to obtain both strong and weak interface adhesion. A finite element model with cohesive zone elements is used to identify the interfacial parameters and simulate the peel test process. By simulating and recording the normal stress near the crack tip, the separation strength is obtained. The cohesive energy is then identified by comparing the simulated steady-state peel force with the experimental result. It is found that both the cohesive energy and the separation strength can be taken as intrinsic interfacial parameters, which depend on the thickness of the adhesive layer but are independent of the film thickness and peel angle.
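The abstract identifies the cohesive energy and the separation strength as the two parameters extracted from the peel simulations. A common way to encode such a pair in a cohesive zone element is a traction-separation law; the bilinear form below is only an illustrative sketch (the paper does not state which law it uses), with placeholder values for the separation strength and cohesive energy.

```python
def bilinear_traction(delta, sigma_max, gamma, delta_0_ratio=0.1):
    """Illustrative bilinear traction-separation law.

    delta      : current opening separation
    sigma_max  : separation (peak) strength
    gamma      : cohesive energy = area under the traction-separation curve
    """
    delta_f = 2.0 * gamma / sigma_max          # final separation at full failure
    delta_0 = delta_0_ratio * delta_f          # separation at peak traction
    if delta <= 0.0:
        return 0.0
    if delta < delta_0:                        # rising (elastic) branch
        return sigma_max * delta / delta_0
    if delta < delta_f:                        # linear softening branch
        return sigma_max * (delta_f - delta) / (delta_f - delta_0)
    return 0.0                                 # fully separated, no traction

# Example call with placeholder values: strength 20 MPa, cohesive energy 200 J/m^2
t = bilinear_traction(delta=5e-6, sigma_max=20e6, gamma=200.0)
```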
Abstract:
This document, Guidance for Benthic Habitat Mapping: An Aerial Photographic Approach, describes proven technology that can be applied in an operational manner by state-level scientists and resource managers. This information is based on the experience gained by NOAA Coastal Services Center staff and state-level cooperators in the production of a series of benthic habitat data sets in Delaware, Florida, Maine, Massachusetts, New York, Rhode Island, the Virgin Islands, and Washington, as well as during Center-sponsored workshops on coral remote sensing and seagrass and aquatic habitat assessment. (PDF contains 39 pages) The original benthic habitat document, NOAA Coastal Change Analysis Program (C-CAP): Guidance for Regional Implementation (Dobson et al.), was published by the Department of Commerce in 1995. That document summarized procedures that were to be used by scientists throughout the United States to develop consistent and reliable coastal land cover and benthic habitat information. Advances in technology and new methodologies for generating these data created the need for this updated report, which builds upon the foundation of its predecessor.
Abstract:
EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g., remote sensor systems, global positioning systems, image processing algorithms) as they become available. Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional (1:500,000) scales for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat, including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key state and federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)
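The change-detection products described above are ultimately "from-to" tabulations between classified rasters from two dates. As a generic illustration only (class codes, array contents and function names are placeholders, not C-CAP conventions), such a cross-tabulation can be computed as follows:

```python
import numpy as np

def change_matrix(date1, date2, n_classes):
    """Count pixels that moved from class i (rows) to class j (columns)."""
    m = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(m, (date1.ravel(), date2.ravel()), 1)
    return m

# toy 4-class example (0 = water, 1 = wetland, 2 = upland, 3 = submersed habitat)
t1 = np.random.randint(0, 4, size=(100, 100))
t2 = np.random.randint(0, 4, size=(100, 100))
print(change_matrix(t1, t2, n_classes=4))
```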
Abstract:
If a product is being designed to be genuinely inclusive, then the designers need to be able to assess the level of exclusion of the product that they are working on and to identify possible areas of improvement. To be of practical use, the assessments need to be quick, consistent and repeatable. The aim of this workshop is to invite attendees to participate in the evaluation of a number of everyday objects using an assessment technique being considered by the workshop organisers. The objectives of the workshop include evaluating the effectiveness of the assessment method, evaluating the accessibility of the products being assessed, and suggesting revisions to the assessment scales being used. The assessment technique is to be based on the ONS capability measures [1]. This source recognises fourteen capability scales, of which seven are particularly pertinent to product evaluation, namely: motion, dexterity, reach and stretch, vision, hearing, communication, and intellectual functioning. Each of these scales ranges from 0 (fully able) through 1 (minimal impairment) to 10 (severe impairment). The attendees will be asked to rate the products on these scales. Clearly the assessed accessibility of a product depends on the assumptions made about the context of use. The attendees will be asked to note clearly the assumptions they are making about the context in which the product is being assessed. For instance, with a hot water bottle, assumptions have to be made about the availability of hot water, and these can affect the overall accessibility rating. The workshop organisers will not specify the context of use, as the aim is to identify how assessors would use the assessment method in the real world. The objects being assessed will include items such as remote controls, pill bottles, food packaging, hot water bottles and mobile telephones. The attendees will be encouraged to assess two or more products in detail. Helpers will be on hand to assist and observe the assessments. The assessments will be collated and compared, and feedback about the assessment method sought from the attendees. Drawing on a preliminary review of the assessment results, initial conclusions will be presented at the end of the workshop. More detailed analyses will be made available in subsequent proceedings. It is intended that the workshop will provide attendees with an opportunity to perform hands-on assessment of a number of everyday products and to identify features which are inclusive and those that are not. It is also intended to encourage an appreciation of the capabilities to be considered when evaluating accessibility.
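One way to picture the assessment data the workshop collects is a record holding one 0-10 rating per relevant capability scale plus the stated context-of-use assumptions. The sketch below is purely illustrative; the field names and class are this sketch's own choices, not terms from the workshop materials.

```python
from dataclasses import dataclass, field

# The seven ONS-derived capability scales named in the abstract,
# each rated from 0 (fully able) to 10 (severe impairment).
SCALES = ("motion", "dexterity", "reach_and_stretch", "vision",
          "hearing", "communication", "intellectual_functioning")

@dataclass
class ProductAssessment:
    product: str
    assumptions: str                               # assumed context of use
    ratings: dict = field(default_factory=dict)    # scale name -> 0..10

    def rate(self, scale: str, level: int) -> None:
        if scale not in SCALES:
            raise ValueError(f"unknown scale: {scale}")
        if not 0 <= level <= 10:
            raise ValueError("level must be between 0 and 10")
        self.ratings[scale] = level

# Example: rating a hot water bottle under an explicit context assumption.
a = ProductAssessment("hot water bottle", "hot water is available at the sink")
a.rate("dexterity", 6)
a.rate("vision", 2)
```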
Abstract:
12 p.
Abstract:
For 10 years the Institute for Fishing Technology, Hamburg (IFH) has been carrying out experiments in the brown shrimp fishery with beam trawls aimed at reducing unwanted bycatches. When the tests were transferred to commercial fishery conditions, the personnel effort and costs increased markedly. It became necessary, for example, to install a deep-freeze chain so that more samples could be evaluated in the laboratory. This in turn required more technicians for measuring the fish and shrimp samples, but also made it necessary to perform this work in the most rational and time-saving way by applying modern electronic aids. All samples still have to be sorted by species, weighed and measured; with the introduction of electronic aids such as an electronic measuring board and a computer-aided image processing system, however, all weight and length data are recorded digitally immediately after processing. They are transferred via a network to a server PC which stores them in a purpose-designed database. This article describes the application of two electronic systems: the measuring board (FM 100, Fa. SCANTROL), initiated by a project in the Norwegian Institute for Fishing Technology, and a computer-aided image processing system focusing on measuring shrimps in their naturally flexed shape, also developed in the Institute for Fishing Technology in close collaboration with the University of Duisburg. These electronic recording systems allow consistent and reproducible recording of data, independent of the day-to-day form of the staff operating them. With the help of these systems the number of measurements in the laboratory could be increased to 250,000 per year. This made it possible to evaluate, in 1999, 525 catch samples from 75 commercial hauls taken during 15 days at sea. The time saved in measuring the samples is about one third of the time previously needed (i.e., one hour per sample). An additional advantage is the immediate availability of the digitally stored data, which enables rapid analysis of all completed subexperiments. Both systems are applied today in several institutes of the Federal Research Centre. The image processing system is now the standard measuring method in an international research project.
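The pipeline described above ends in a purpose-designed measurement database on a server PC. As a generic illustration only (the table layout, column names and values are assumptions, not the IFH schema), one measured individual from a catch sample might be stored like this:

```python
import sqlite3

# Illustrative sketch of a measurement record; each row is one measured
# individual from a catch sample.
conn = sqlite3.connect("catch_samples.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS measurements (
        haul_id     TEXT,
        sample_id   TEXT,
        species     TEXT,
        length_mm   REAL,
        weight_g    REAL,
        instrument  TEXT      -- e.g. 'FM 100 measuring board' or 'image system'
    )
""")
conn.execute(
    "INSERT INTO measurements VALUES (?, ?, ?, ?, ?, ?)",
    ("haul-042", "sample-3", "Crangon crangon", 54.0, 1.2, "image system"),
)
conn.commit()
conn.close()
```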
Abstract:
The aim of this paper is to propose a new solution for the roommate problem with strict preferences. We introduce the solution of maximum irreversibility and consider almost stable matchings (Abraham et al. [2]) and maximum stable matchings (Ta [30], [32]). We find that almost stable matchings are incompatible with the other two solutions. Hence, to solve the roommate problem we propose matchings that lie in the intersection of the maximum irreversible matchings and the maximum stable matchings, which we call Q-stable matchings. These matchings are core consistent, and we offer an efficient algorithm for computing one of them. The outcome of the algorithm belongs to an absorbing set.
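The paper builds on almost stable matchings in the sense of Abraham et al., i.e., matchings that minimize the number of blocking pairs. As background only (this is not the paper's algorithm, and the data layout is this sketch's own), counting blocking pairs of a given roommate matching looks like this:

```python
def blocking_pairs(prefs, match):
    """Count blocking pairs of a roommate matching.

    prefs : dict agent -> list of other agents in strict order of preference
    match : dict agent -> current partner, or None if unmatched
    A pair {i, j} blocks if each prefers the other to their current situation
    (an acceptable partner is always preferred to being unmatched).
    """
    def prefers(a, new, current):
        if new not in prefs[a]:
            return False                       # new partner not acceptable
        if current is None:
            return True                        # any acceptable partner beats being alone
        return prefs[a].index(new) < prefs[a].index(current)

    agents = sorted(prefs)
    count = 0
    for i in agents:
        for j in agents:
            if i < j and match.get(i) != j:
                if prefers(i, j, match.get(i)) and prefers(j, i, match.get(j)):
                    count += 1
    return count

# toy instance: 4 agents with strict preferences; {a,b} and {c,d} are matched,
# and {b,c} turns out to be the single blocking pair.
prefs = {"a": ["b", "c", "d"], "b": ["c", "a", "d"],
         "c": ["a", "b", "d"], "d": ["a", "b", "c"]}
match = {"a": "b", "b": "a", "c": "d", "d": "c"}
print(blocking_pairs(prefs, match))
```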
Abstract:
Thrust fault earthquakes are investigated in the laboratory by generating dynamic shear ruptures along pre-existing frictional faults in rectangular plates. A considerable body of evidence suggests that dip-slip earthquakes exhibit enhanced ground motions in the acute hanging wall wedge as an outcome of broken symmetry between hanging and foot wall plates with respect to the earth surface. To understand the physical behavior of thrust fault earthquakes, particularly ground motions near the earth surface, ruptures are nucleated in analog laboratory experiments and guided up-dip towards the simulated earth surface. The transient slip event and emitted radiation mimic a natural thrust earthquake. High-speed photography and laser velocimeters capture the rupture evolution, outputting a full-field view of photo-elastic fringe contours proportional to maximum shearing stresses as well as continuous ground motion velocity records at discrete points on the specimen. Earth surface-normal measurements validate selective enhancement of hanging wall ground motions for both sub-Rayleigh and super-shear rupture speeds. The earth surface breaks upon rupture tip arrival to the fault trace, generating prominent Rayleigh surface waves. A rupture wave is sensed in the hanging wall but is, however, absent from the foot wall plate: a direct consequence of proximity from fault to seismometer. Signatures in earth surface-normal records attenuate with distance from the fault trace. Super-shear earthquakes feature greater amplitudes of ground shaking profiles, as expected from the increased tectonic pressures required to induce super-shear transition. Paired stations measure fault parallel and fault normal ground motions at various depths, which yield slip and opening rates through direct subtraction of like components. Peak fault slip and opening rates associated with the rupture tip increase with proximity to the fault trace, a result of selective ground motion amplification in the hanging wall. Fault opening rates indicate that the hanging and foot walls detach near the earth surface, a phenomenon promoted by a decrease in magnitude of far-field tectonic loads. Subsequent shutting of the fault sends an opening pulse back down-dip. In case of a sub-Rayleigh earthquake, feedback from the reflected S wave re-ruptures the locked fault at super-shear speeds, providing another mechanism of super-shear transition.
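The slip- and opening-rate processing described above is a direct subtraction of like velocity components recorded by paired stations straddling the fault. A minimal sketch of that step (the array names and synthetic records are placeholders, not experimental data):

```python
import numpy as np

# Paired-station processing: fault-parallel and fault-normal particle velocity
# records from a hanging-wall / foot-wall station pair on either side of the
# fault. Subtracting like components gives slip rate and opening rate.
t = np.linspace(0.0, 50e-6, 2001)            # time axis in seconds (placeholder)
v_par_hw = np.sin(2e5 * t)                   # placeholder velocity records
v_par_fw = 0.4 * np.sin(2e5 * t)
v_nrm_hw = 0.1 * np.cos(2e5 * t)
v_nrm_fw = 0.05 * np.cos(2e5 * t)

slip_rate = v_par_hw - v_par_fw              # relative sliding velocity across the fault
opening_rate = v_nrm_hw - v_nrm_fw           # relative opening velocity across the fault
print(slip_rate.max(), opening_rate.max())
```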
Abstract:
Motivated by recent MSL results, in which the ablation rate of the PICA heatshield was over-predicted, and staying true to the objectives outlined in the NASA Space Technology Roadmaps and Priorities report, this work focuses on advancing EDL technologies for future space missions.
Because of the difficulties of performing flight tests in the hypervelocity regime, a new ground testing facility, the vertical expansion tunnel (VET), is proposed. The adverse effects of secondary diaphragm rupture in an expansion tunnel may be reduced or eliminated by orienting the tunnel vertically, matching the test gas pressure and the accelerator gas pressure, and initially separating the test gas from the accelerator gas by density stratification. If some sacrifice of the reservoir conditions can be accepted, the VET can be used for hypervelocity ground testing without the problems associated with secondary diaphragm rupture.
The performance of different constraints for the Rate-Controlled Constrained-Equilibrium (RCCE) method is investigated in the context of modeling reacting flows characteristic of ground testing facilities and re-entry conditions. The effectiveness of individual constraints is isolated, and new constraints not previously reported in the literature are introduced. Three main benefits of the RCCE method were identified: 1) a reduction in the number of equations that must be solved to model a reacting flow; 2) a reduction in the stiffness of the system of equations to be solved; and 3) the ability to tabulate chemical properties as a function of a constraint once, prior to running a simulation, and to reuse the same table for multiple simulations.
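Benefit (3), tabulating chemical properties as a function of a constraint value once and reusing the table, can be illustrated with a one-dimensional lookup table. The constraint, property and grid below are placeholders, not quantities from the RCCE formulation itself.

```python
import numpy as np

# Build the table once, offline: a property evaluated on a grid of constraint values.
constraint_grid = np.linspace(0.0, 1.0, 201)       # placeholder constraint values
property_table = np.exp(-3.0 * constraint_grid)    # placeholder property values

def lookup(constraint_value):
    """Interpolate the pre-tabulated property during a flow simulation."""
    return np.interp(constraint_value, constraint_grid, property_table)

# The same table can then be reused across multiple simulations.
print(lookup(0.37))
```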
Finally, published physical properties of PICA are compiled, and the composition of the pyrolysis gases that form at high temperatures inside a heatshield is investigated. A necessary link between the composition of the solid resin and the composition of the pyrolysis gases it creates is provided. This link, combined with a detailed investigation of a reacting pyrolysis gas mixture, allows a much-needed, consistent and thorough description of many of the physical phenomena occurring in a PICA heatshield, and their implications, to be presented.
Through the use of computational fluid mechanics and computational chemistry methods, significant contributions have been made to advancing ground testing facilities, computational methods for reacting flows, and ablation modeling.
Abstract:
The aim of this clinical study is to assess the intra- and inter-examiner reproducibility of a dental caries diagnosis criterion (Nyvad et al. 1999) applied to the primary dentition, and to evaluate the mean time required to perform the clinical examination using this criterion. The criterion combines visual and tactile methods and distinguishes between active and inactive lesions, for both cavitated and non-cavitated lesions. The total sample consisted of 80 children, three to seven years of age, of both sexes, students at the Centro Educacional Terra Santa (Petrópolis/RJ). The children's guardians signed an informed consent form, and the study was approved by the Research Ethics Committee of HUPE-UERJ. The examinations were carried out after supervised toothbrushing, in a dental office under artificial light, after 3-5 s of drying with compressed air, by two examiners trained by the authors of the index and calibrated. Intra- and inter-examiner agreement was evaluated by percent agreement (%) and by the kappa test (k), taking the tooth surface as the unit of analysis and using the following cut-off points: 1) sound versus carious; 2) active versus inactive; 3) discontinuity versus sound; and 4) cavitated versus sound. The % and k values for inter-examiner reliability at each cut-off point were: 1) % = 0.97 and k = 0.82 (CI: 0.80-0.85); 2) % = 0.98 and k = 0.80 (CI: 0.76-0.83); 3) % = 0.99 and k = 0.90 (CI: 0.88-0.93); 4) % = 0.99 and k = 0.95 (CI: 0.92-0.97). The % and k values for intra-examiner reliability at each cut-off point were: 1) % = 0.98 and k = 0.86 (CI: 0.84-0.86); 2) % = 0.99 and k = 0.86 (CI: 0.83-0.89); 3) % = 0.99 and k = 0.94 (CI: 0.92-0.96); 4) % = 0.99 and k = 0.98 (CI: 0.96-0.99). The largest share of disagreements (65.3%; 158/242) concerned the distinction between sound surfaces and non-cavitated lesions: 33.5% (81/242) between sound surfaces and inactive non-cavitated lesions; 26.0% (63/242) between sound surfaces and active non-cavitated lesions; and 5.8% (14/242) between active and inactive non-cavitated lesions. The clinical examination took on average 226.5 s (128.53). It is concluded that the index showed reproducibility ranging from substantial to almost perfect and a feasible examination time, proving consistent and reproducible for clinical studies of dental caries in the primary dentition.
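Reliability here is reported as percent agreement and kappa over tooth surfaces. For reference, both statistics can be computed from two examiners' surface-level scores as follows (a generic sketch; the scores and category codes are placeholders, not data from the study):

```python
import numpy as np

def percent_agreement_and_kappa(rater1, rater2):
    """Percent agreement and Cohen's kappa for two raters' categorical scores."""
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    categories = np.union1d(r1, r2)
    n = len(r1)
    cm = np.zeros((len(categories), len(categories)))        # confusion matrix
    for a, b in zip(r1, r2):
        cm[np.searchsorted(categories, a), np.searchsorted(categories, b)] += 1
    p_o = np.trace(cm) / n                                    # observed agreement
    p_e = np.sum(cm.sum(axis=1) * cm.sum(axis=0)) / n**2      # chance agreement
    kappa = (p_o - p_e) / (1.0 - p_e)
    return p_o, kappa

# toy example: surface scores 0 = sound, 1 = active lesion, 2 = inactive lesion
ex1 = [0, 0, 1, 2, 0, 1, 0, 2, 0, 0]
ex2 = [0, 0, 1, 2, 0, 0, 0, 2, 0, 1]
print(percent_agreement_and_kappa(ex1, ex2))
```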
Abstract:
Maternal exposure during gestation to a protein-restricted (LP) diet impairs the development of the endocrine pancreas in the offspring and increases susceptibility to hypertension, diabetes and obesity in adult life. There is evidence that this phenomenon may persist in subsequent generations. The objective was to evaluate the effect of protein restriction on glucose metabolism and pancreatic morphometry in the F3 offspring of mice at birth and at weaning. To this end, virgin female Swiss mice (F0) were mated and received either a normal-protein diet (19% protein, NP) or an isocaloric protein-restricted diet (5% protein, LP) throughout pregnancy. During lactation and the remainder of the experiment, all groups received the NP diet. The male offspring were designated F1 (NP1 and LP1). F1 and F2 females were mated to produce F2 and F3 (NP2, LP2, NP3 and LP3), respectively. The pups were weighed weekly and the allometric growth rate was calculated (log[body mass] = log a + (log b) · age). The animals were sacrificed at 1 and 21 days of age, blood glucose was determined, and the pancreas was removed, weighed and analyzed by stereology and immunofluorescence; insulin was measured at 21 days. As results, the restricted pups of the first generation (LP1) were smaller at birth but showed accelerated growth during the first seven days of life, catching up with the controls; the LP2 offspring had the highest body mass at birth and a slower growth rate during lactation; there was no difference in body mass or growth rate in the F3 generation. Pancreas mass was reduced in LP1-LP3 at birth but increased in LP2 at weaning. Islet volume density and islet diameter were lower in all restricted groups on days 1 and 21; only LP1 had a reduced number of islets. At birth, beta-cell mass was lower in LP1-LP3 and remained low during lactation. On days 1 and 21 the pups were normoglycemic, but they were hypoinsulinemic at weaning. Therefore, protein restriction in mice during gestation produces morphological alterations in the pancreatic islets, suggesting that glucose homeostasis was maintained by increased insulin sensitivity during the early stages of life in the offspring over three consecutive generations.
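The growth-rate model quoted above, log(body mass) = log a + (log b) · age, is a straight line in age once body mass is log-transformed, so a and b can be recovered by ordinary linear regression. A minimal sketch with placeholder data (not values from the study):

```python
import numpy as np

# Fit log(body mass) = log(a) + log(b) * age by ordinary least squares.
age = np.array([1, 7, 14, 21], dtype=float)      # postnatal age in days (placeholder)
mass = np.array([1.6, 4.1, 7.0, 10.5])           # body mass in g (placeholder data)

slope, intercept = np.polyfit(age, np.log(mass), deg=1)
a, b = np.exp(intercept), np.exp(slope)          # back-transform the fitted parameters
print(f"a = {a:.3f} g, b = {b:.4f} per day")
```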