954 results for MeSH
Abstract:
This paper investigates the use of the acoustic emission (AE) monitoring technique for identifying the damage mechanisms that arise in paper during its production process. The microscopic structure of paper consists of a random mesh of paper fibres connected by hydrogen bonds. This implies the existence of two damage mechanisms: the failure of a fibre-fibre bond and the failure of a fibre. This paper describes a hybrid mathematical model which couples the mechanics of the mass-spring model to the acoustic wave propagation model in order to generate the acoustic signal emitted by complex structures of paper fibres under strain. The derivation of the mass-spring model can be found in [1,2], with details of the acoustic wave equation found in [3,4]. The numerical implementation of the vibro-acoustic model is discussed in detail, with particular emphasis on the damping present in the numerical model. The hybrid model uses an implicit solver which intrinsically introduces artificial damping into the solution. The artificial damping is shown to affect the frequency response of the mass-spring model, so certain restrictions on the simulation time step must be enforced for the model to produce physically accurate results. The hybrid mathematical model is used to simulate small fibre networks to provide information on the acoustic response of each damage mechanism. The simulated AEs are then analysed using a continuous wavelet transform (CWT), described in [5], which provides a two-dimensional time-frequency representation of the signal. The AEs from the two damage mechanisms show different characteristics in the CWT, making it possible to define a fibre-fibre bond failure by the criteria listed below. The dominant frequency components of the AE must be at approximately 250 kHz and 750 kHz; the strongest component may be at either frequency.
The duration of the frequency component at approximately 250 kHz is longer than that of the frequency component at approximately 750 kHz. Similarly, the criteria for identifying a fibre failure are given below. The dominant frequency component of the AE must be greater than 800 kHz. The duration of the dominant frequency component must be less than 5 × 10⁻⁶ seconds. The dominant frequency component must be present at the front of the AE. Essentially, the failure of a fibre-fibre bond produces a low-frequency wave and the failure of a fibre produces a high-frequency pulse. Using these theoretical criteria, it is now possible to train an intelligent classifier such as the Self-Organising Map (SOM) [6] using the experimental data. First, certain features must be extracted from the CWTs of the AEs for use in training the SOM. For this work, each CWT is divided into 200 windows, each 5 × 10⁻⁶ s in duration and covering a 100 kHz frequency range. The power ratio for each window is then calculated and used as a feature. Having extracted the features from the AEs, the SOM can now be trained, but care is required so that both damage mechanisms are adequately represented in the training set. This is an issue for paper, as the failure of fibre-fibre bonds is the prevalent damage mechanism. Once a suitable training set is found, the SOM can be trained and its performance analysed. The SOM described in this work is likely to classify the experimental AEs correctly.
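The windowed power-ratio feature extraction described above can be sketched as follows. This is a minimal illustration, assuming a precomputed CWT power map on a regular time-frequency grid; the function name and the toy grid resolution are ours, not the paper's.

```python
import numpy as np

def window_power_ratios(cwt_power, dt, df, win_t=5e-6, win_f=100e3):
    """Divide a CWT power map (time x frequency) into fixed windows and
    return each window's share of the total signal power as a feature."""
    nt = max(1, int(round(win_t / dt)))   # samples per time window
    nf = max(1, int(round(win_f / df)))   # bins per frequency band
    total = cwt_power.sum()
    ratios = []
    for i in range(0, cwt_power.shape[0], nt):
        for j in range(0, cwt_power.shape[1], nf):
            ratios.append(cwt_power[i:i + nt, j:j + nf].sum() / total)
    return np.array(ratios)

# toy example: 100 us record sampled every 1 us, 1 MHz band in 50 kHz bins,
# giving the 200 windows (20 time slots x 10 frequency bands) used as features
rng = np.random.default_rng(0)
power = rng.random((100, 20))
feats = window_power_ratios(power, dt=1e-6, df=50e3)
```

Because each feature is normalised by the total power, the 200 ratios sum to one, which keeps the SOM input scale-independent of the absolute AE amplitude.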
Abstract:
Solder constitutive models are important because they are widely used in FEA simulations to predict the lifetime of soldered assemblies. This paper briefly reviews some common constitutive laws for capturing creep in solder and presents work on laws capturing both kinematic hardening and damage. Inverse analysis is used to determine constants for the kinematic hardening law which match experimental creep curves. The mesh dependence of the damage law is overcome by volume averaging, and the law is applied to predict the crack path in a thermally cycled resistor component.
Abstract:
Light has the greatest information-carrying potential of all the perceivable interconnect media; consequently, optical fibre interconnects rapidly replaced copper in telecommunications networks, providing bandwidth capacity far in excess of their predecessors. As a result, the modern telecommunications infrastructure has evolved into a global mesh of optical networks, with VCSELs (Vertical Cavity Surface Emitting Lasers) dominating the short-link markets, predominantly due to their low cost. This cost benefit of VCSELs has allowed optical interconnects to again replace bandwidth-limited copper as bottlenecks appear on VSR (Very Short Reach) interconnects between co-located equipment inside the CO (Central Office). Spurred by the successful deployment in the VSR domain, and in response to both intra-board backplane applications and inter-board requirements to extend the bandwidth between ICs (Integrated Circuits), current research is migrating optical links toward board-level USR (Ultra Short Reach) interconnects. Whilst reconfigurable Free-Space Optical Interconnects (FSOI) are an option, they are complicated by precise line-of-sight alignment conditions; hence benefits exist in developing guided-wave technologies, which have been classified into three generations. First- and second-generation technologies are based upon optical fibres and are both capable of providing a suitable platform for intra-board applications. However, to allow component assembly, an integral requirement for inter-board applications, third-generation Opto-Electrical Circuit Boards (OECBs) containing embedded waveguides are desirable. Currently, the greatest challenge preventing the deployment of OECBs is achieving the out-of-plane coupling to SMT devices. With the most suitable low-cost platform being to integrate the optics into the OECB manufacturing process, several research avenues are being explored, although none to date has demonstrated sufficient coupling performance.
Once in place, OECB assemblies will raise new reliability issues, such as assembly configurations, manufacturing tolerances and hermetic requirements, that will also require development before total off-chip photonic interconnection can truly be achieved.
Abstract:
Developing temperature fields in frozen cheese sauce undergoing microwave heating were simulated and measured. Two scenarios were investigated: centric and offset placement on the rotating turntable. Numerical modelling was performed using a dedicated electromagnetic Finite Difference Time Domain (FDTD) module that was two-way coupled to the PHYSICA multiphysics package. Two meshes were used: the food material and container were meshed for the heat transfer, and the microwave oven cavity and waveguide were meshed for the microwave field. Power densities obtained on the structured FDTD mesh were mapped onto the unstructured finite volume method mesh for each time-step/turntable position. After heating over each specified time-step, the temperature field was mapped back onto the FDTD mesh and the electromagnetic properties were updated accordingly. Changes in thermal/electric properties associated with the phase transition were fully accounted for, as well as heat losses from product to cavity. Detailed comparisons were carried out for the centric and offset placements, comparing experimental temperature profiles during microwave thawing with those obtained by numerical simulation.
Abstract:
In this chapter we look at JOSTLE, the multilevel graph-partitioning software package, and highlight some of the key research issues that it addresses. We first outline the core algorithms and place them in the context of the multilevel refinement paradigm. We then look at issues relating to its use as a tool for parallel processing and, in particular, partitioning in parallel. Since its first release in 1995, JOSTLE has been used for many mesh-based parallel scientific computing applications, and so we also outline some enhancements, such as multiphase mesh-partitioning, heterogeneous mapping and partitioning to optimise subdomain shape.
Abstract:
At present, the vast majority of Computer-Aided Engineering (CAE) analysis calculations for microelectronic and microsystems technologies are undertaken using software tools that focus on single aspects of the physics taking place. For example, the design engineer may use one code to predict the airflow and thermal behavior of an electronic package, then another code to predict the stress in solder joints, and then yet another code to predict electromagnetic radiation throughout the system. The reason mesh-based codes focus on separate parts of the governing physics lies essentially in the numerical technologies used to solve the partial differential equations, combined with the subsequent heritage structure of the software codes. Using different software tools, each of which requires model building and meshing, leads to a large investment in time, and hence cost, to undertake each of the simulations. During the last ten years there have been significant developments in the modelling community around multi-physics analysis. These developments are being followed by many of the code vendors, who are now providing multi-physics capabilities in their software tools. This paper illustrates current capabilities of multi-physics technology and highlights some of the future challenges.
Abstract:
Thawing of a frozen food product in a domestic microwave oven is numerically simulated using a coupled solver approach. The approach consists of a dedicated electromagnetic FDTD solver and a closely coupled UFVM multi-physics package. Two overlapping numerical meshes are defined: the food material and container were meshed for the heat transfer and phase change solution, whilst the microwave oven cavity and waveguide were meshed for the microwave irradiation. The two solution domains were linked using a cross-mapping routine. This approach allowed the rotation of the food load to be captured. Power densities obtained on the structured FDTD mesh were interpolated onto the UFVM mesh for each time-step/turntable position. The UFVM solver utilised the power density data to advance the temperature and phase distribution solution. The temperature-dependent dielectric and thermo-physical properties of the food load were updated prior to revising the electromagnetic solution. Changes in thermal/electric properties associated with the phase transition were fully accounted for, as well as heat losses from product to cavity. Two scenarios were investigated: centric and eccentric placement on the turntable. Developing temperature fields predicted by the numerical solution are validated against experimentally obtained data. The presented results indicate the feasibility of fully coupled simulations of the microwave heating of a frozen product. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
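The structured-to-unstructured mapping step described above can be sketched as a nearest-cell lookup of the FDTD power density at each unstructured-mesh node. This is a simplification for illustration only: the paper's actual cross-mapping routine is not specified here, and the grid dimensions and values are invented.

```python
import numpy as np

def map_power_density(grid_power, origin, spacing, points):
    """Sample a power density stored on a uniform (FDTD-style) structured
    grid at arbitrary unstructured-mesh node positions by locating the
    cell containing each point (nearest-cell lookup, clipped to the grid)."""
    idx = np.floor((np.asarray(points) - origin) / spacing).astype(int)
    idx = np.clip(idx, 0, np.array(grid_power.shape) - 1)
    return grid_power[idx[:, 0], idx[:, 1], idx[:, 2]]

# toy 4x4x4 grid with 0.1 m cells; one cell carries 7.5 W/m^3
grid = np.zeros((4, 4, 4))
grid[1, 2, 3] = 7.5
nodes = np.array([[0.15, 0.25, 0.35], [0.01, 0.01, 0.01]])
q = map_power_density(grid, origin=0.0, spacing=0.1, points=nodes)
```

A production coupling would typically use a conservative interpolation rather than nearest-cell sampling, so that the total absorbed power is preserved on the receiving mesh.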
Abstract:
A numerical modelling method for the analysis of solder joint damage and crack propagation is described in this paper. The method is based on the disturbed state concept. Under cyclic thermal-mechanical loading conditions, the level of damage that occurs in solder joints is assumed to be a simple monotonic scalar function of the accumulated equivalent plastic strain. The increase of damage leads to crack initiation and propagation. By tracking the evolution of the damage level in solder joints, the crack propagation path and rate can be simulated using the finite element analysis method. The discussion focuses on issues in the implementation of the method. Techniques for speeding up the simulation and the mesh dependency issues are analysed. As an example of the application of this method, crack propagation in solder joints of power electronics modules under cyclic thermal-mechanical loading conditions has been analysed, and the predicted cracked area size after 3000 loading cycles is consistent with experimental results.
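The damage evolution idea above can be sketched as a monotonic scalar function of accumulated equivalent plastic strain. The exponential form, the constants `a` and `d_crit`, and the per-cycle strain increment below are all assumptions made for this sketch; they are not the paper's calibrated law.

```python
import math

def damage(eps_acc, a=50.0):
    """Illustrative monotonic damage law: D grows from 0 toward 1 with the
    accumulated equivalent plastic strain eps_acc (form is an assumption)."""
    return 1.0 - math.exp(-a * eps_acc)

def is_cracked(eps_acc, a=50.0, d_crit=0.9):
    """Treat a material point as part of the crack once D reaches d_crit,
    which is how tracking the damage field traces the crack path."""
    return damage(eps_acc, a) >= d_crit

# accumulate an (assumed constant) plastic strain increment per thermal cycle
eps = 0.0
for cycle in range(3000):
    eps += 2.0e-5
```

In an FE implementation the increment would come from the constitutive solution at each integration point, and elements whose points exceed `d_crit` would have their stiffness degraded, letting the cracked region grow cycle by cycle.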
Abstract:
Financial modelling in the area of option pricing involves understanding the correlations between asset price movements and buy/sell decisions in order to reduce investment risk. Such activities depend on financial analysis tools being available to the trader, with which rapid and systematic evaluations of buy/sell contracts can be made. In turn, analysis tools rely on fast numerical algorithms for the solution of financial mathematical models. There are many different financial activities apart from the buying and selling of shares. The main aim of this chapter is to discuss a distributed algorithm for the numerical solution of a European option. Both linear and non-linear cases are considered. The algorithm is based on the concept of the Laplace transform and its numerical inverse. The scalability of the algorithm is examined. Numerical tests are used to demonstrate the effectiveness of the algorithm for financial analysis. Time-dependent functions for volatility and interest rates are also discussed. Applications of the algorithm to the non-linear Black-Scholes equation, where the volatility and the interest rate are functions of the option value, are included. Some qualitative results on the convergence behaviour of the algorithm are examined. This chapter also examines the various computational issues of the Laplace transformation method in terms of distributed computing. The idea of using a two-level temporal mesh in order to achieve distributed computation along the temporal axis is introduced. Finally, the chapter ends with some conclusions.
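The Laplace-transform approach rests on a numerical inverse transform whose evaluations at different transform-space points are independent, which is what makes the method attractive for distributed computing. As an illustration (not the chapter's specific algorithm), the Gaver-Stehfest inversion below recovers a function from its transform using only real-valued evaluations, verified here on the known pair L{e^(-t)} = 1/(s+1):

```python
import math

def stehfest_weights(N):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append(((-1) ** (k + N // 2)) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s). Each time point
    needs N independent evaluations of F, so the work parallelises
    trivially across the temporal axis."""
    ln2 = math.log(2.0)
    V = stehfest_weights(N)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))

# sanity check against the known pair L{exp(-t)} = 1/(s + 1)
f_approx = invert_laplace(lambda s: 1.0 / (s + 1.0), t=1.0)
```

For option pricing, `F` would be the transformed Black-Scholes solution; each worker can invert the transform for its own set of time points with no communication, which is the essence of the distributed temporal decomposition discussed in the chapter.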
Abstract:
The Continuous Plankton Recorder (CPR) survey has been used to characterize phytoplankton and zooplankton space-time dynamics in the North Sea since 1931 and in the North Atlantic since 1939. Phytoplankton biomass is assessed from these samples by visual assessment of the green color of the silk mesh, the Phytoplankton Color Index (PCI), and the total count of diatoms and dinoflagellates. Species with a frequency of occurrence greater than 1% in the samples are used as indicator species of the community. We investigated (1) long-term fluctuations of phytoplankton biomass, total diatoms, and total dinoflagellates; (2) geographical variation of patterns; (3) the relationship between phytoplankton and climate forcing in the North Atlantic CPR samples; (4) the relative contribution of diatoms and dinoflagellates to the PCI; and (5) the fluctuations of the dominant species over the period of survey to provide more information on the processes linking climate to changes in the phytoplankton community. As a result of the differences in microscopic analysis methods prior to 1958, our analyses were conducted for the period ranging from 1958 to 2002. The North Atlantic was divided into six regions identified through bathymetric criteria and separated along a North-South axis. Based on 12 monthly time series, we demonstrate increasing trends in PCI and total dinoflagellates and a decrease in total diatoms.
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A study was therefore commissioned by the Joint Nature Conservation Committee, as a part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the material archived and the procedure and cost to undertake further work (Phase 1 of the study reported here: Oakley & Hiscock, 2005). Some image analysis was undertaken as a part of Phase 1. Phase 2 (this report) was to further analyse selected images. Having determined in Phase 1 that only the 35 mm photographic transparencies provided sufficient clarity to identify species and biotopes, the tows selected for analysis were ones where 35 mm images had been taken. The tows selected were mainly in the vicinity of Plymouth, especially along the area between Rame Head and the region of the Eddystone. The 35 mm films were viewed under a binocular microscope and the taxa that could be recognised were recorded in note form. Twenty-five images were selected for inclusion in the report. Almost all of the images were of level sediment seabed. Where rocks were included, it was usually unplanned, and the sled was hauled before being caught or damaged. The main biotopes or biotope complexes identified were: SS.SMU.CSaMu, circalittoral sandy mud. Extensively present between the shore and the Eddystone Reef complex, at depths of about 48 to 52 m. At one site offshore of Plymouth Sound, the turret shell Turritella communis was abundant. In some areas, this biotope had dense anemones, Mesacmaea mitchelli and (more rarely) Cerianthus lloydii.
Queen scallops, Aequipecten opercularis, and king scallops, Pecten maximus, were sometimes present in small numbers. Hard-substratum species such as hydroids, dead men's fingers Alcyonium digitatum and the cup coral Caryophyllia smithii occurred in a few places, probably attached to shells or stones beneath the surface. South of the spoil ground off Hilsea Point, at 57 m depth, the sediment was muddier but is still assigned to this biotope complex. It is notable that three small sea pens, most likely Virgularia mirabilis, were seen here. SS.SMx.CMx, circalittoral mixed sediment. Further offshore, but at about the same depth as SS.SMU.CSaMu, coarse gravel with some silt was present. The sediment was characterised most conspicuously by small queen scallops, Aequipecten opercularis. Peculiarly, there were ‘bundles’ of the branching bryozoan Cellaria sp., a species normally found attached to rock. It could not be seen whether these bundles of Cellaria had been brought together by terebellid worms, but it is notable that Cellaria is recorded in historical surveys. As with many other sediments, there were occasional brittle stars, Ophiocomina nigra and Ophiura ophiura. Where sediments were muddy, the burrowing anemone Mesacmaea mitchelli was common. Where pebbles or cobbles occurred, there were attached species such as Alcyonium digitatum, Caryophyllia smithii and the fleshy bryozoan Alcyonidium diaphanum. Undescribed biotope: although most likely a part of SS.SMx.CMx, the biotope visually dominated by a terebellid worm believed to be Thelepus cincinnatus is worth special attention, as it may be an undescribed biotope. This biotope occurred about 22 nautical miles south of the latitude of the Eddystone, at depths in excess of 70 m. SS.SCS.CCS.Blan: Branchiostoma lanceolatum in circalittoral coarse sand with shell gravel, at about 48 m depth and less. This habitat was the ‘classic’ Eddystone Shell Gravel, which is sampled for Branchiostoma lanceolatum.
However, no Branchiostoma lanceolatum could be seen. The gravel was almost entirely bare of epibiota. There were occasional rock outcrops or cobbles which had epibiota including encrusting calcareous algae, the sea fan Eunicella verrucosa, cup corals Caryophyllia smithii, hydroids and the sea urchin Echinus esculentus. The variety of species visible on the surface is small, and therefore identification to biotope is not usually possible. Historical records from sampling surveys that used grabs and dredges at the end of the 19th century and early in the 20th century suggest similar species were present then. Illustrations of some of the infaunal communities from work in the 1920s are included in this report to provide a context for the epifaunal photographs.
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as a part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the material archived and the procedure and cost to undertake further work. The results of the scoping study are: 1. The NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths, up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters. 2. The reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted ‘long-life’ medium (Betamax) would be approximately £15,000. It was not possible to view the tapes as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to be beyond salvage within one to two years. 3. The video cassette material is in good condition and is expected to remain so for several more years at least. The images viewed were generally of poor quality, and the speed of tow often makes the pictures blurred. No immediate action is required. 4. The colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic as there are no between-frame breaks between images, and scanning machines need to centre each image based on such breaks.
The minimum cost to scan all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image and, all in all, it would seem most economic to purchase a ‘continuous film’ scanner and undertake the work in-house. 5. Positional information in the ships' logs has been matched to films and to video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine has been developed to convert these to the decimal degrees required for GIS mapping. However, it is unclear whether corrections to Decca positions were applied at the time each position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart. 6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT. 7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least in the order of 200 hours and is not recommended. 8. Once the colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended. 9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to resolve the Decca correction question would improve the accuracy of image location. 10.
Using the codings produced by Holme to identify different seabed types, and some viewing of video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis. 11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall ‘survey’ will be ‘English Channel towed video sled survey’. The ‘events’ become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle, to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates. The four samples would link to a GIS map. 12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level seabed types found along a part of the northern English Channel. More recent images, taken during SCUBA diving on reef habitats in the same area as the towed sledge surveys, could be added to the Holme images.
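The conversion from degrees/minutes/seconds to the decimal degrees needed for GIS mapping, mentioned in the scoping-study results, can be sketched as below. The function name and the example coordinates are illustrative only; the archive's actual Decca correction question is a separate, unresolved issue.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere='N'):
    """Convert degrees/minutes/seconds to signed decimal degrees
    (negative for southern latitudes and western longitudes)."""
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ('S', 'W') else dd

# illustrative position only, roughly in the survey area south of Plymouth
lat = dms_to_decimal(50, 18, 30, 'N')
lon = dms_to_decimal(4, 9, 0, 'W')
```

Start and end positions converted this way can be loaded directly into a spreadsheet or GIS layer, matching the workflow described in points 5 and 6.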
Abstract:
Meiofaunal organisms are mobile multicellular animals that are smaller than macrofauna and larger than microfauna. The size boundaries of meiofauna are generally based on the standardised mesh apertures of sieves with 500 μm (or 1000 μm) as upper and 63 μm (or 42 μm) as lower limits. Meiofauna are ubiquitous, inhabiting most marine substrata, often in high densities. Meiofauna are highly diverse, and several phyla are only known to occur as meiofauna. Owing to their small size and high densities, specialised techniques are required to collect, preserve and examine meiofauna. These are described, along with approaches to determine biomass of these small animals. Their small size also makes them useful candidates for manipulative experiments, and culturing of individual species and approaches to experiments on whole communities are briefly discussed.
Abstract:
The Continuous Plankton Recorder (CPR) survey of the North Pacific is a PICES project, now in its ninth year and facing an uncertain future. CPRs have been towed behind commercial ships along two (north–south and east–west) transects approximately nine times per year. Samples are collected with a filtering mesh and are then processed microscopically in the laboratory for plankton abundance. The survey has so far accumulated 3,648 processed samples (with approximately three times as many archived without processing), each representing 18 km of transect (Fig. 1) and containing abundance data on over 290 phytoplankton and zooplankton taxa. In more recent years, a CTD with a fluorometer has been attached to the CPR on the east–west transect to provide supplementary environmental data.
Abstract:
Interannual and seasonal trends in zooplankton abundance and species composition were compared between the Bongo net and Continuous Plankton Recorder (CPR) time series in the Gulf of Maine. Data from 5799 Bongo and 3118 CPR samples were compared for the years 1978–2006. The two programs use different sampling methods: the Bongo time series is composed of bimonthly, vertically integrated samples from locations throughout the region, while the CPR was towed monthly at 10 m depth along a transect that bisects the region. A significant correlation was found between the interannual (r = 0.67, P < 0.01) and seasonal (r = 0.95, P < 0.01) variability of total zooplankton counts. Abundance rankings of individual taxa were highly correlated, and temporal trends of the dominant copepods were similar between samplers. Multivariate analysis also showed that both time series equally detected major shifts in community structure through time. However, absolute abundance levels were higher in the Bongo samples, and temporal patterns for many of the less abundant taxa were not similar between the two devices. The different mesh sizes of the samplers probably caused some of the discrepancies, but diel migration patterns, damage to soft-bodied animals and avoidance of the small CPR aperture by some taxa likely contributed to the catch differences between the two devices. Nonetheless, the Bongo data presented here confirm the previously published patterns found in the CPR data set, and both show that the abundance increase of the 1990s has been followed by average to below-average levels from 2002 to 2006.
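The interannual comparison reported above (r = 0.67 between samplers) is a Pearson correlation of annual-mean abundance series. A minimal sketch of that computation on synthetic data follows; the trend, noise levels and seed are invented for illustration and do not reproduce the study's values.

```python
import numpy as np

def interannual_correlation(counts_a, counts_b):
    """Pearson correlation between two samplers' annual-mean abundance
    series, as used to compare the Bongo and CPR time series."""
    return np.corrcoef(counts_a, counts_b)[0, 1]

# synthetic annual means, 1978-2006, sharing a declining common trend
# plus sampler-specific noise; the Bongo catches higher absolute counts
rng = np.random.default_rng(1)
years = np.arange(1978, 2007)
trend = np.linspace(10.0, 4.0, years.size)
bongo = trend * 3.0 + rng.normal(0.0, 0.5, years.size)
cpr = trend + rng.normal(0.0, 0.5, years.size)
r = interannual_correlation(bongo, cpr)
```

Note that the correlation is high even though the absolute levels differ by a factor of three, mirroring the study's finding that the samplers agree on temporal pattern but not on absolute abundance.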