388 results for computer software


Relevance:

60.00%

Publisher:

Abstract:

The high-intensity, high-resolution x-ray source at the European Synchrotron Radiation Facility (ESRF) has been used in x-ray diffraction (XRD) experiments to detect intermetallic compounds (IMCs) in lead-free solder bumps. The IMCs found in 95.5Sn3.8Ag0.7Cu solder bumps on Cu pads with electroless nickel immersion gold (ENIG) surface finish are consistent with results based on traditional destructive methods. Moreover, after positive identification of the IMCs from the diffraction data, spatial distribution plots over the entire bump were obtained. These spatial distributions for selected intermetallic phases display the layer thickness and confirm the locations of the IMCs. For isothermally aged solder samples, results have shown that much thicker layers of IMCs have grown from the pad interface into the bulk of the solder. Additionally, the XRD technique has been used in a temperature-resolved mode to observe the formation of IMCs, in situ, during the solidification of the solder joint. The results demonstrate that the XRD technique is very attractive as it allows for nondestructive investigations to be performed on expensive state-of-the-art electronic components, thereby allowing new, lead-free materials to be fully characterized.
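
As an illustration of how spatial distribution plots of this kind can be assembled from scanning-XRD data, the short sketch below integrates the diffracted intensity in a narrow 2-theta window (one reflection of a selected IMC phase, e.g. Cu6Sn5) at every scan position and returns a 2-D map. The data layout, the peak window and all names are assumptions for illustration only, not the analysis pipeline used at the ESRF.

    # Illustrative sketch: build a spatial map of one intermetallic phase from
    # scanning-XRD data.  The data layout and the 2-theta window are assumptions.
    import numpy as np

    def phase_map(patterns, two_theta, window):
        """patterns: array (ny, nx, n_2theta) of diffracted intensity at each
        scan point; two_theta: 1-D array of 2-theta values; window: (lo, hi)
        in degrees bracketing a reflection of the phase of interest."""
        mask = (two_theta >= window[0]) & (two_theta <= window[1])
        # Integrated peak intensity at every (y, x) position -> proxy for the
        # local amount of the selected intermetallic phase.
        return patterns[:, :, mask].sum(axis=2)

    # Example with synthetic data: a 20 x 20 scan, 1000-point patterns.
    rng = np.random.default_rng(0)
    tt = np.linspace(20.0, 80.0, 1000)
    data = rng.random((20, 20, 1000))
    imc_map = phase_map(data, tt, (43.0, 44.0))   # hypothetical peak window
    print(imc_map.shape)                          # (20, 20) spatial distribution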

Relevance:

60.00%

Publisher:

Abstract:

The needs for various forms of information systems relating to the European environment and ecosystem are reviewed, and their limitations indicated. Existing information systems are reviewed and compared in terms of aims and functionalities. We consider two technical challenges involved in attempting to develop an IEEICS. First, there is the challenge of developing an Internet-based communication system which allows fluent access to information stored in a range of distributed databases. Some of the currently available solutions, i.e. Web service federations, are considered. The second main challenge arises from the fact that there is general intra-national heterogeneity in the definitions adopted and the measurement systems used throughout the nations of Europe. Integrated strategies are needed.
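
A minimal sketch of the first challenge, assuming a federation-style architecture: a query is fanned out concurrently to a set of member databases and the responses are merged into one result set. The service functions below are local stand-ins for remote national web services; every name is hypothetical.

    # Illustrative sketch of a federated query: each national environmental
    # database is represented here by a stub function standing in for a remote
    # web service; a real system would issue HTTP/SOAP requests instead.
    from concurrent.futures import ThreadPoolExecutor

    def query_service_fr(species):          # stand-in for a French endpoint
        return [{"country": "FR", "species": species, "sites": 42}]

    def query_service_de(species):          # stand-in for a German endpoint
        return [{"country": "DE", "species": species, "sites": 57}]

    SERVICES = [query_service_fr, query_service_de]

    def federated_query(species):
        # Fan the query out to all member services concurrently and merge the
        # responses into a single result set for the client.
        with ThreadPoolExecutor(max_workers=len(SERVICES)) as pool:
            futures = [pool.submit(svc, species) for svc in SERVICES]
            merged = []
            for f in futures:
                merged.extend(f.result())
        return merged

    print(federated_query("Fagus sylvatica"))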

Relevance:

60.00%

Publisher:

Abstract:

This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of the primary users (land-use planners, conservationists, and environmental researchers) we ask what use may be made of the EEICS for building models and tools which are of use in constructing decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS; statistical tabulation and database (SQL) methods; and MDA and OLAP methods. The major problem of noncomparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear and that a heuristic hybrid modelling approach should be used, so as to ensure that the benefits of practical empirical modelling approaches are utilised in addition to the scientifically well-founded and holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
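
As a small illustration of the SQL/OLAP style of tabulation mentioned above, and of why noncomparable national definitions frustrate naive aggregation, the sketch below groups forest-area figures by the crown-cover threshold under which they were compiled; the schema and all numbers are invented placeholders, not real statistics.

    # Illustration of SQL-style tabulation and the noncomparability problem:
    # national figures compiled under different forest definitions (here, the
    # minimum crown-cover threshold) cannot simply be summed.  Schema and
    # figures are arbitrary placeholders.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE forest_area
                   (country TEXT, area_kha REAL, min_crown_cover_pct INTEGER)""")
    con.executemany("INSERT INTO forest_area VALUES (?, ?, ?)",
                    [("SE", 28000, 10), ("FI", 22000, 10),
                     ("FR", 17000, 20), ("ES", 18000, 20)])

    # Aggregating within each definition is well founded; a grand total is not,
    # unless a model-based harmonisation step maps all figures to one definition.
    for row in con.execute("""SELECT min_crown_cover_pct, SUM(area_kha)
                              FROM forest_area
                              GROUP BY min_crown_cover_pct"""):
        print(row)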

Relevance:

60.00%

Publisher:

Abstract:

The Symposium “Towards the sustainable use of Europe’s forests”, with the sub-title “Forest ecosystem and landscape research: scientific challenges and opportunities”, lists three fundamental substantive areas of research: Forest management and practices, Ecosystem processes and functional ecology, and Environmental economics and sociology. This paper argues that essential catalytic elements are missing. Without these elements there is a great danger that the aimed-for world leadership in the forest sciences will not materialize. What are the missing elements? All the sciences, and in particular biology, environmental sciences, sociology, economics, and forestry, have evolved so that they include good scientific methodology. Good methodology is imperative in the design and analysis of research studies, in the management of research data, and in the interpretation of research findings. The methodological disciplines of Statistics, Modelling and Informatics (“SMI”) are crucial elements in a proposed Centre for European Forest Science, and the full involvement of professionals in these methodological disciplines is needed if the research of the Centre is to be world-class. The Distributed Virtual Institute (DVI) for Statistics, Modelling and Informatics in Forestry and the Environment (SMIFE) is a consortium with the aim of providing world-class methodological support and collaboration to European research in the areas of Forestry and the Environment. It is suggested that DVI: SMIFE should be a formal partner in the proposed Centre for European Forest Science.

Relevance:

60.00%

Publisher:

Abstract:

This paper suggests a possible framework for the encapsulation of the decision-making process for the Waterime project. The final outcome may be a computerised model, but the process advocated is not prescriptive and involves the production of a "paper model" as a mediating representation between the knowledge acquired and any computerised system. This paper model may suffice in terms of the project's goals.

Relevance:

60.00%

Publisher:

Abstract:

Many Web applications walk the thin line between the need for dynamic data and the need to meet user performance expectations. In environments where funds are not available to constantly upgrade hardware in line with user demand, alternative approaches need to be considered. This paper introduces a ‘data farming’ model whereby dynamic data, which is ‘grown’ in operational applications, is ‘harvested’ and ‘packaged’ for various consumer markets. Like any well-managed agricultural operation, crops are harvested according to historical and perceived demand, as inferred by a self-optimising process. This approach aims to make enhanced use of available resources through better utilisation of system downtime, thereby improving application performance and increasing the availability of key business data.
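
A minimal sketch of the self-optimising harvesting idea, under assumed names: demand for dynamic queries is counted as requests arrive, and a scheduler invoked during system downtime regenerates and caches ('packages') the most requested results.

    # Hypothetical sketch of the 'data farming' idea: track demand for dynamic
    # queries and, during system downtime, 'harvest' the most requested ones
    # into pre-packaged (cached) results served cheaply at peak times.
    from collections import Counter
    from datetime import datetime

    demand = Counter()          # historical demand, inferred from live requests
    package_store = {}          # 'packaged' crops: query -> cached result

    def record_request(query):
        demand[query] += 1
        return package_store.get(query)   # serve a package if one exists

    def run_query(query):
        # Placeholder for the expensive operational-database query.
        return f"result of {query} at {datetime.now():%H:%M}"

    def harvest(top_n=3):
        # Called by a scheduler during off-peak hours: regenerate and package
        # the crops with the highest historical demand.
        for query, _count in demand.most_common(top_n):
            package_store[query] = run_query(query)

    record_request("daily_sales_summary")
    record_request("daily_sales_summary")
    record_request("stock_levels")
    harvest()
    print(list(package_store))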

Relevance:

60.00%

Publisher:

Abstract:

As part of a comprehensive effort to predict the development of caking in granular materials, a mathematical model is introduced to describe simultaneous heat and moisture transfer with phase change in porous media undergoing temperature oscillations/cycling. The resulting model partial differential equations were solved using finite-volume procedures in the context of the PHYSICA framework and then applied to the analysis of sugar in storage. The influence of temperature on absorption/desorption and diffusion coefficients is coupled into the transport equations. The temperature profile, the depth of penetration of the temperature oscillation into the bulk solid, and the solids moisture content distribution were first calculated, and these proved to be in good agreement with experimental data. Then, the influence of temperature oscillation on absolute humidity, moisture concentration, and moisture migration for different parameters and boundary conditions was examined. As expected, the results show that moisture near the boundary regions responds to surface temperature changes faster than moisture farther away from them. Moisture absorption and desorption in the material occur mainly near the boundary regions (where interactions with the environment are more pronounced). Small amounts of solids moisture content, driven by both temperature and vapour concentration gradients, migrate between the boundary and the centre as the temperature oscillates.
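
Not the PHYSICA formulation, but a minimal one-dimensional finite-volume illustration of one element of the problem, the penetration of a sinusoidal surface-temperature oscillation into a bulk solid; material properties, geometry and boundary treatment are arbitrary choices for the sketch.

    # Minimal 1-D finite-volume sketch (not the PHYSICA model): penetration of a
    # sinusoidal surface-temperature oscillation into a bulk granular solid.
    import math

    nx, L = 50, 0.5            # cells, domain depth in metres (arbitrary)
    dx = L / nx
    alpha = 1.0e-7             # thermal diffusivity, m^2/s (arbitrary)
    dt = 0.4 * dx * dx / alpha # stable explicit time step
    period = 24 * 3600.0       # 24 h temperature cycle
    T = [20.0] * nx            # initial uniform temperature, deg C

    t = 0.0
    while t < 2 * period:
        T_surf = 20.0 + 10.0 * math.sin(2 * math.pi * t / period)
        Tn = T[:]
        for i in range(nx):
            Tw = T_surf if i == 0 else T[i - 1]      # oscillating surface boundary
            Te = T[i] if i == nx - 1 else T[i + 1]   # zero-gradient far boundary
            Tn[i] = T[i] + alpha * dt / dx**2 * (Tw - 2 * T[i] + Te)
        T, t = Tn, t + dt

    # Amplitude decays with depth: only the region near the surface 'feels' the cycle.
    print([round(x, 2) for x in T[:10]])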

Relevance:

60.00%

Publisher:

Abstract:

There is concern in the Cross-Channel region of Nord-Pas-de-Calais (France) and Kent (Great Britain) regarding the extent of atmospheric pollution detected in the area from emitted gaseous (VOC, NOx, SO2) and particulate substances. In particular, the air quality of the Cross-Channel or "Trans-Manche" region is highly affected by the heavily industrialised area of Dunkerque, in addition to transportation sources linked to cross-channel traffic in Kent and Calais, posing threats to the environment and human health. In the framework of the cross-border EU Interreg IIIA activity, the joint Anglo-French project ATTMA has been commissioned to study Aerosol Transport in the Trans-Manche Atmosphere. Using ground monitoring data from UK and French networks, and with the assistance of satellite images, the project aims to determine dispersion patterns and identify the sources responsible for the pollutants. The findings of this study will increase awareness and have a bearing on future air quality policy in the region. Public interest is evidenced by the presence of local authorities on both sides of the English Channel as collaborators. The research is based on pollution transport simulations using (a) Lagrangian Particle Dispersion (LPD) models and (b) an Eulerian receptor-based model. This paper is concerned with part (a), the LPD models. Lagrangian Particle Dispersion models are often used to numerically simulate the dispersion of a passive tracer in the planetary boundary layer by calculating the Lagrangian trajectories of thousands of notional particles. In this contribution, the project investigated the use of two widely used particle dispersion models: the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model and the FLEXPART model. In both models, forward tracking and inverse (or receptor-based) modes are possible. Certain distinct pollution episodes have been selected from the monitoring database EXPER/PF and from UK monitoring stations, and their likely trajectories predicted using prevailing weather data. Global meteorological datasets were downloaded from the ECMWF MARS archive. Part of the difficulty in identifying pollution sources arises from the fact that much of the pollution originates outside the monitoring area. For example, heightened particulate concentrations are thought to originate from sand storms in the Sahara or volcanic activity in Iceland or the Caribbean; this work identifies such long-range influences. The output of the simulations shows that there are notable differences between the formulations of FLEXPART and HYSPLIT, although both models used the same meteorological data and source input, suggesting that the identification of the primary emissions during air pollution episodes may be rather uncertain.
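
The core of an LPD model, reduced to its simplest generic form (this is not HYSPLIT or FLEXPART): notional particles are advected by a mean wind and dispersed by a random-walk increment representing boundary-layer turbulence. All parameter values are arbitrary.

    # Generic Lagrangian particle dispersion sketch (not HYSPLIT/FLEXPART):
    # particles are advected by a mean wind and dispersed by a random-walk term
    # representing boundary-layer turbulence.
    import random, math

    def release(n_particles, steps, dt, u=(5.0, 1.0), K=50.0, src=(0.0, 0.0)):
        """u: mean wind (m/s), K: eddy diffusivity (m^2/s), src: source location.
        Forward tracking; running it with the wind reversed gives a crude
        receptor-based (back-trajectory) mode."""
        sigma = math.sqrt(2.0 * K * dt)
        parts = [list(src) for _ in range(n_particles)]
        for _ in range(steps):
            for p in parts:
                p[0] += u[0] * dt + random.gauss(0.0, sigma)
                p[1] += u[1] * dt + random.gauss(0.0, sigma)
        return parts

    cloud = release(n_particles=1000, steps=60, dt=60.0)   # 1 h of transport
    xs = [p[0] for p in cloud]
    print(min(xs), max(xs))    # spread of the notional plume downwind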

Relevance:

60.00%

Publisher:

Abstract:

In this paper, the buildingEXODUS evacuation model is described and discussed, and attempts at qualitative and quantitative model validation are presented. The data sets used for validation are the Stapelfeldt and Milburn House evacuation data. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables is examined, including occupant drive, occupant location, exit flow capacity, exit size, occupant response times and geometry definition. An important consideration highlighted by this work is that any validation exercise must be scrutinised to identify both the results generated and the considerations and assumptions on which they are based. During the course of the validation exercise, both data sets were found to be less than ideal for the purpose of validating complex evacuation models. However, the buildingEXODUS evacuation model was found to produce reasonable qualitative and quantitative agreement with the experimental data.

Relevance:

60.00%

Publisher:

Abstract:

We consider a single-machine due date assignment and scheduling problem of minimizing holding costs with no tardy jobs under series-parallel and a somewhat wider class of precedence constraints, as well as the properties of series-parallel graphs.

Relevance:

60.00%

Publisher:

Abstract:

A simulation program has been developed to calculate the power-spectral density of thin avalanche photodiodes, which are used in optical networks. The program extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response. However, the computation requires a large amount of memory space and is very time consuming. We describe our experiences in parallelizing the code using both MPI and OpenMP. Several array partitioning schemes and scheduling policies are implemented and tested. Our results show that the OpenMP code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors.
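
The MPI/OpenMP codes themselves are not reproduced here, but the partitioning idea can be illustrated: each lag of an autocorrelation is independent, so the lag range can be divided statically among workers, much as the array partitioning schemes divide work in the parallel code. The sketch below uses Python multiprocessing purely to show that decomposition; it is not the APD simulation program.

    # Illustration of the parallel decomposition only (not the APD code itself):
    # each lag of an autocorrelation is independent, so the lag range can be
    # partitioned across workers, as the MPI/OpenMP versions partition arrays.
    import numpy as np
    from multiprocessing import Pool

    x = np.random.default_rng(1).standard_normal(200_000)   # placeholder signal

    def acf_at_lag(k):
        # Unnormalised autocorrelation at lag k (k = 0 uses the full array).
        return float(np.dot(x[:-k or None], x[k:]) / x.size)

    if __name__ == "__main__":
        lags = range(0, 512)
        with Pool(processes=4) as pool:          # static partition over 4 workers
            acf = pool.map(acf_at_lag, lags, chunksize=128)
        print(acf[:3])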

Relevance:

60.00%

Publisher:

Abstract:

An important factor for high-speed optical communication is the availability of ultrafast and low-noise photodetectors. Among the semiconductor photodetectors that are commonly used in today’s long-haul and metro-area fiber-optic systems, avalanche photodiodes (APDs) are often preferred over p-i-n photodiodes due to their internal gain, which significantly improves the receiver sensitivity and alleviates the need for optical pre-amplification. Unfortunately, the very process of carrier impact ionization, which generates the gain, is inherently noisy and results in fluctuations not only in the gain but also in the time response. We have recently developed a theory characterizing the autocorrelation function of APDs that incorporates the dead-space effect, an effect that is very significant in thin, high-performance APDs. The research extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response. However, the computation requires a large amount of memory space and is very time consuming. In this research, we describe our experiences in parallelizing the code in MPI and OpenMP using CAPTools. Several array partitioning schemes and scheduling policies are implemented and tested. Our results show that the code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors.

Relevance:

60.00%

Publisher:

Abstract:

Johnson's SB distribution is a four-parameter distribution that is transformed into a normal distribution by a logit transformation. By replacing the normal distribution of Johnson's SB with the logistic distribution, we obtain a new distributional model that approximates SB. It is analytically tractable, and we name it the "logit-logistic" (LL) distribution. A generalized four-parameter Weibull model and the Burr XII model are also introduced for comparison purposes. Using the distribution "shape plane" (with skewness and kurtosis as its axes), we compare the "coverage" properties of the LL, the generalized Weibull, and the Burr XII with those of Johnson's SB, the beta, and the three-parameter Weibull, the main distributions used in forest modelling. The LL is found to have the largest range of shapes. An empirical case study of the distributional models is conducted on 107 sample plots of Chinese fir. The LL performs best among the four-parameter models.
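
A small sketch of the construction, assuming the usual four-parameter Johnson SB transform z = gamma + delta * ln((x - xi) / (xi + lambda - x)); taking z to be standard logistic rather than standard normal and inverting the transform yields logit-logistic variates by the inverse-transform method. Parameter values in the example are arbitrary.

    # Sketch of the logit-logistic construction, assuming the usual Johnson SB
    # transform z = gamma + delta * ln((x - xi) / (xi + lam - x)) with z taken
    # as standard logistic instead of standard normal.
    import math, random

    def rlogitlogistic(xi, lam, gamma, delta, rng=random):
        u = rng.random()
        z = math.log(u / (1.0 - u))            # standard logistic variate
        w = (z - gamma) / delta                # invert the SB-type transform
        return xi + lam / (1.0 + math.exp(-w)) # value in (xi, xi + lam)

    # Arbitrary parameters, e.g. values on a bounded (0, 30) range.
    sample = [rlogitlogistic(0.0, 30.0, 0.2, 1.5) for _ in range(5)]
    print([round(v, 2) for v in sample])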

Relevance:

60.00%

Publisher:

Abstract:

A comprehensive solution of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, nonlinear solid mechanics and, possibly, electromagnetics, together with their interactions, in what is now known as multiphysics simulation. Such simulations are computationally intensive, and the implementation of solution strategies for multiphysics calculations must embed their effective parallelization. For some years, together with our collaborators, we have been involved in the development of numerical software tools for multiphysics modeling on parallel cluster systems. This research has involved a combination of algorithmic procedures, parallel strategies and tools, plus the design of a computational modeling software environment and its deployment in a range of real-world applications. One output from this research is the three-dimensional parallel multiphysics code PHYSICA. In this paper we report on an assessment of its parallel scalability on a range of increasingly complex models drawn from actual industrial problems, on three contemporary parallel cluster systems.
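
Scalability assessments of this kind typically report speedup S(p) = T(1)/T(p) and parallel efficiency E(p) = S(p)/p; the short sketch below computes both from wall-clock timings. The timings shown are placeholders, not PHYSICA measurements.

    # Parallel speedup and efficiency from wall-clock timings.  The timings here
    # are arbitrary placeholders, not measurements of PHYSICA.
    timings = {1: 1000.0, 8: 140.0, 32: 42.0, 64: 26.0}   # processors -> seconds

    t1 = timings[1]
    for p, tp in sorted(timings.items()):
        speedup = t1 / tp
        efficiency = speedup / p
        print(f"p={p:3d}  speedup={speedup:6.1f}  efficiency={efficiency:5.2f}")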

Relevance:

60.00%

Publisher:

Abstract:

The pseudo-spectral solution method offers a flexible and fast alternative to the more usual finite element/volume/difference methods, particularly when the long-time transient behaviour of a system is of interest. Since the exact solution is obtained at the grid collocation points, superior accuracy can be achieved on modest grid resolutions. Furthermore, the grid can be freely adapted in time and in space to particular flow conditions or geometric variations. This is especially advantageous where strongly coupled, time-dependent, multi-physics solutions are investigated. Examples include metallurgical applications involving the interaction of electromagnetic fields and conducting liquids with a free surface. The electromagnetic field then determines the instantaneous liquid volume shape, and the liquid shape in turn affects the electromagnetic field. In AC applications a thin "skin effect" region results on the free surface that dominates grid requirements. Infinitesimally thin boundary cells can be introduced using Chebyshev polynomial expansions without detriment to the numerical accuracy. This paper presents a general methodology of the pseudo-spectral approach and outlines the solution procedures used. Several instructive example applications are given: the aluminium electrolysis MHD problem, induction melting and stirring, and the dynamics of magnetically levitated droplets in AC and DC fields. Comparisons to available analytical solutions and to experimental measurements will be discussed.
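
As a generic illustration of the collocation machinery underlying such solvers (not the solver described above), the sketch below builds the standard Chebyshev differentiation matrix on Gauss-Lobatto points, following Trefethen's well-known formulation, and checks it against the exact derivative of a smooth test function.

    # Standard Chebyshev collocation differentiation matrix on Gauss-Lobatto
    # points (generic pseudo-spectral building block, not the solver above).
    import numpy as np

    def cheb(N):
        if N == 0:
            return np.zeros((1, 1)), np.array([1.0])
        x = np.cos(np.pi * np.arange(N + 1) / N)          # collocation points
        c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
        X = np.tile(x, (N + 1, 1)).T
        dX = X - X.T
        D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
        D -= np.diag(D.sum(axis=1))                       # diagonal via row sums
        return D, x

    D, x = cheb(16)
    u = np.exp(x) * np.sin(5 * x)
    du_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
    print(np.max(np.abs(D @ u - du_exact)))   # small error on few points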