228 results for Parallel processing (Electronic computers)


Relevance:

100.00%

Publisher:

Abstract:

This paper presents the results of a packaging process based on the stencil printing of isotropic conductive adhesives (ICAs) that form the interconnections of flip-chip bonded electronic packages. Ultra-fine-pitch (sub-100-μm), low-temperature (100°C), low-cost flip-chip assembly is demonstrated. The article details recent advances in electroformed stencil manufacturing that use microengineering techniques to enable stencil fabrication at aperture sizes down to 20 μm and pitches as small as 30 μm. The current state of the art for stencil printing of ICAs and solder paste is limited to pitches between 150 μm and 200 μm. The ICA-based interconnects considered in this article have been stencil printed successfully down to 50-μm pitch, with consistent printing demonstrated at a 90-μm pitch size. The structural integrity of the stencil after framing and printing is also investigated through experimentation and computational modeling. The assembly of a flip-chip package based on copper-column-bumped die and ICA deposits stencil printed at sub-100-μm pitch is described. Computational fluid dynamics modeling of the print performance provides an indicator of the optimum print parameters. Finally, an organic light-emitting diode display chip is packaged using this assembly process.

Abstract:

The passenger response time distributions adopted by the International Maritime Organisation (IMO) in their assessment of the assembly time for passenger ships involve two key assumptions. The first is that the response time distribution assumes the form of a uniform random distribution, and the second concerns the actual response times. These two assumptions are core to the validity of the IMO analysis but are not based on real data, being the recommendations of an IMO committee. In this paper, response time data collected from assembly trials conducted at sea on a real passenger vessel using actual passengers are presented and discussed. Unlike the IMO-specified response time distributions, the data collected from these trials display a log-normal distribution, similar to that found in land-based environments. Based on these data, response time distributions for use in the IMO assembly analysis for the day and night scenarios are suggested.
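
The contrast between the assumed uniform distribution and the observed log-normal one can be illustrated with a small simulation (a hypothetical sketch; the parameter values are purely illustrative, not those from the trials):

```python
import random
import statistics

random.seed(42)

# IMO-style assumption: response times drawn uniformly over a fixed window.
uniform_times = [random.uniform(0.0, 300.0) for _ in range(10_000)]

# Behaviour observed in trials: log-normally distributed response times.
# mu and sigma act on log-seconds; values here are purely illustrative.
lognormal_times = [random.lognormvariate(mu=4.0, sigma=0.7) for _ in range(10_000)]

# A log-normal sample is right-skewed: its mean exceeds its median,
# whereas a uniform sample has mean ~= median.
def skew_gap(sample):
    return statistics.mean(sample) - statistics.median(sample)

print(f"uniform mean-median gap:    {skew_gap(uniform_times):8.2f} s")
print(f"log-normal mean-median gap: {skew_gap(lognormal_times):8.2f} s")
```

The long right tail of the log-normal is what matters in practice: a uniform assumption under-represents the minority of very slow responders who dominate the total assembly time.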

Abstract:

This paper presents a numerical study of the Reynolds number and scaling effects in microchannel flows. The configuration includes a rectangular, high-aspect-ratio microchannel with heat sinks, similar to an experimental setup. Water at ambient temperature is used as the coolant fluid, and the heating is introduced via electronic cartridges in the solids. Two channel heights, measuring 0.3 mm and 1 mm, are considered at first. The Reynolds number, based on the hydraulic diameter, varies in the range 500-2200. Simulations focus on the effects of the Reynolds number and channel height on the Nusselt number. It is found that the Reynolds number has a noticeable influence on the local Nusselt number distributions, in agreement with other studies. The numerical predictions of the dimensionless temperature of the fluid agree fairly well with experimental measurements; however, the dimensionless temperature of the solid exhibits a significant discrepancy near the channel exit, similar to that reported by other researchers. The present study demonstrates that there is a significant scaling effect at small channel heights, typically 0.3 mm, in agreement with experimental observations. This scaling effect has been confirmed by three additional simulations carried out at channel heights of 0.24 mm, 0.14 mm and 0.1 mm, respectively. A correlation between the channel height and the normalized Nusselt number is thus proposed, which agrees well with the results presented.
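
The two quantities the abstract is built around can be computed directly; the sketch below evaluates the hydraulic diameter and Reynolds number for a rectangular channel (the channel width, flow velocity and water properties are illustrative assumptions, not values from the study):

```python
# Hydraulic diameter and Reynolds number for a rectangular microchannel.
# Water properties at ambient temperature are assumed (illustrative values).

RHO = 998.0      # density of water, kg/m^3
MU = 1.0e-3      # dynamic viscosity of water, Pa*s

def hydraulic_diameter(height_m, width_m):
    """D_h = 4A / P for a rectangular cross-section."""
    area = height_m * width_m
    perimeter = 2.0 * (height_m + width_m)
    return 4.0 * area / perimeter

def reynolds_number(velocity_ms, d_h):
    """Re = rho * U * D_h / mu."""
    return RHO * velocity_ms * d_h / MU

# 0.3 mm high, high-aspect-ratio channel (width assumed 10 mm here).
d_h = hydraulic_diameter(0.3e-3, 10e-3)
print(f"D_h = {d_h*1e3:.3f} mm, Re at 1 m/s = {reynolds_number(1.0, d_h):.0f}")
```

For a high-aspect-ratio channel the hydraulic diameter is close to twice the channel height, which is why the channel height drives both Re and the scaling effect on Nu.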

Abstract:

A natural approach to representing and reasoning about temporal propositions (i.e., statements with time-dependent truth-values) is to associate them with time elements. In the literature, there are three choices regarding the primitive for the ontology of time: (1) instantaneous points, (2) durative intervals and (3) both points and intervals. Problems may arise when one conflates different views of temporal structure and questions whether certain types of temporal propositions can be validly and meaningfully associated with different time elements. In this paper, we shall summarize an ontological glossary with respect to time elements, and diversify a wider range of meta-predicates for ascribing temporal propositions to time elements. Based on these, we shall also devise a versatile categorization of temporal propositions, which can subsume the representative categories proposed in the literature, including those of Vendler, McDermott, Allen, Shoham, Galton, and Terenziani and Torasso. It is demonstrated that the new categorization of propositions, together with the proposed range of meta-predicates, provides the expressive power for modeling typical temporal terms/phenomena such as starting-instant, stopping-instant, dividing-instant, instigation, termination and intermingling.
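
The point/interval ontology and the idea of a meta-predicate ascribing a proposition to a time element can be sketched concretely. The names `holds_on`/`holds_in` and the dense-sampling stand-in for "at every instant" are illustrative simplifications, not the paper's actual predicates:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    t: float          # an instantaneous time element

@dataclass(frozen=True)
class Interval:
    start: float      # a durative time element; requires start < end
    end: float

def _samples(iv, n=100):
    step = (iv.end - iv.start) / n
    return (iv.start + i * step for i in range(n + 1))

def holds_on(truth_fn, element):
    """Proposition holds throughout the time element (homogeneous reading)."""
    if isinstance(element, Point):
        return truth_fn(element.t)
    return all(truth_fn(t) for t in _samples(element))

def holds_in(truth_fn, element):
    """Proposition holds at some instant within the element (existential)."""
    if isinstance(element, Point):
        return truth_fn(element.t)
    return any(truth_fn(t) for t in _samples(element))

# "the light is on between t=2 and t=5"
light_on = lambda t: 2.0 <= t <= 5.0
print(holds_on(light_on, Interval(3.0, 4.0)))   # on throughout
print(holds_on(light_on, Interval(1.0, 4.0)))   # off at the start
print(holds_in(light_on, Interval(1.0, 4.0)))   # on at some instant
```

The gap between the two meta-predicates is exactly where categorization problems such as the dividing-instant issue arise: a proposition can hold *in* an interval without holding *on* it.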

Abstract:

This paper describes work towards the deployment of self-managing capabilities into an advanced middleware for automotive systems. The middleware will support a range of futuristic use-cases requiring context-awareness and dynamic system configuration. Several use-cases are described and their specific context-awareness requirements identified. The discussion is accompanied by a justification for the selection of policy-based computing as the autonomics technique to drive the self-management. The specific policy technology to be deployed is described briefly, with a focus on the features of direct relevance to the middleware project. A selected use-case is explored in depth to illustrate the extent of dynamic behaviour achievable in the proposed middleware architecture, which is composed of several policy-configured services. An early demonstration application which facilitates concept evaluation is presented, and a sequence of typical device-discovery events is worked through.
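
The essence of policy-based self-management can be sketched in a few lines. The rule format and event names below are invented for illustration, not the middleware's actual policy language: on a device-discovery event, externally supplied policies are evaluated against the current context, and a service is reconfigured without any change to application code:

```python
def evaluate_policies(policies, context):
    """Return the actions of every policy whose conditions hold in this context."""
    actions = []
    for policy in policies:
        if all(context.get(key) == want for key, want in policy["when"].items()):
            actions.append(policy["then"])
    return actions

# Hypothetical policies injected into the middleware at run time.
policies = [
    {"when": {"event": "device-discovered", "device": "gps"},
     "then": "enable-navigation-service"},
    {"when": {"event": "device-discovered", "device": "phone"},
     "then": "enable-handsfree-service"},
    {"when": {"event": "device-lost"},
     "then": "degrade-gracefully"},
]

# A typical device-discovery event, as in the worked demonstration.
context = {"event": "device-discovered", "device": "gps"}
print(evaluate_policies(policies, context))
```

Because behaviour lives in the policy set rather than the code, new use-cases can be supported by injecting new policies into the running middleware.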

Abstract:

Optimisation in wireless sensor networks is necessary due to the resource constraints of individual devices, bandwidth limits of the communication channel, the relatively high probability of sensor failure, and the requirement constraints of the deployed applications in potentially highly volatile environments. This paper presents BioANS, a protocol designed to optimise a wireless sensor network for resource efficiency as well as to meet a requirement common to a whole class of WSN applications: namely, that the sensor nodes are dynamically selected on some qualitative basis, for example the quality with which they can provide the required context information. The design of BioANS has been inspired by the communication mechanisms that have evolved in natural systems. The protocol tolerates randomness in its environment, including random message loss, and incorporates a non-deterministic 'delayed-bids' mechanism. A simulation model is used to explore the protocol's performance in a wide range of WSN configurations. Characteristics evaluated include tolerance to sensor node density and message loss, communication efficiency, and negotiation latency.
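
A delayed-bids style negotiation can be sketched as follows. The delay formula and names are illustrative, not the exact BioANS mechanism: each node that can serve a request delays its bid by a random interval biased by its quality, so better nodes tend to reply first and the requester can stop listening after the first bid, sparing the remaining nodes the cost of transmitting:

```python
import random

random.seed(7)

def bid_delay(quality, max_delay=1.0):
    """Higher-quality nodes draw shorter delays on average (non-deterministic)."""
    return random.uniform(0.0, max_delay) * (1.0 - quality)

def negotiate(node_qualities, loss_prob=0.1):
    """Return (delay, winner index) of the first bid to arrive, or None."""
    bids = []
    for idx, quality in enumerate(node_qualities):
        if random.random() < loss_prob:
            continue  # message randomly lost in transit; the protocol tolerates this
        bids.append((bid_delay(quality), idx))
    if not bids:
        return None
    return min(bids)  # earliest bid wins the negotiation

qualities = [0.2, 0.9, 0.5, 0.7]  # context-quality of four candidate nodes
delay, winner = negotiate(qualities)
print(f"node {winner} won with delay {delay:.3f}")
```

Over many negotiations the highest-quality node wins most often, yet the randomness spreads load and keeps the protocol robust to individual losses.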

Abstract:

This paper describes a protocol for dynamically configuring wireless sensor nodes into logical clusters. The concept is to be able to inject an overlay configuration into an ad-hoc network of sensor nodes or similar devices, and have the network configure itself organically. The devices are arbitrarily deployed and initially have no information whatsoever concerning physical location, topology, density or neighbourhood. The Emergent Cluster Overlay (ECO) protocol is totally self-configuring and has several novel features, including nodes self-determining their mobility based on patterns of neighbour discovery, and a target cluster size that is specified externally (by the sensor network application) rather than being directly coupled to radio communication range or node packing density. Cluster head nodes are automatically assigned as part of the cluster configuration process, at no additional cost. ECO is ideally suited to applications of wireless sensor networks in which localized groups of sensors act cooperatively to provide a service. This includes situations where service dilution is used (dynamically identifying redundant nodes to conserve their resources).
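
The externally specified target cluster size can be illustrated with a simple greedy grouping rule (a stand-in for illustration only, not the actual ECO protocol): nodes join the nearest open cluster head until the injected target size is reached, after which a new head is established:

```python
import random

random.seed(3)

TARGET_CLUSTER_SIZE = 4  # injected by the application, not tied to radio range

def form_clusters(positions, target_size):
    """Greedily group node positions into clusters of at most target_size."""
    clusters = []
    for pos in positions:
        best = None
        for cluster in clusters:
            if len(cluster["members"]) >= target_size:
                continue  # cluster is full; not a candidate
            head = cluster["head"]
            dist = ((pos[0] - head[0]) ** 2 + (pos[1] - head[1]) ** 2) ** 0.5
            if best is None or dist < best[0]:
                best = (dist, cluster)
        if best is None:
            # no open cluster: this node becomes a head, at no extra cost
            clusters.append({"head": pos, "members": [pos]})
        else:
            best[1]["members"].append(pos)
    return clusters

nodes = [(random.random(), random.random()) for _ in range(13)]
clusters = form_clusters(nodes, TARGET_CLUSTER_SIZE)
print(f"{len(clusters)} clusters, sizes: {[len(c['members']) for c in clusters]}")
```

The key property the sketch preserves is that cluster membership is bounded by the application's target size, not by how many nodes happen to be within radio range.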

Abstract:

The Logit-Logistic (LL), Johnson's SB, and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the (skewness, kurtosis) region covered, and each has been used for modeling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known, and much-used bivariate generalization of SB; and the bivariate distributions with LL, SB, and Beta as marginals, constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness-of-fit is compared using the minus log-likelihood (equivalent to Akaike's Information Criterion, the AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P, and SB-2P.
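
The model-selection criterion itself, ranking maximum-likelihood fits by minus log-likelihood, is easy to demonstrate. The univariate sketch below (synthetic data, not the plot data; normal vs. log-normal rather than the paper's bivariate models) fits both candidates by their closed-form ML estimates and compares NLL; with equal parameter counts, ranking by NLL is ranking by AIC:

```python
import math
import random
import statistics

random.seed(11)

# Synthetic right-skewed "diameter" data, truly log-normal.
data = [random.lognormvariate(3.0, 0.4) for _ in range(500)]

def nll_normal(xs):
    """Minus log-likelihood of a normal model at its ML estimates."""
    mu = statistics.mean(xs)
    sigma = statistics.pstdev(xs)  # MLE uses the population estimator
    return sum(0.5 * math.log(2 * math.pi * sigma**2)
               + (x - mu) ** 2 / (2 * sigma**2) for x in xs)

def nll_lognormal(xs):
    """Minus log-likelihood of a log-normal model at its ML estimates."""
    logs = [math.log(x) for x in xs]
    mu = statistics.mean(logs)
    sigma = statistics.pstdev(logs)
    return sum(math.log(x) + 0.5 * math.log(2 * math.pi * sigma**2)
               + (math.log(x) - mu) ** 2 / (2 * sigma**2) for x in xs)

# Both models have 2 parameters, so the NLL ranking equals the AIC ranking.
print(f"NLL normal:     {nll_normal(data):10.2f}")
print(f"NLL log-normal: {nll_lognormal(data):10.2f}  (lower is better)")
```

The model matching the data-generating process attains the lower NLL, which is exactly the comparison the article performs across SBB, LL-2P, GBD-2P and SB-2P.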

Abstract:

Semi-Lagrangian time integration is used with the finite difference method to provide accurate, stable prices for Asian options, with or without early exercise. These are combined with coordinate transformations for computational efficiency and compared with published results.
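
The core semi-Lagrangian idea can be shown on the simplest advection equation, u_t + a·u_x = 0: each grid point traces its characteristic back a distance a·dt and interpolates the old solution at the departure point. This is a generic sketch of the technique, not the paper's option-pricing scheme (whose PDE is more involved):

```python
def semi_lagrangian_step(u, dx, dt, a):
    """Advance u one time step; departure points found by back-tracing."""
    n = len(u)
    new_u = [0.0] * n
    for i in range(n):
        x_dep = i * dx - a * dt          # departure point of the characteristic
        j = int(x_dep // dx)             # cell containing the departure point
        frac = (x_dep - j * dx) / dx     # position within that cell, in [0, 1]
        j0 = max(0, min(n - 1, j))       # clamp to the grid at the boundaries
        j1 = max(0, min(n - 1, j + 1))
        new_u[i] = (1 - frac) * u[j0] + frac * u[j1]  # linear interpolation
    return new_u

# Advect a hat profile. Note the CFL number a*dt/dx = 1.5: a semi-Lagrangian
# scheme stays stable at time steps an explicit scheme could not take.
n, dx, a, dt = 50, 1.0, 1.0, 1.5
u = [max(0.0, 1.0 - abs(i - 10) / 5.0) for i in range(n)]
for _ in range(10):
    u = semi_lagrangian_step(u, dx, dt, a)
peak = max(range(n), key=lambda i: u[i])
print(f"peak moved from 10 to {peak}")
```

After 10 steps the peak has travelled a·dt·10 = 15 grid units, and because the update is a convex combination of old values, the solution remains bounded regardless of the time step.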

Abstract:

This paper reports on the findings of a study on improving interaction design for visually impaired students, focusing upon the cognitive criteria for information visualisation.

Abstract:

This paper will analyse two of the likely damage mechanisms present in a paper fibre matrix when placed under controlled stress conditions: fibre/fibre bond failure and fibre failure. The failure process associated with each damage mechanism will be presented in detail, focusing on the change in mechanical and acoustic properties of the surrounding fibre structure before and after failure. To present this complex process mathematically, geometrically simple fibre arrangements are chosen, based on certain assumptions regarding the structure and strength of paper, to model the damage mechanisms. The fibre structures are then formulated in terms of a hybrid vibro-acoustic model based on a coupled mass/spring system and the pressure wave equation. The model is presented in detail in the paper. The simulation of the simple fibre structures serves two purposes: it highlights the physical and acoustic differences of each damage mechanism before and after failure, and it shows the differences between the two damage mechanisms when compared with one another. The results of the simulations are given in the form of pressure wave contours, time-frequency graphs and Continuous Wavelet Transform (CWT) diagrams. The analysis of the results leads to criteria by which the two damage mechanisms can be identified. Using these criteria it was possible to verify the results of the simulations against experimental acoustic data. The models developed in this study are of specific practical interest in the paper-making industry, where acoustic sensors may be used to monitor continuous paper production. The same techniques may be adopted more generally to correlate acoustic signals to damage mechanisms in other fibre-based structures.
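
In a mass/spring idealisation, the acoustic signature of a bond failure is a stiffness drop and hence a frequency shift, which is the kind of spectral change a time-frequency or CWT analysis is designed to pick out. A minimal sketch with illustrative values (not the paper's model parameters): a fibre segment of mass m held by two "bond" springs vibrates at ω = √((k1 + k2)/m); when one bond fails, the stiffness and natural frequency both drop:

```python
import math

def natural_freq_hz(total_stiffness, mass):
    """Natural frequency f = sqrt(k/m) / (2*pi) of a single mass on springs."""
    return math.sqrt(total_stiffness / mass) / (2 * math.pi)

m = 1.0e-9      # fibre segment mass, kg (illustrative)
k1 = k2 = 50.0  # fibre/fibre bond stiffnesses, N/m (illustrative)

f_intact = natural_freq_hz(k1 + k2, m)   # both bonds carrying load
f_failed = natural_freq_hz(k1, m)        # one fibre/fibre bond has failed
print(f"intact: {f_intact/1e3:.1f} kHz -> after bond failure: {f_failed/1e3:.1f} kHz")
```

With equal bond stiffnesses the frequency drops by a factor of √2 on failure, a discrete, identifiable event in the acoustic record, in contrast to fibre failure, which removes the load path entirely.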

Abstract:

This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and the coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique but also provide a means to address current and emerging human-factors issues associated with the evacuation of high-rise buildings.

Abstract:

This study investigates the use of computer-modelled versus directly experimentally determined fire hazard data for assessing survivability within buildings, using evacuation models incorporating Fractionally Effective Dose (FED) models. The objective is to establish a link between effluent toxicity, measured using a variety of small- and large-scale tests, and building evacuation. For the scenarios under consideration, fire simulation is typically used to determine the time at which non-survivable conditions develop within the enclosure, for example when the smoke or toxic effluent layer falls below a critical height deemed detrimental to evacuation, or when the radiative fluxes reach a critical value leading to the onset of flashover. The evacuation calculation would then be used to determine whether people within the structure could evacuate before these critical conditions develop.
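
An FED calculation accumulates, at each time step, the fraction of an incapacitating dose received, in the usual Σ C·Δt / (C·t)crit form; the occupant is deemed incapacitated once the running total reaches 1.0. The concentration history and critical dose below are illustrative values, not data from the study:

```python
CRITICAL_CT = 15_000.0  # incapacitating (concentration x time) dose, ppm*min (illustrative)

def fed_timeline(concentrations_ppm, dt_min):
    """Return the cumulative FED after each time step."""
    fed, timeline = 0.0, []
    for c in concentrations_ppm:
        fed += c * dt_min / CRITICAL_CT  # fraction of the critical dose this step
        timeline.append(fed)
    return timeline

# Toxic effluent concentration rising as the fire grows, sampled every 0.5 min.
co_ppm = [200, 500, 1200, 2500, 4500, 7000, 9500, 12000]
timeline = fed_timeline(co_ppm, dt_min=0.5)

# First step at which the cumulative dose reaches incapacitation (FED >= 1).
incapacitation_step = next((i for i, f in enumerate(timeline) if f >= 1.0), None)
print(f"FED history: {[round(f, 3) for f in timeline]}")
print(f"incapacitation at step: {incapacitation_step}")
```

Comparing the time at which FED reaches 1.0 against the predicted evacuation time is precisely the survivability link the study sets out to establish.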