509 results for Feeder reconfigurations


Relevance:

10.00%

Publisher:

Abstract:

The collect-and-place machine is one of the most widely used placement machines for assembling electronic components on printed circuit boards (PCBs). Nevertheless, little research has addressed the optimisation of this machine's performance, which motivates us to study the component scheduling problem for this type of machine with the objective of minimising the total assembly time. The component scheduling problem integrates the component sequencing problem, that is, the sequencing of component placements, and the feeder arrangement problem, that is, the assignment of component types to feeders. To solve the component scheduling problem efficiently, a hybrid genetic algorithm is developed in this paper. A numerical example is used to compare the performance of the algorithm under different component grouping approaches and different population sizes.
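
To make the structure of the component scheduling problem concrete, here is a minimal sketch of a hybrid genetic algorithm over a combined chromosome: a placement sequence plus a feeder-slot assignment per component type. All instance data and operator choices are illustrative assumptions, not the paper's actual model; fitness approximates assembly time by head travel distance.

```python
import math
import random

# Hypothetical instance: board coordinates of each placement, the
# component type at each placement, and candidate feeder slot positions.
PLACEMENTS = [(10, 5), (30, 12), (18, 25), (40, 8), (22, 17), (5, 20)]
COMP_TYPE = [0, 1, 0, 2, 1, 2]
FEEDERS = [(0, -10), (15, -10), (30, -10)]
N_TYPES = 3

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def assembly_cost(seq, slot_of_type):
    """Proxy for assembly time: the head travels feeder -> board for
    each placement, in sequence order."""
    cost, prev = 0.0, FEEDERS[slot_of_type[COMP_TYPE[seq[0]]]]
    for p in seq:
        slot = FEEDERS[slot_of_type[COMP_TYPE[p]]]
        cost += dist(prev, slot) + dist(slot, PLACEMENTS[p])
        prev = PLACEMENTS[p]
    return cost

def order_crossover(a, b):
    """Order crossover (OX): keep a slice of parent a, fill the rest
    in parent b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def hybrid_ga(pop_size=30, generations=200):
    pop = [(random.sample(range(len(PLACEMENTS)), len(PLACEMENTS)),
            [random.randrange(len(FEEDERS)) for _ in range(N_TYPES)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: assembly_cost(*c))
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            (s1, f1), (s2, _) = random.sample(survivors, 2)
            seq = order_crossover(s1, s2)
            slots = f1[:]
            i, j = random.sample(range(len(seq)), 2)
            seq[i], seq[j] = seq[j], seq[i]          # swap mutation
            slots[random.randrange(N_TYPES)] = random.randrange(len(FEEDERS))
            children.append((seq, slots))
        pop = survivors + children
    return min(pop, key=lambda c: assembly_cost(*c))

best_seq, best_slots = hybrid_ga()
print(best_seq, best_slots, round(assembly_cost(best_seq, best_slots), 1))
```

The point of the combined chromosome is that neither sub-problem can be solved in isolation: a good placement sequence under one feeder arrangement can be poor under another.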

Relevance:

10.00%

Publisher:

Abstract:

A chip shooter machine for electronic component assembly has a movable feeder carrier, a movable X-Y table carrying a printed circuit board (PCB), and a rotary turret with multiple assembly heads. This paper presents a hybrid genetic algorithm (HGA) that optimizes the sequence of component placements and the arrangement of component types to feeders simultaneously for a chip shooter machine, that is, the component scheduling problem. The objective of the problem is to minimize the total assembly time. The GA developed in the paper hybridizes several search heuristics, including the nearest-neighbor heuristic, the 2-opt heuristic, and an iterated swap procedure, a new improvement heuristic. Compared with the results obtained by other researchers, the performance of the HGA is superior in terms of assembly time.

Scope and purpose: When assembling surface mount components on a PCB, the optimal sequence of component placements and the best arrangement of component types to feeders must be obtained simultaneously in order to minimize the total assembly time. Since achieving optimality is very difficult, a GA hybridized with several search heuristics is developed. The machine studied is the chip shooter. The paper compares the algorithm with a simple GA and shows that its performance is superior in terms of the total assembly time.
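
Of the heuristics named above, 2-opt is the most standard, so here is a minimal sketch of it on an illustrative set of placement points. Plain Euclidean distance stands in for the machine's real timing model, which the abstract does not spell out.

```python
import math

# Illustrative placement coordinates; real instances come from the PCB
# layout.
POINTS = [(0, 0), (4, 1), (1, 5), (6, 4), (3, 3), (7, 0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour):
    return sum(dist(POINTS[tour[k]], POINTS[tour[k + 1]])
               for k in range(len(tour) - 1))

def two_opt(tour):
    """Keep reversing segments while any reversal shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand) < tour_length(tour):
                    tour, improved = cand, True
    return tour

print(two_opt(list(range(len(POINTS)))))
```

In a hybrid GA, a local-search pass like this is typically applied to offspring so the population evolves over locally optimized sequences rather than raw permutations.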

Relevance:

10.00%

Publisher:

Abstract:

A chip shooter machine in printed circuit board (PCB) assembly has three movable mechanisms: an X-Y table carrying a PCB, a feeder carrier with several feeders holding components, and a rotary turret with multiple assembly heads that pick up and place components. To minimize the placement or assembly time for a PCB on the machine, the components on the board must be placed in the best sequence, each component type must be assigned to the right feeder (or feeders, since two feeders can hold the same component type), and the assembly head must pick up each component from the right feeder. The entire problem is very complicated, and this paper presents a genetic algorithm approach to tackle it.
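
A short sketch of why the three mechanisms interact: on a turret machine they move concurrently, so each placement cycle is paced by the slowest motion. The speeds and distances below are assumptions for illustration, not values from the paper.

```python
TABLE_SPEED = 80.0     # mm/s, X-Y table speed (assumed)
CARRIER_SPEED = 120.0  # mm/s, feeder carrier speed (assumed)
INDEX_TIME = 0.15      # s per turret indexing step (assumed)

def cycle_time(board_move_mm, feeder_move_mm):
    """The three mechanisms move at the same time, so one placement
    cycle lasts as long as the slowest of the three motions."""
    return max(board_move_mm / TABLE_SPEED,
               feeder_move_mm / CARRIER_SPEED,
               INDEX_TIME)

# A good placement sequence keeps board moves short, and a good feeder
# arrangement keeps carrier moves short; either one alone can dominate.
print(cycle_time(board_move_mm=10.0, feeder_move_mm=60.0))  # 0.5 s
```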

Relevance:

10.00%

Publisher:

Abstract:

In printed circuit board (PCB) assembly, the efficiency of the component placement process depends on two interrelated issues: the sequence of component placements, that is, the component sequencing problem, and the assignment of component types to feeders of the placement machine, that is, the feeder arrangement problem. Where components of the same type are assigned to more than one feeder, the component retrieval problem must also be considered. Because of their inseparable relationship, this paper adopts a hybrid genetic algorithm to solve these three problems simultaneously for a type of PCB placement machine called the sequential pick-and-place (PAP) machine. The objective is to minimise the total distance travelled by the placement head in assembling all components on a PCB. In addition, the algorithm is compared with methods proposed by other researchers to examine its effectiveness and efficiency.
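
A minimal sketch of the component retrieval decision: when a type sits in duplicate feeders, the cheaper pickup depends on where the head currently is and where it must place next, which is why retrieval cannot be separated from sequencing. Feeder positions and part names below are hypothetical.

```python
import math

# Hypothetical feeder layout; type "R0603" is duplicated in two slots.
FEEDERS_BY_TYPE = {"R0603": [(0, -10), (45, -10)],
                   "C0402": [(15, -10)]}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def best_feeder(prev_placement, next_placement, comp_type):
    """Choose the duplicate feeder minimising prev -> feeder -> placement."""
    return min(FEEDERS_BY_TYPE[comp_type],
               key=lambda f: dist(prev_placement, f) + dist(f, next_placement))

# Starting near (40, 20) and placing at (42, 15), the right-hand
# duplicate wins even though the left slot is "first" in the carrier.
print(best_feeder((40, 20), (42, 15), "R0603"))  # (45, -10)
```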

Relevance:

10.00%

Publisher:

Abstract:

A chip shooter machine for electronic component assembly has a movable feeder carrier holding components, a movable X-Y table carrying a printed circuit board (PCB), and a rotary turret with multiple assembly heads. This paper presents a hybrid genetic algorithm to optimize the sequence of component placements for a chip shooter machine. The objective is to minimize the total traveling distance of the X-Y table, or the board. The genetic algorithm developed in the paper hybridizes the nearest-neighbor heuristic and an iterated swap procedure, a new improvement heuristic. We have compared the performance of the hybrid genetic algorithm with that of an approach proposed by other researchers and demonstrated that our algorithm is superior in terms of the distance traveled by the X-Y table or the board.
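
The two heuristics named above compose naturally: nearest-neighbor builds an initial placement sequence, and a swap-based loop improves it. The sketch below illustrates the general idea with plain Euclidean distances; it is not the paper's exact iterated swap procedure, whose details the abstract does not give.

```python
import math
import random

POINTS = [(0, 0), (4, 1), (1, 5), (6, 4), (3, 3), (7, 0), (2, 8)]

def dist(i, j):
    return math.hypot(POINTS[i][0] - POINTS[j][0],
                      POINTS[i][1] - POINTS[j][1])

def tour_length(tour):
    return sum(dist(tour[k], tour[k + 1]) for k in range(len(tour) - 1))

def nearest_neighbour(start=0):
    """Greedily visit the closest unvisited placement next."""
    tour, left = [start], set(range(len(POINTS))) - {start}
    while left:
        nxt = min(left, key=lambda j: dist(tour[-1], j))
        tour.append(nxt)
        left.remove(nxt)
    return tour

def iterated_swap(tour, iterations=2000):
    """Swap two random positions; keep the swap only if it helps."""
    best = tour_length(tour)
    for _ in range(iterations):
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
        length = tour_length(tour)
        if length < best:
            best = length
        else:
            tour[i], tour[j] = tour[j], tour[i]  # undo the swap
    return tour

tour = iterated_swap(nearest_neighbour())
print(tour, round(tour_length(tour), 2))
```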

Relevance:

10.00%

Publisher:

Abstract:

Synaptic plasticity is the dynamic regulation of the strength of synaptic communication between nerve cells. It is central to neuronal development as well as to experience-dependent remodeling of the adult nervous system, such as occurs during memory formation. Aberrant forms of synaptic plasticity also accompany a variety of neurological and psychiatric diseases, and unraveling the biological basis of synaptic plasticity has been a major goal in neurobiology research. The biochemical and structural mechanisms underlying different forms of synaptic plasticity are complex, involving multiple signaling cascades, reconfigurations of structural proteins and the trafficking of synaptic proteins. As such, proteomics should be a valuable tool in dissecting the molecular events underlying normal and disease-related forms of plasticity. In practice, however, progress in this area has been disappointingly slow. We discuss the particular challenges associated with proteomic interrogation of synaptic plasticity processes and outline ways in which we believe proteomics may advance the field over the next few years. We pay particular attention to technical advances being made in small sample proteomics and the advent of proteomic imaging in studying brain plasticity.

Relevance:

10.00%

Publisher:

Abstract:

The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal, and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, mounted directly on top, for liquids collection. To aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies and operating conditions to be determined. A modular approach was taken because of the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and the reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply; the reactor itself is capable of far higher throughput, but this would require a new feeder and drive motor. Modelling showed that the reactor can achieve a throughput of approximately 30 kg/hr, which should be pursued in future work since the reactor currently operates well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted to keep the vapour residence time in the electrostatic precipitator above one second; operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned because of equipment failure, which does not detract from the efficiency of the reactor and product collection system design. The maximum total liquid yield was 64.9% on a dry wood fed basis. The liquid yield would likely have been higher had there been sufficient development time to overcome certain operational difficulties, and had longer runs been attempted to offset the product losses incurred in collecting all available product from a large scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency above 99% on a mass basis, and this was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which enabled mass measurement of the condensable product exiting the unit and confirmed a collection efficiency in excess of 99% on a mass basis.
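
As a rough illustration of the scale-up logic, ablative throughput is proportional to ablation rate × contact area × biomass density. The sketch below uses the measured ablation rate quoted in the text; the density and contact area are assumed values for illustration only.

```python
# Back-of-the-envelope check, assuming throughput = ablation rate x
# biomass-surface contact area x dry wood density.
ABLATION_RATE = 0.63e-3   # m/s, measured for pine at 525 degC (from text)
WOOD_DENSITY = 500.0      # kg/m3, typical dry pine (assumed)
CONTACT_AREA = 4.0e-3     # m2, total biomass/hot-surface contact (assumed)

mass_rate = ABLATION_RATE * WOOD_DENSITY * CONTACT_AREA  # kg/s
print(f"throughput ~ {mass_rate * 3600:.1f} kg/hr")      # ~4.5 kg/hr
```

Under these assumptions the contact area, not the ablation rate, is the lever for scale-up, which is consistent with the reactor being feeder-limited at 2.3 kg/hr.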

Relevance:

10.00%

Publisher:

Abstract:

Rotating fluidised beds offer the potential for high intensity combustion, large turndown and an extended range of fluidising velocity due to the imposition of an artificial gravitational field. Low thermal capacity should also allow rapid response to load changes. This thesis describes investigations of the validity of these potential virtues. Experiments, at atmospheric pressure, were conducted in flow visualisation rigs and a combustor designed to accommodate a distributor 200 mm in diameter and 80 mm in axial length. Ancillary experiments were conducted in a 6-inch diameter conventional fluidised bed. The investigations encompassed assessment of: fluidisation and elutriation, coal feed requirements, start-up and steady-state combustion using premixed propane and air, transition from propane to coal combustion, and mechanical design. Assessments were made of an elutriation model and some effects of particle size on the combustion of premixed fuel gas and air. The findings were: a) More reliable start-up and control methods must be developed; combustion of premixed propane and air led to severe mechanical and operating problems, and manual control of coal combustion was inadequate. b) Design criteria must encompass pressure loss, mechanical strength and high temperature resistance, and the flow characteristics of ancillaries and the distributor must be matched. c) Fluidisation of a range of particle sizes was investigated, and new correlations for minimum fluidisation and fully supported velocities are proposed. Some effects on elutriation of particle size and of the distance between the bed surface and the exhaust port have been identified. A conic distributor did not aid initial bed distribution; furthermore, airflow instability was encountered with this distributor shape, so future use of conic distributors is not recommended. Axial solids mixing was found to be poor. A coal feeder was developed which produced uniform fuel distribution throughout the bed. The report concludes that small scale inhibits development of the mechanical design and exploration of performance; future research requires larger combustors and automatic control.
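
For context on the "artificial gravitational field", the sketch below estimates minimum fluidisation velocity with the standard Wen-Yu correlation, substituting the centripetal acceleration rω² for g. This is the textbook correlation, not the new correlations proposed in the thesis, and the gas and particle properties are assumed.

```python
import math

RHO_G, MU = 1.2, 1.8e-5      # air density (kg/m3) and viscosity (Pa.s), assumed
RHO_S, DP = 1300.0, 1.0e-3   # particle density (kg/m3) and diameter (m), assumed

def u_mf(accel):
    """Wen-Yu: Re_mf = sqrt(33.7^2 + 0.0408*Ar) - 33.7, with the
    Archimedes number evaluated at the imposed acceleration."""
    ar = RHO_G * (RHO_S - RHO_G) * accel * DP ** 3 / MU ** 2
    re_mf = math.sqrt(33.7 ** 2 + 0.0408 * ar) - 33.7
    return re_mf * MU / (RHO_G * DP)

g = 9.81
r, rpm = 0.1, 1000.0                 # 200 mm distributor -> r = 0.1 m; rpm assumed
a_centrifugal = r * (2 * math.pi * rpm / 60) ** 2
print(u_mf(g), u_mf(a_centrifugal))  # u_mf rises steeply with the imposed field
```

The widened gap between minimum fluidisation and entrainment velocities under rotation is precisely what underlies the claimed extended range of fluidising velocity.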

Relevance:

10.00%

Publisher:

Abstract:

This thesis describes the geology, geochemistry and mineralogy of a Lower Proterozoic, metamorphosed volcanogenic Cu-Zn deposit situated at the western end of the Flin Flon greenstone belt. Stratabound copper mineralisation occurs in silicified and chloritoid-bearing alteration assemblages within felsic tuffs and is mantled by thin (<3 m) high-grade sphalerite layers. Mineralisation is underlain by garnet-hornblende bearing Lower Iron Formation (LIF) and overlain by garnet-grunerite bearing Upper Iron Formation (UIF). Distinctive trace element trends, involving Ti and Zr, in mineralised and footwall felsic tuffs are interpreted to have formed by fractionation associated with a high-level magma chamber in a caldera-type environment. Discrimination diagrams for basaltic rocks are interpreted to indicate their formation in an environment similar to that of recent, primitive, tholeiitic island arcs. Microprobe studies of key mineral phases demonstrate large and small scale chemical variations in silicate phases related to primary lithological, rather than metamorphic, controls. LIF is characterised by alumino-ferro-tschermakite and relatively Mn-poor, Ca-rich garnets, whereas UIF contains manganoan grunerite and Mn-rich garnets. Metamorphic mineral reactions are considered and possible precursor assemblages identified for garnet- and chloritoid-bearing rocks. Chloritoid-bearing rocks are interpreted as the metamorphosed equivalents of iron-rich feeder zones formed near the surface, while the iron-formations are thought to represent iron-rich sediments deposited on the sea floor by the venting of the ore fluids. Consideration of various mineral assemblages leads to an estimate for peak metamorphic conditions of 450-500 °C and >4 kbar total pressure. Comparisons with other volcanogenic deposits indicate affinities with deposits of 'Mattabi-type' from the Archean of Ontario. An extrapolation of the main conclusions of the thesis to adjacent areas points to the presence of a number of geologically similar localities with potential for mineralisation.

Relevance:

10.00%

Publisher:

Abstract:

Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been tied to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes in response to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
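
A minimal sketch of the generative idea (all names hypothetical): a variability model maps environment states to target component configurations, and a reconfiguration plan is derived as the difference between the running and target component sets.

```python
# Hypothetical variability model: environment condition -> the set of
# architectural components that should be active in that condition.
CONFIGURATIONS = {
    "low_battery": {"sensor", "logger"},
    "normal":      {"sensor", "logger", "uploader", "ui"},
}

def plan_reconfiguration(current, environment):
    """Generate the plan: which components to stop and which to start
    to move from the current configuration to the target one."""
    target = CONFIGURATIONS[environment]
    return {"stop": sorted(current - target),
            "start": sorted(target - current)}

running = {"sensor", "logger", "uploader", "ui"}
print(plan_reconfiguration(running, "low_battery"))
# {'stop': ['ui', 'uploader'], 'start': []}
```

Because the plan is computed from the model rather than hand-written, every reachable reconfiguration can be checked against the variability model before execution, which is the "safer execution" argument made above.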

Relevance:

10.00%

Publisher:

Abstract:

This project evaluates the benefits of meshing existing 11 kV radial networks in order to reduce losses and maximise the connection of low-carbon distributed generation. These networks are often arranged as radial feeders with normally-open links between two of the feeders; a link is closed only to maintain continuity of supply to an isolated portion of a feeder following a fault on the network. However, the link could also be closed permanently, operating the network as a meshed topology under non-faulted conditions. The study looks at loss savings and the addition of distributed generation on a typical network under three different scenarios: traditional radial feeders, a fixed meshed network, and a dynamically meshed network. The networks are compared in terms of feeder losses, capacity, voltage regulation and fault levels.
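
The loss-saving mechanism can be shown with a single-load sketch: closing the normally-open point lets the load current divide between the two parallel paths, lowering the total I²R loss. The currents and resistances below are assumed for illustration, not taken from the study.

```python
# One load at the normally-open point, fed via two feeders from the
# same busbar. All values are assumed per-phase figures.
I_LOAD = 200.0   # A, load current at the normally-open point
R_A = 0.8        # ohm, conductor resistance of the path via feeder A
R_B = 1.2        # ohm, conductor resistance of the path via feeder B

# Radial: tie open, all current flows through feeder A.
loss_radial = I_LOAD ** 2 * R_A

# Meshed: tie closed, current divides in inverse proportion to resistance.
i_a = I_LOAD * R_B / (R_A + R_B)
i_b = I_LOAD * R_A / (R_A + R_B)
loss_meshed = i_a ** 2 * R_A + i_b ** 2 * R_B

print(f"radial: {loss_radial / 1e3:.1f} kW, meshed: {loss_meshed / 1e3:.1f} kW")
# radial: 32.0 kW, meshed: 19.2 kW
```

The same current-sharing effect raises fault levels, which is why the study weighs loss savings against fault-level and protection implications.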

Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes the physical phenomena that take place inside a 1 kg/h bubbling fluidized bed reactor located at Aston University and presents a geometrically modified version of it, in order to improve certain hydrodynamic and gas flow characteristics. In its current operation, the bed uses 40 L/min of N2 at 520 °C fed through a distributor plate and a 15 L/min purge gas stream, i.e., N2 at 20 °C, via the feeding tube. The Eulerian model of FLUENT 6.3 is used to simulate the bed hydrodynamics, while the k-ε model accounts for the effect of the turbulence field of one phase on the other. The three-dimensional simulation of the current operation of the reactor showed that a stationary bubble formed next to the feeding tube. This permanent bubble reaches up to the splash zone of the reactor, with no fluidization taking place underneath the feeder. The gas flow dynamics in the freeboard of the reactor are also analyzed. A modified version of the reactor is presented, simulated, and analyzed, together with a discussion of the impact of the flow dynamics on the fast pyrolysis of biomass. © 2010 American Chemical Society.
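
As a small illustration of the gas-flow bookkeeping involved (not part of the paper's CFD model): the purge stream enters cold and expands at bed temperature, so the two flows must be put on a common basis before estimating a superficial velocity. The bed diameter here is an assumption, as is treating the quoted purge flow as measured at 20 °C.

```python
import math

Q_DISTRIBUTOR = 40.0 / 60000.0   # m3/s, quoted at 520 degC (from text)
Q_PURGE = 15.0 / 60000.0         # m3/s, quoted at 20 degC (from text)
T_BED, T_PURGE = 520.0 + 273.15, 20.0 + 273.15

# Ideal-gas correction: the purge gas expands on heating (V ~ T at
# constant pressure).
q_purge_hot = Q_PURGE * T_BED / T_PURGE

BED_DIAMETER = 0.04              # m, assumed
area = math.pi * BED_DIAMETER ** 2 / 4
print(f"superficial velocity ~ {(Q_DISTRIBUTOR + q_purge_hot) / area:.2f} m/s")
```

Under these assumptions the heated purge stream contributes about as much gas volume as the main distributor flow, which helps explain why a localized jet near the feeding tube can distort the bed hydrodynamics.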

Relevance:

10.00%

Publisher:

Abstract:

An unprecedented series of ecological disturbances has been recurring within Florida Bay since the summer of 1987. Persistent and widespread phytoplankton and cyanobacteria blooms have coincided with the large scale decimation of sponge communities. One hypothesis is that the large scale loss of suspension-feeding sponges has rendered the Florida Bay ecosystem susceptible to these recurring blooms. The primary objective of this study was to experimentally evaluate the potential for suspension-feeding sponges to control nuisance phytoplankton blooms within Florida Bay prior to a large sponge die-off event. To achieve this objective, we determined the extent and biomass of the surviving sponge community in the different basins of Florida Bay. Many areas within Florida Bay possessed sponge densities of 1 to 3 ind. m⁻² or biomasses of 100 to 300 g m⁻². The dominant species included Spheciospongia vesparia, Chondrilla nucula, Cinachyra alloclada, Tedania ignis and Ircinia sp., which accounted for 68% of individual sponges observed and 88% of sponge biomass. Laboratory grazing rates of these dominant sponges were experimentally determined on 4 different algal food treatments: a monoculture of the cyanobacterium Synechococcus elongatus, a monoculture of the diatom Cyclotella choctawhatcheeana, a monoculture of the dinoflagellate Prorocentrum hoffmanianum, and an equal volume of the 3 monocultures combined. To estimate the impact of a mass sponge mortality event on the system-wide filtration rate of Florida Bay, we combined estimates of the current sponge biomass and laboratory sponge filtration rates with estimates of the mean volumes of the sub-basins of Florida Bay. This study implies that the current blooms occurring within the central region of Florida Bay can be explained by the loss of the dominant suspension feeder in this system, and that there is no need to invoke a new addition of nutrients within this region for the blooms to occur.
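
A minimal sketch of the system-wide estimate described above: biomass density times a per-gram pumping rate gives an areal filtration rate, and basin volume divided by total filtration gives a turnover time. Apart from the biomass figure, which falls within the range quoted above, every number below is a placeholder, not a value from the study.

```python
BIOMASS = 200.0      # g sponge per m2 (within the 100-300 range above)
PUMP_RATE = 0.002    # L filtered per g sponge per minute (assumed)
BASIN_AREA = 50.0e6  # m2, one sub-basin (assumed)
MEAN_DEPTH = 1.5     # m (assumed)

# Total filtration (m3/day) and basin volume (m3).
filtration = BIOMASS * PUMP_RATE * BASIN_AREA / 1000 * 60 * 24
volume = BASIN_AREA * MEAN_DEPTH

print(f"turnover ~ {volume / filtration:.1f} days")  # ~2.6 days here
```

If an intact sponge community turns the water column over every few days while a decimated one takes weeks, bloom-forming cells can outgrow grazing losses after a die-off, which is the mechanism the study invokes.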

Relevance:

10.00%

Publisher:

Abstract:

In “Commuter Airlines: Their Changing Role,” an essay by J. A. F. Nicholls, Transportation Coordinator, Department of Marketing and Environment, College of Business Administration, Florida International University, Nicholls initially observes: “The great majority of airline passenger miles flown in the United States are between large conurbations. People living in metropolitan areas may be quite unaware of commuter airlines and their role in our transportation system. These airlines are, however, communications lifelines for dwellers in small - and not so small - towns and rural areas. More germanely, commuter airlines have also developed a pivotal role vis-a-vis the major carriers in this country.” The author discusses the antecedents of the commuter airlines, their current role, and future prospects. Huh, conurbations? Definition: [n.] a large urban area created when neighboring towns spread into and merge with each other. In providing a brief history of commuter airlines, Nicholls states: “…there had been a sort of commuter airline as far back as 1926 when, for example, the Florida Airways Corporation provided flights between Jacksonville and Atlanta, Colonial Air Lines between New York and Boston, and Ford Air Transport from Detroit to Cleveland.” “The passage of the Civil Aeronautics Act in 1938 was pivotal in encouraging and developing a passenger orientation by the airlines…” Nicholls informs you. Nicholls explains the importance of this act: “The CAA was empowered to act ‘in the public interest and in accordance with the public convenience and necessity.’ Only the CAA itself could determine what constituted the ‘public convenience and necessity.’ Nobody, however, could provide air transportation for public purposes without a Certificate of Public Convenience and Necessity, dispensed by the CAA.” The author wants you to know that this all happened in the age of airline regulation, that is to say, pre-deregulation, i.e., before 1978. Airlines could not and did not act on their own behalf; their actions were governed by the regulating agency, the Civil Aeronautics Board [CAB], which administered the conditions set forth by the CAA. “In 1944 the CAB introduced a new category of service called feeder airlines to provide local service (short-haul, low density) for smaller communities. These carriers soon became known as air taxis since they operated as common carriers, without a regular schedule,” says Nicholls in describing the evolution of the service. In 1969 the CAB officially designated these small air carriers as commuter airlines. They were, and are, subject to passenger limits and freight/weight restrictions. Nicholls continues by defining how air carriers are labeled and categorized post-1978, in the age of deregulation.

Relevance:

10.00%

Publisher:

Abstract:

This study investigated the influence that receiving instruction in two languages, English and Spanish, had on the performance of students enrolled in the International Studies Program (delayed partial immersion model) of Miami Dade County Public Schools on a standardized test in English, the Stanford Achievement Test, eighth edition, for three of its sections: Reading Comprehension, Mathematics Computation, and Mathematics Applications. The performance of the selected IS program/Spanish section cohort of students (N = 55) on the SAT Reading Comprehension, Mathematics Computation, and Mathematics Applications over four consecutive years was contrasted with that of a control group of comparable students selected within the same feeder pattern where the IS program is implemented (N = 21). The performance of the group was also compared to the cross-sectional achievement patterns of the school's corresponding feeder pattern, region, and district. The research model for the study was a variation of the "causal-comparative" or "ex post facto" design, sometimes referred to as "prospective". After data were collected from MDCPS, t-tests were performed to compare the IS-Spanish students' SAT performance for grades 3 to 6 for the years 1994 to 1997 with control group, feeder pattern, region, and district norms for each year for Reading Comprehension, Mathematics Computation, and Mathematics Applications. Repeated measures ANOVA and Tukey's tests were calculated to compare the mean percentiles of the groups under study and the possible interactions of the different variables. All tests were performed at the 5% significance level. The analyses showed that the IS group performed significantly better than the control group on all three measures across the four years. The IS group's mean percentiles on the three measures were also significantly higher than those of the feeder pattern, region, and district. The null hypotheses were rejected, and it was concluded that receiving instruction in two languages did not negatively affect the performance of IS program students on tests taken in English. It was also concluded that the particular design of the IS program enhances the general performance of participant students on standardized tests. The quantitative analyses were coupled with interviews with teachers and administrators of the IS program to gain additional insight into different aspects of the implementation of the program at each school.
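
For readers unfamiliar with the statistical machinery named above, here is a minimal sketch of the core group comparison using synthetic percentile scores; the repeated-measures ANOVA and Tukey steps are omitted, and SciPy is assumed to be available.

```python
import numpy as np
from scipy import stats

# Synthetic percentile scores standing in for the real data; only the
# group sizes (N = 55 and N = 21) come from the abstract.
rng = np.random.default_rng(0)
is_scores = rng.normal(70, 12, size=55)       # IS cohort (synthetic)
control_scores = rng.normal(58, 12, size=21)  # control group (synthetic)

# Independent-samples t-test at the 5% significance level.
t, p = stats.ttest_ind(is_scores, control_scores)
print(f"t = {t:.2f}, p = {p:.4f}, significant at 5%: {p < 0.05}")
```

In the actual study this comparison was repeated for each of the three SAT sections and each of the four years, with the ANOVA handling the year-by-group structure that separate t-tests cannot capture.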