983 results for Border Industrial Complex
Abstract:
The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications, others more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together, and a method is developed to provide a task analysis technique that specifies operator information requirements and provides the first stages of a tool to aid the design of VDU displays for process control.
Abstract:
The research examines the deposition of airborne particles which contain heavy metals and investigates the methods that can be used to identify their sources. The research focuses on lead and cadmium because these two metals are of growing public and scientific concern on environmental health grounds. The research consists of three distinct parts. The first is the development and evaluation of a new deposition measurement instrument - the deposit cannister - designed specifically for large-scale surveys in urban areas. The deposit cannister is designed to be cheap, robust, and versatile, and therefore to permit comprehensive high-density urban surveys. Its siting policy reduces contamination from locally resuspended surface dust. The second part of the research involved detailed surveys of heavy metal deposition in Walsall, West Midlands, using the new high-density measurement method. The main survey, conducted over a six-week period in November - December 1982, provided 30-day samples of deposition at 250 different sites. The results have been used to examine the magnitude and spatial variability of deposition rates in the case-study area, and to evaluate the performance of the measurement method. The third part of the research was a 'source-identification' exercise. The methods used were Receptor Models - Factor Analysis and Cluster Analysis - and a predictive source-based deposition model. The results indicate that there are six main source processes contributing to deposition of metals in the Walsall area: coal combustion, vehicle emissions, ironfounding, copper refining and two general industrial/urban processes. A source-based deposition model has been calibrated using factor scores for one source factor as the dependent variable, rather than metal deposition rates, thus avoiding problems traditionally encountered in calibrating models in complex multi-source areas. Empirical evidence supports the hypothesised association of this factor with emissions of metals from the ironfoundry industry.
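As an illustration of the receptor-modelling step described above, the following is a minimal sketch of factor-analysis-based source identification, assuming synthetic deposition data; the site count, metal list and loading threshold are illustrative assumptions, not values from the thesis.

```python
# Hedged sketch: receptor-model source identification via factor analysis.
# Data are synthetic; the metals and thresholds are illustrative only.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sites, metals = 250, ["Pb", "Cd", "Fe", "Cu", "Zn", "Ni"]

# Synthetic deposition matrix (sites x metals); a real survey supplies this.
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_sites, len(metals)))

# Standardize, then extract a small number of common "source" factors.
Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=3, random_state=0).fit(Z)

# Loadings link each factor to the metals it explains; joint high loadings
# on e.g. Pb and Cd might suggest a common emission process.
for k, load in enumerate(fa.components_):
    top = [m for m, l in zip(metals, load) if abs(l) > 0.5]
    print(f"factor {k}: {top}")

# Per-site factor scores can then serve as the dependent variable of a
# source-based deposition model, as the abstract describes.
scores = fa.transform(Z)
```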
Abstract:
New techniques in manufacturing, popularly referred to as mechanization and automation, have been a preoccupation of social and economic theorists since the industrial revolution. A selection of relevant literature is reviewed, including the neoclassical economic treatment of technical change. This incorporates alterations to the mathematical production function and an associated increase in the efficiency with which the factors of production are converted into output. Other work emphasises the role of research and development and the process of diffusion, whereby new production techniques are propagated throughout industry. Some sociological writings attach importance to the type of production technology and its effect on the organisational structure and social relations within the factory. Nine detailed case studies of industrial innovation are undertaken in the rubber, automobile, vehicle components, confectionery and clothing industries. The old and new techniques are compared across a range of variables, including capital equipment, labour employed, raw materials used, space requirements and energy consumption, most of which exhibit significant change with the innovation. The rate of output, labour productivity, product quality, maintenance requirements and other aspects are also examined. The process by which the change in production method was achieved is documented, including the development of new equipment and the strategy of its introduction into the factory, where appropriate. The firm, its environment, and the attitudes of different sectors of the workforce are all seen to play a part in determining the motives for, and the consequences flowing from, the innovations. The traditional association of technical progress with its labour-saving aspect, though an accurate enough description of the cases investigated, is clearly seen to afford an inadequate perspective for the proper understanding of this complex phenomenon, which also induces change in a wide range of other social, economic and technical variables.
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
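As a rough illustration of combining model-based variability analysis with assembly measurement data, here is a minimal Monte Carlo sketch; the one-dimensional stack-up, the tolerances and the "measurement update" rule are simplified assumptions, not the paper's methodology.

```python
# Hedged sketch: hybrid verification idea in a 1-D tolerance stack-up.
# All dimensions, tolerances and measurements below are invented.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Pure model-based prediction: assembly gap from toleranced parts (mm).
part_a = rng.normal(10.0, 0.05, N)
part_b = rng.normal(25.0, 0.08, N)
housing = rng.normal(35.2, 0.10, N)
gap_model = housing - (part_a + part_b)

# Measurements of early assemblies narrow the uncertainty of one input,
# reducing simulation uncertainty as the abstract describes (hypothetical).
measured_b = np.array([25.03, 25.05, 25.02, 25.06, 25.04])
part_b_upd = rng.normal(measured_b.mean(), measured_b.std(ddof=1), N)
gap_hybrid = housing - (part_a + part_b_upd)

for name, g in [("model-only", gap_model), ("hybrid", gap_hybrid)]:
    print(f"{name}: mean={g.mean():.3f} mm, "
          f"P(gap < 0.05 mm)={np.mean(g < 0.05):.4f}")
```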
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size, to an extent where it can become unmanageable by traditional analytical optimization techniques within reasonable limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation and reconstruction, present yet another factor that contributes to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy. The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-off between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated, in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems. It is recommended that for large networks some form of decomposition technique be applied to aggregate sections which exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections assigned to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems in long-term network-level pavement M&R programming, and that this provides a rich area for future research.
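To make the combinatorial nature of the problem concrete, the following is a minimal sketch of an evolutionary search over per-segment M&R treatments under a budget constraint. It is not the Shuffled Complex Evolution algorithm itself (SCE shuffles complexes of candidate points rather than using simple elitist mutation); the costs, effectiveness values and budget are invented for illustration.

```python
# Hedged sketch: simplified evolutionary search for a pavement M&R plan.
# NOT the full SCE algorithm; all numbers are illustrative assumptions.
import random

random.seed(1)
N_SEGMENTS, BUDGET = 20, 60.0
TREATMENTS = {  # treatment: (cost, effectiveness), invented values
    "none": (0.0, 0.0),
    "preventive": (1.0, 2.0),
    "rehab": (4.0, 7.0),
    "reconstruct": (9.0, 12.0),
}

def fitness(plan):
    # Total effectiveness, with a penalty for exceeding the budget.
    cost = sum(TREATMENTS[t][0] for t in plan)
    eff = sum(TREATMENTS[t][1] for t in plan)
    return eff if cost <= BUDGET else eff - 100 * (cost - BUDGET)

def mutate(plan):
    # Change the treatment of one randomly chosen segment.
    child = plan[:]
    child[random.randrange(N_SEGMENTS)] = random.choice(list(TREATMENTS))
    return child

# Each individual keeps the identity of every segment, which is exactly
# what makes the search space explode combinatorially.
pop = [[random.choice(list(TREATMENTS)) for _ in range(N_SEGMENTS)]
       for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:15] + [mutate(p) for p in pop[:15]]

print("best cost-effectiveness score:", fitness(pop[0]))
```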
Abstract:
Objectionable odors remain at the top of air pollution complaints in urban areas such as Broward County, which is subject to increasing residential and industrial development. Odor complaints in Broward County escalated by 150 percent over the 2001 to 2004 period, although the population increased by only 6 percent. It is estimated that by 2010 the population will increase to 2.5 million. Relying solely on enforcing the local odor ordinance is evidently not sufficient to manage the escalating odor complaint trends. An alternative approach, similar to the odor management plans (OMPs) that have been successful in managing major malodor sources such as animal farms, is required. This study aims to develop and determine the feasibility of implementing a comprehensive odor management plan (COMP) for the entire Broward County. Unlike existing OMPs for single sources, where the receptors (i.e. the complainants) are located beyond the boundary of the source, the COMP addresses a complex model of multiple sources and receptors coexisting within the boundary of the entire county. Each receptor is potentially subjected to malodor emissions from multiple sources within the county. Also, the quantity and quality of the source/receptor variables are continuously changing. The results of this study show that it is feasible to develop a COMP that adopts a systematic procedure to: (1) generate maps of existing odor complaint areas and malodor sources; (2) identify potential odor sources (target sources) responsible for existing odor complaints; (3) identify possible odor control strategies for target sources; (4) determine the criteria for implementing odor control strategies; (5) develop an odor complaint response protocol; and (6) conduct odor impact analyses for new sources to prevent future odor-related issues. A Geographic Information System (GIS) is used to identify existing complaint areas. COMP software that incorporates existing United States Environmental Protection Agency (EPA) air dispersion software is developed to determine the target sources, predict the likelihood of new complaints, and conduct odor impact analysis. The odor response protocol requires pre-planning field investigations and conducting surveys to optimize the local agency's available resources while protecting citizens' welfare, as required by the Clean Air Act.
Abstract:
Rapid developments in industry have contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnosis (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect a fault and identify the few variables contributing to its occurrence, and it is therefore imprecise as a diagnostic tool. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal effect between the most contributing variables in the occurrence of the fault. To provide a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
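As a sketch of the detection step described above, the following flags abnormal samples using KPCA reconstruction error (SPE), assuming synthetic data; the kernel settings and threshold rule are illustrative assumptions, and the thesis's contribution and causality analyses are not reproduced.

```python
# Hedged sketch: KPCA-based fault detection via reconstruction error,
# one common formulation; data, kernel and threshold are illustrative.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_normal = rng.normal(0, 1, size=(500, 10))           # fault-free training data
X_test = np.vstack([rng.normal(0, 1, size=(50, 10)),  # normal operation
                    rng.normal(3, 1, size=(5, 10))])  # shifted "faulty" samples

# Fit KPCA on normal operation; fit_inverse_transform enables reconstruction.
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(X_normal)

def spe(X):
    # Squared prediction error between each sample and its reconstruction.
    X_rec = kpca.inverse_transform(kpca.transform(X))
    return np.sum((X - X_rec) ** 2, axis=1)

# Simple detection limit: 99th percentile of the training SPE (assumed rule).
limit = np.percentile(spe(X_normal), 99)
alarms = spe(X_test) > limit
print("flagged sample indices:", np.where(alarms)[0])
```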
Abstract:
Faced with a scenario of agribusiness expansion and increasing fertilizer consumption driven by exponential population growth, it is necessary to make better use of existing reserves by obtaining products of better quality and in adequate quantities to meet national demand. At the Tapira Mining Complex of Vale Fertilizantes, phosphate concentrate with a content of 35.0% P2O5 is produced from ore containing about 8.0% P2O5, and is intended to supply the Uberaba Industrial Complex and the Araxá Mining and Chemical Complex for the production of fertilizers. The industrial flotation stage, responsible for the recovery of P2O5 and hence the viability of the business, is divided into the friable, granular and ultrafine circuits; the friable and granular concentrates together form the conventional concentrate. Today only 14.7% of the mass fed to the plant becomes product, the remainder being considered process losses, and the largest mass losses occur in the flotation tailings, representing 42.3%. From 2012 to 2014, the daily overall mass recovery of the processing plant varied from 12.4 to 15.9%, while the daily metallurgical recovery of P2O5 varied from 48.7 to 82.4%. This degree of variability indicates that the plant operated under differing conditions. Given this, the present study aimed to analyze the influence of operational and process variables on the mass and P2O5 metallurgical recoveries of the friable, granular and ultrafine industrial flotation circuits. In addition, the effects of ore variables, such as grades, hardness and the percentage of ore from front 02, on the overall recoveries of the processing plant were analyzed, as were the effects of reagent dosages on the recoveries obtained in bench flotation, using the experimental design methodology. All work was performed using the historical database of Vale Fertilizantes in Tapira-MG, with all independent variables made dimensionless over the experimental range used. The statistical analysis used the response surface technique, and the values of the independent variables that maximize recoveries were found by canonical analysis. In the study of the friable industrial flotation circuit, a mass recovery of 41.3% and a P2O5 metallurgical recovery of 91.3% were obtained, good values for this circuit; the highest recoveries occur for a solids concentration in the new flotation feed between 45 and 50%, values attributed to the residence time of the pulp in the industrial flotation cells and columns. The greater the number of ore stockpiles being reclaimed, the higher the mass recovery, but in this scenario flotation becomes unstable because of the large variation in the mass of the feed. Higher mass recoveries are obtained for depressant dosages exceeding 120 g/t and a synthetic collector percentage of 11.6%. In the study of the granular industrial flotation circuit, a mass recovery of 28.3% and a P2O5 metallurgical recovery of 79.4% were obtained, also considered good values for this circuit. Higher recoveries are obtained when the percentage of ore from front 02 is above 90%, because the ore from this front contains clearer apatite. Likewise, recoveries increase when the pulp level in the rougher stage is at its highest, owing to the high circulating load of mass that this stage receives. In the analysis of the ultrafine industrial flotation circuit, a mass recovery of 23.95% was obtained, which is maximized at depressant and collector dosages of 420 and 300 g/t, respectively.
In the analysis of the influence of the ore variables, it was observed that higher recoveries are obtained for ores with a P2O5 content above 8.0%, an Fe2O3 content of around 28%, and a front 02 ore percentage of 83%. The hard ore percentage has a strong influence on recoveries, owing to the mass split in the circuit that is linked to this variable. However, the hard ore percentage that maximizes recoveries was very close to the design capacity of the processing plant, which is 20%. Finally, the bench flotation study showed that in the friable and granular circuits the highest recoveries are achieved for collector dosages exceeding 250 g/t, and that simultaneously increasing the collector dosage and the synthetic collector percentage increases flotation recovery; however, this scenario tends to produce a concentrate poorer in P2O5 content, showing that the highest recovery is not always the ideal scenario. Thus, the results indicate the values of the variables that provide higher recoveries in flotation and hence lower losses to the tailings.
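As an illustration of the response surface and canonical analysis approach used in the bench flotation study, here is a minimal sketch with two coded factors; the data, factor names and fitted coefficients are synthetic assumptions, not the study's results.

```python
# Hedged sketch: quadratic response surface of recovery vs. two coded
# factors (e.g. depressant and collector dosages), with the canonical
# stationary point. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 40)   # coded depressant dosage (assumed factor)
x2 = rng.uniform(-1, 1, 40)   # coded collector dosage (assumed factor)
y = 85 + 4*x1 + 6*x2 - 3*x1**2 - 5*x2**2 + 2*x1*x2 + rng.normal(0, 1, 40)

# Second-order model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Canonical analysis step: stationary point x* = -B^{-1} b / 2 of the
# fitted surface, where B holds the quadratic terms.
b = coef[1:3]
B = np.array([[coef[3], coef[5] / 2],
              [coef[5] / 2, coef[4]]])
x_star = np.linalg.solve(-2 * B, b)
print("coefficients:", np.round(coef, 2))
print("stationary point (coded units):", np.round(x_star, 2))
```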
Abstract:
Rapid technological advances and liberal trade regimes permit the functional reintegration of dispersed activities into new border-spanning business networks, variously referred to as global value chains (GVCs). Given that the gains of a country from GVCs depend on the activities taking place in its jurisdiction and their linkages to global markets, this study starts by providing a descriptive overview of China's economic structure and trade profile. The first two chapters of this paper demonstrate the significant role GVCs have played in China's economic growth, evident in the enhanced productivity, diversification, and sophistication of China's exports, and how these economic benefits have propelled China's emergence as the world's manufacturing hub over the past two decades. However, the benefits from GVC participation - in particular technological learning, knowledge building, and industrial upgrading - are not automatic. What strategies would help Chinese industries engage with GVCs in ways that are sustainable in the long run? What challenges and related opportunities would China face throughout the implementation process? The last two chapters of this paper focus on the implications of GVCs for China's industrial policy and development. Chapter Three examines how China is reorienting its manufacturing sector toward the production of higher value-added goods and expanding its service sector, both domestically and internationally, while Chapter Four provides illustrative policy recommendations on dealing with the positive and negative outcomes triggered by GVCs, within China and beyond the country's borders. Ultimately, this study also hopes to shed some light on the lessons and complexities that arise from GVC participation for other developing countries.
Higher Education - Border or Boundary? Can Theatre in Education Help Promote a University Education?
Abstract:
With the expansion and increased availability of Higher Education, progression to study for an undergraduate degree has been viewed as a simple stepping stone, with examination success a straightforward border pass. Changes in the funding of degree courses have established a series of more challenging boundaries to entry, which demand a rigorous assessment of the benefits of Higher Education. The Widening Participation Unit at the University of Worcester has sought to ease this border crossing for pupils whose parents have not been to university. Its experience from previous projects was that school pupils relate more easily to undergraduate students whose experience of Higher Education is recent and relevant. With this in mind, it commissioned the Drama and Performance Department to create a Theatre in Education (TIE) programme that introduced an awareness of post-sixteen options and future choices to challenge Higher Education stereotypes. As a result of this collaboration Why Bother? was created, directed by myself and devised and researched with four students who were studying drama. Their own experiences were used to inform the character development, and dealt with worrying as a mature student about integration into full-time education, loss of income after working, the pressures of emotional commitments to partners, and being away from home. The programme toured to two thousand Year 9-11 pupils in Worcestershire and Herefordshire schools in January and May 2011. Devising and touring Why Bother? provided students with an opportunity to work as a professional paid TIE team, something that is not possible as part of their undergraduate degree course. My initial research looks at the effectiveness and limitations of this project, based on pupil questionnaires and the experiences of the team, which are explored within the broader context of TIE and its potential for effecting attitudinal change. This has given rise to a number of questions that need consideration in the development of a new TIE programme aimed at raising the awareness of sixth-form students who are about to decide whether or not to apply to university. Collaboration with university students in exploring the value of an education that they have subscribed to raises issues of bias, and of whether their powers of persuasion actually prevent pupils from making their own individual decisions. The ethics of promoting a "free" university education seem much less complex than the decision required now, which involves balancing the real value against the high financial cost suggested in the working title of Is it Worth it? This paper will present my first attempts to develop research methods and methodologies that will enable me to evaluate the success of this and future TIE programmes.
Abstract:
The present work focused on a family of ceramic solid electrolytes based on zirconium oxide, also including magnesium oxide as a dopant, usually designated Mg-PSZ (magnesia partially stabilized zirconia). Depending on the composition and processing conditions (sintering profile), these materials can exhibit interesting combinations of mechanical, thermal and electrical properties that allow their use in the manufacture of oxygen sensors for molten metals. The use of sensors is essential today from the standpoint of process control and energy efficiency. To try to understand how to influence these properties, several dopant levels were explored (from 2.5 to 10 mol%, in increments of 2.5 mol% MgO), along with several cooling rates (2, 3 and 5 °C.min-1) from the same sintering plateau condition (1700 °C, 3 hours), and also some more complex sintering cycles, with intermediate plateaus inserted into the cooling stage, with the aim of altering the phase nucleation and growth processes. In fact, the phase transformations to which this type of material is subject (cubic to tetragonal to monoclinic, with decreasing temperature) have different characteristic rates (one is diffusive, the other displacive), which permits this kind of conditioning. The materials obtained were subjected to structural and microstructural characterization, complemented by a set of other physical characterization techniques such as impedance spectroscopy, dilatometry and hardness testing. The results confirm the complexity of the relations between processing and behaviour, but made it possible to identify conditions of potential practical interest for the intended applications.
Abstract:
It has been less than thirty years since a group of graduate students and computer scientists working on a federal contract performed the first successful connection between two computers located at remote sites. This group, known as the NWG (Network Working Group), comprised highly creative minds who, as soon as they began meeting, started talking about things like intellectual graphics, cooperating processes, automation questions, email, and many other interesting possibilities. In 1968 the group's task was to design the NWG's first computer network; in October 1969 the first data exchange occurred, and by the end of that year a network of four computers was in operation. Since the invention of the telephone in 1876, no other technology has revolutionized the field of communications as much as the computer network. Many people have made great contributions to the creation and development of the Internet; the computer network, much more complex than the telephone, is the result of the work of people of many nationalities and cultures. However, it should be remembered that some years later, in 1973, two computer scientists, Robert Kahn and Vinton Cerf, created a more sophisticated communication protocol called the Transmission Control Protocol / Internet Protocol (TCP/IP), which is still in force on the Internet today.
Abstract:
The study analyzed the hydro-climatic and land use sensitivities of stormwater runoff and quality in the complex coastal urban watershed of the Miami River Basin, Florida, by developing a Storm Water Management Model (EPA SWMM 5). Regression-based empirical models were also developed to explain stream water quality in relation to internal (land uses and hydrology) and external (upstream contribution, seawater) sources and drivers in six highly urbanized canal basins of Southeast Florida. Stormwater runoff and quality were most sensitive to rainfall, imperviousness, and the conversion of open lands/parks to residential, commercial and industrial areas. In-stream dissolved oxygen and total phosphorus in the watersheds were dictated by internal stressors, while external stressors were dominant for total nitrogen and specific conductance. The research findings and tools will be useful for proactive monitoring and management of storm runoff and urban stream water quality under the changing climate and environment in South Florida and around the world.
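As a sketch of the regression-based empirical modelling mentioned above, the following fits a linear model relating a water quality indicator to internal and external drivers; the predictors and the synthetic data are illustrative assumptions, not the study's variables.

```python
# Hedged sketch: empirical regression of a stream quality indicator on
# internal and external drivers. Predictor names and data are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 120
X = np.column_stack([
    rng.uniform(0, 200, n),   # rainfall (mm) - internal hydrologic driver
    rng.uniform(0, 1, n),     # imperviousness fraction - land use driver
    rng.uniform(0, 50, n),    # upstream load proxy - external driver
    rng.uniform(0, 5, n),     # seawater influence proxy - external driver
])
# Synthetic response, e.g. a total nitrogen concentration (mg/L):
y = (0.5 + 0.002 * X[:, 0] + 1.5 * X[:, 1]
     + 0.03 * X[:, 2] + 0.2 * X[:, 3] + rng.normal(0, 0.1, n))

model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 3))
print("coefficients:", np.round(model.coef_, 4))
```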
Abstract:
In this study, lubrication theory is used to model flow in geological fractures and analyse the compound effect of medium heterogeneity and complex fluid rheology. Such studies are warranted because Newtonian rheology is adopted in most numerical models for its ease of use, despite non-Newtonian fluids being ubiquitous in subsurface applications. Past studies on Newtonian and non-Newtonian flow in single rock fractures are summarized in Chapter 1. Chapter 2 presents analytical and semi-analytical conceptual models for the flow of a shear-thinning fluid in rock fractures having a simplified geometry, providing a first insight into their permeability. In Chapter 3, a lubrication-based 2-D numerical model is implemented to solve the flow of an Ellis fluid in rough fractures; the finite-volume model developed is more computationally efficient than conducting full 3-D simulations, and introduces an acceptable approximation as long as the flow is laminar and the fracture walls are relatively smooth. The compound effect of the shear-thinning nature of the fluid and fracture heterogeneity promotes flow localization, which in turn affects the performance of industrial activities and remediation techniques. In Chapter 4, a Monte Carlo framework is adopted to produce multiple realizations of synthetic fractures and to analyze their ensemble statistics pertaining to flow for a variety of real non-Newtonian fluids, with the Newtonian case used as a benchmark. In Chapters 5 and 6, a conceptual model of the hydro-mechanical aspects of backflow occurring in the last phase of hydraulic fracturing is proposed and experimentally validated, quantifying the effects of the relaxation induced by the flow.
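For reference, the Ellis constitutive law referred to above is commonly written as follows (a standard textbook form, not quoted from the thesis):

```latex
% Ellis model: apparent viscosity as a function of shear stress.
% \eta_0 is the zero-shear viscosity, \tau_{1/2} the shear stress at which
% the viscosity halves, and \alpha > 1 controls the shear-thinning behaviour.
\eta(\tau) = \frac{\eta_0}{1 + \left( \tau / \tau_{1/2} \right)^{\alpha - 1}}
```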
Abstract:
The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process can be complex to tackle and may sound discouraging to neophytes. In this context, there is an increasing need to derive simplified methodologies to streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied, to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising, in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses of precast RC buildings damaged by the 2012 earthquake is created. A statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at large scale. Furthermore, they may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.
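As a sketch of the fragility-curve representation underlying such methods, the following evaluates a lognormal fragility function, the standard form in seismic loss assessment; the median and dispersion values are invented, not PRESSAFE-disp results.

```python
# Hedged sketch: lognormal fragility curve, the standard parametric form.
# The median capacity and dispersion below are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def fragility(im, median=0.35, beta=0.45):
    """P(damage state exceeded | intensity measure im), lognormal CDF form."""
    return norm.cdf(np.log(im / median) / beta)

ims = np.array([0.1, 0.2, 0.35, 0.5, 0.8])  # e.g. PGA in g (assumed measure)
print(np.round(fragility(ims), 3))
```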