866 results for Technicians in industry
Abstract:
This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, such environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of deploying and managing these resources themselves. In this work, we focus on issues related to scheduling scientific workloads on virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality-of-service guarantees. In the process of addressing these issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, which is tested with medical image processing workloads, is compared to two baseline scheduling solutions, and we find that it outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
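As a rough illustration of how such a scheduler can combine predicted runtimes with deadline guarantees, the sketch below (not the dissertation's actual algorithm) pads a predicted execution time with the reported ~15% average prediction error before admitting a job; the names Job, predicted_runtime and ERROR_MARGIN are illustrative assumptions.

    from dataclasses import dataclass

    ERROR_MARGIN = 0.15  # safety pad matching the reported ~15% average prediction error

    @dataclass
    class Job:
        name: str
        predicted_runtime: float  # seconds, from the execution time prediction model
        deadline: float           # seconds from now

    def admit(job: Job, queue_wait: float) -> bool:
        """Admit a job only if its padded runtime plus the current queue wait
        still meets its deadline; otherwise defer or reject it."""
        padded = job.predicted_runtime * (1.0 + ERROR_MARGIN)
        return queue_wait + padded <= job.deadline

    # Example: a 600 s image-processing job with a 900 s deadline and 120 s of queue wait.
    print(admit(Job("mri-segmentation", 600.0, 900.0), queue_wait=120.0))  # True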
Abstract:
The hotel industry has been experiencing a severe labor shortage in recent years. The need for organizations to retain current employees has increased as a direct result of this shortage. An area that has received less attention in the industry literature is the determinants and predictors of the turnover process. The authors discuss the role of specific intentions, reasoned action, and job satisfaction, and the implications of these factors for hotel managers.
Abstract:
Wireless communication has emerged as a trend in industrial environments. This interest is due in part to the ease of deployment and maintenance, which dispenses with sophisticated designs and wired infrastructure (often prohibitively expensive in industrial settings), besides enabling the addition of new applications when compared to wired counterparts. Despite its high degree of applicability, an industrial wireless sensor network faces some challenges; among the most difficult are reliability, energy consumption and environmental interference. This dissertation discusses the problem of asset analysis in industrial wireless networks based on the WirelessHART standard by implementing a monitoring system. The system allows various manufacturer-independent asset management activities to be carried out, such as prediction of battery life, maintenance, reliability data, topology, and the creation of new metrics from open and standardized development libraries. Through the implementation of this tool, we intend to contribute to the integration of wireless technologies in industrial environments.
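As a hypothetical illustration of one asset-management function mentioned above, the sketch below estimates the remaining battery life of a field device from its nominal capacity, consumed charge and average current draw; the figures and parameter names are assumptions, not values taken from the WirelessHART specification.

    def remaining_battery_life_days(capacity_mAh, consumed_mAh, avg_current_mA):
        """Rough remaining-life estimate: remaining charge divided by average draw."""
        remaining_mAh = max(capacity_mAh - consumed_mAh, 0.0)
        if avg_current_mA <= 0:
            return float("inf")
        return remaining_mAh / avg_current_mA / 24.0  # hours -> days

    # Hypothetical device: 19000 mAh pack, 4000 mAh already consumed, 0.25 mA average draw.
    print(round(remaining_battery_life_days(19000, 4000, 0.25)))  # ~2500 days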
Abstract:
Research on organoclays has attracted great interest due to their wide application in industry and in the removal of environmental pollutants. The organoclays were obtained using bentonite (BEN) and the cationic surfactants hexadecyltrimethylammonium bromide (HDTMA-Br) and trimethyloctadecylammonium bromide (TMOA-Br) at ratios of 50 and 100% of the clay's ion exchange capacity. The materials were characterized by X-ray diffraction (XRD), infrared spectroscopy (IR), X-ray fluorescence (XRF), thermal analysis (TA) and scanning electron microscopy (SEM). The bentonite and organobentonites were used for the adsorption of the dyes Remazol Blue RR (AZ) and Remazol Red RR (VM) from aqueous solution. The Langmuir and Freundlich adsorption models were used for the mathematical description of the sorption equilibrium data and to obtain the isotherm constants. The Freundlich model fitted the equilibrium data for bentonite, whereas the adsorption data for the organoclays fitted the Langmuir model. The adsorption of the two dyes on the HDTMA-Br-intercalated adsorbents was endothermic and exothermic in nature, respectively.
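For reference, the two isotherm models mentioned above are commonly written as follows (standard forms, not specific to this work), where q_e is the amount adsorbed at equilibrium, C_e the equilibrium dye concentration, q_max and K_L the Langmuir capacity and constant, and K_F and n the Freundlich constants:

    q_e = \frac{q_{max} K_L C_e}{1 + K_L C_e}    (Langmuir)
    q_e = K_F C_e^{1/n}                          (Freundlich)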
Abstract:
The great amount of data generated by automation and process supervision in industry leads to two problems: a large demand for disk storage and the difficulty of streaming the data over a telecommunications link. Lossy data compression algorithms emerged in the 1990s to address these problems and, as a consequence, industry began using them in supervision systems to compress data in real time. These algorithms were designed to eliminate redundant and unwanted information in an efficient and simple way. However, their parameters must be set for each process variable, which makes configuration impractical in systems that monitor thousands of variables. In this context, this work proposes the Adaptive Swinging Door Trending algorithm, an adaptation of Swinging Door Trending in which the main parameters are adjusted dynamically by analyzing signal trends in real time. A comparative performance analysis of lossy compression algorithms applied to process-variable time series and dynamometer cards is also presented; the algorithms used for comparison were piecewise linear approximations and transform-based methods.
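To make the base algorithm concrete, the following is a minimal sketch of classic Swinging Door Trending with a fixed compression deviation dev; the adaptive variant proposed in the work would adjust dev online from the signal trend, which is not shown here.

    def swinging_door(points, dev):
        """Minimal Swinging Door Trending sketch.
        points: list of (t, y) samples in increasing time order.
        dev: compression deviation (absolute tolerance).
        Returns the subset of points that would be archived."""
        if not points:
            return []
        archived = [points[0]]
        pivot_t, pivot_y = points[0]                  # last archived point
        slope_max, slope_min = float("-inf"), float("inf")
        prev_t, prev_y = points[0]
        for t, y in points[1:]:
            # The two "doors" swing through the +/- dev band around each new point.
            slope_max = max(slope_max, (y - dev - pivot_y) / (t - pivot_t))
            slope_min = min(slope_min, (y + dev - pivot_y) / (t - pivot_t))
            if slope_max > slope_min:
                # Corridor has collapsed: archive the previous point and restart the doors.
                archived.append((prev_t, prev_y))
                pivot_t, pivot_y = prev_t, prev_y
                slope_max = (y - dev - pivot_y) / (t - pivot_t)
                slope_min = (y + dev - pivot_y) / (t - pivot_t)
            prev_t, prev_y = t, y
        if archived[-1] != (prev_t, prev_y):
            archived.append((prev_t, prev_y))         # always keep the last sample
        return archived

    # Example: compress a noisy ramp with a deviation of 0.5 engineering units.
    data = [(float(i), 0.1 * i + (0.3 if i % 7 == 0 else 0.0)) for i in range(100)]
    print(len(swinging_door(data, 0.5)), "of", len(data), "points archived")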
Abstract:
Advanced Oxidation Processes (AOP) are techniques involving the formation of the hydroxyl radical (HO•), which has a high organic matter oxidation rate. The application of these processes in industry has been increasing due to their capacity to degrade recalcitrant substances that cannot be completely removed by traditional effluent treatment processes. In the present work, phenol degradation by the photo-Fenton process, based on the addition of H2O2, Fe2+ and light radiation, was studied. An experimental design was developed to analyze the effect of phenol, H2O2 and Fe2+ concentrations on the fraction of total organic carbon (TOC) degraded. The experiments were performed in a batch photochemical parabolic reactor with a capacity of 1.5 L. Samples of the reaction medium were collected at different reaction times and analyzed in a Shimadzu TOC analyzer (TOC-VWP). The results showed a negative effect of phenol concentration and a positive effect of the other two variables on the degraded TOC fraction. A statistical analysis of the experimental design showed that the hydrogen peroxide concentration was the most influential variable on the TOC fraction degraded at 45 minutes and produced a model with R² = 0.82, which predicted the experimental data with low precision. The Visual Basic for Applications (VBA) tool was used to generate a neural network model and a photochemical database. This model presented R² = 0.96 and accurately predicted the response data used for testing. The results indicate the possible industrial application of the developed tool, mainly because of its simplicity, low cost and easy accessibility.
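The response variable analyzed, the degraded TOC fraction at reaction time t, can be written as

    X_{TOC}(t) = \frac{TOC_0 - TOC(t)}{TOC_0}

where TOC_0 is the initial total organic carbon concentration and TOC(t) the value measured at time t.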
Abstract:
Rapid development in industry has contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnosis (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect a fault and identify a few variables that contribute to its occurrence, and is therefore imprecise in diagnosis. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis was performed in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal relations between the most contributing variables in the occurrence of the fault. For a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
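As a rough sketch of the detection stage only (not the thesis's exact implementation), the code below fits scikit-learn's KernelPCA on normal-operation data and flags samples whose Hotelling-style T² statistic exceeds an empirical control limit; the contribution analysis, causality network and Bayesian-network diagnosis stages are not shown.

    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.preprocessing import StandardScaler

    def fit_kpca_monitor(X_normal, n_components=5, gamma=None):
        """Fit a KPCA fault-detection monitor on normal-operation data."""
        scaler = StandardScaler().fit(X_normal)
        kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
        scores = kpca.fit_transform(scaler.transform(X_normal))
        var = scores.var(axis=0)                      # per-component variance on normal data
        t2_train = ((scores ** 2) / var).sum(axis=1)  # Hotelling-style T^2 statistic
        limit = np.percentile(t2_train, 99)           # empirical 99% control limit
        return scaler, kpca, var, limit

    def detect_faults(scaler, kpca, var, limit, X):
        """Return a boolean mask of samples whose T^2 exceeds the control limit."""
        scores = kpca.transform(scaler.transform(X))
        t2 = ((scores ** 2) / var).sum(axis=1)
        return t2 > limit

    # Usage: monitor = fit_kpca_monitor(X_train); flags = detect_faults(*monitor, X_test)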
Abstract:
Laser micromachining is an important material processing technique used in industry and medicine to produce parts with high precision. Control of the material removal process is imperative to obtain the desired part with minimal thermal damage to the surrounding material. Longer pulsed lasers, with pulse durations of milli- and microseconds, are used primarily for laser through-cutting and welding. In this work, a two-pulse sequence using microsecond pulse durations is demonstrated to achieve consistent material removal during percussion drilling when the delay between the pulses is properly defined. The light-matter interaction moves from a regime of surface morphology changes to melt and vapour ejection. Inline coherent imaging (ICI), a broadband, spatially coherent imaging technique, is used to monitor the ablation process. The pulse parameter space is explored and the key regimes are determined. Material removal is observed when the pulse delay is on the order of the pulse duration. ICI is also used to directly observe the ablation process. Melt dynamics are characterized by monitoring surface changes during and after laser processing at several positions in and around the interaction region. Ablation is enhanced when the melt has time to flow back into the hole before the interaction with the second pulse begins. A phenomenological model is developed to understand the relationship between material removal and pulse delay. Based on melt refilling the interaction region, described by logistic growth, and heat loss, described by exponential decay, the model is fit to several datasets. The fit parameters reflect the pulse energies and durations used in the ablation experiments. For pulse durations of 50 µs with pulse energies of 7.32 ± 0.09 mJ, the logistic growth component of the model reaches half maximum after 8.3 ± 1.1 µs and the exponential decays with a time constant of 64 ± 15 µs. The phenomenological model offers an interpretation of the material removal process.
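One plausible functional form consistent with the description above, combining a logistic melt-refill term with an exponential heat-loss term, is the assumption below; the exact parameterization used in the work may differ:

    A(\Delta t) = \frac{A_{max}}{1 + e^{-(\Delta t - t_{1/2})/k}} \, e^{-\Delta t / \tau}

where \Delta t is the inter-pulse delay and, for the quoted 50 µs, 7.32 mJ dataset, t_{1/2} ≈ 8.3 µs and \tau ≈ 64 µs.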
Abstract:
This research addresses how young university students in Spain and Brazil consume tablet publications. Through the study of six cases – the Spanish magazines Don, VisàVis and Quality Sport, and the Brazilian evening editions O Globo a Mais, from Rio de Janeiro; Estadão Noite, from São Paulo; and Diário do Nordeste Plus, from Fortaleza – a qualitative methodology, the usability test, is applied to detect which aspects slow down and hinder navigation for the new generations of mobile media users. Despite the influence of print magazines on the design of tablet publications, the data show that users need to "train" themselves to learn interaction options that are sometimes unintuitive or for which they lack the necessary visual maturity. For this reason, the simplest publications obtain the best usability results.
Abstract:
The mevalonate pathway is of great clinical, pharmaceutical and biotechnological relevance. However, the lack of understanding of the phosphorylation mechanism of the kinases in this pathway has limited their rational engineering in industry. Here, the phosphorylation reaction mechanism of a representative kinase of the mevalonate pathway, phosphomevalonate kinase, was studied using molecular dynamics and hybrid QM/MM methods. We find that a conserved residue (Ser106) is reoriented to anchor ATP via a stable H-bond interaction. In addition, Ser213, located on the α-helix at the catalytic site, is repositioned to further approach the substrate, facilitating the proton transfer during phosphorylation. Furthermore, we elucidate that Lys101 functions to neutralize the negative charge developed at the β-, γ-bridging oxygen atom of ATP during phosphoryl transfer. We demonstrate that the dissociative catalytic reaction occurs via a direct phosphorylation pathway. This is the first study of the phosphorylation mechanism of a mevalonate pathway kinase. The elucidation of the catalytic mechanism not only sheds light on the common catalytic mechanism of the GHMP kinase superfamily, but also provides the structural basis for engineering mevalonate pathway kinases to further exploit their applications in the production of a wide range of fine chemicals such as biofuels or pharmaceuticals.
Abstract:
The paper addresses the technological change currently taking place in industry. First, a review of the global trends that impact industrial development is given, followed by a summary of expanding intelligent technologies and their systems. The report then describes in detail the concept of Industry 4.0 and its major technology-related aspects. At the end of the paper, social consequences are summarized, especially generational concerns connected to the current change in industrial technology. The purpose of the study is to raise some particular aspects of and considerations on the given subject.
Abstract:
The semiconductor industry's urge towards faster, smaller and cheaper integrated circuits has led the industry to smaller node devices. The integrated circuits now in volume production belong to the 22 nm and 14 nm technology nodes. In 2007 the 45 nm technology came with the revolutionary high-k/metal gate structure. The 22 nm technology utilizes a fully depleted tri-gate transistor structure, and the 14 nm technology is a continuation of it; Intel is using second-generation tri-gate technology in 14 nm devices. After 14 nm, the semiconductor industry is expected to continue scaling with 10 nm devices followed by 7 nm. Recently, IBM announced the successful production of 7 nm node test chips. This is how the nanoelectronics industry is proceeding with its scaling trend. The present technology nodes require selective deposition and selective removal of materials; atomic layer deposition and atomic layer etching are the respective techniques used for these steps. Atomic layer deposition remains a forward-looking manufacturing approach that deposits materials and films in exact places. In addition to the nano/microelectronics industry, ALD is also widening its application areas and acceptance. The use of ALD equipment in industry exhibits a diversification trend, with large-area, batch-processing, particle ALD and plasma-enhanced ALD equipment becoming prominent in industrial applications. In this work, the development of an atomic layer deposition tool with microwave plasma capability is described, which is affordable even for lightly funded research labs.
Abstract:
The aim of this work was the recovery of carbohydrates and amino acids from biomass-derived solutions as different fractions by ultra- and nanofiltration with various membranes. The work was carried out as a study for Senson Oy between late autumn 2015 and spring 2016. The theoretical part examined nanofiltration and its various applications in industry, as well as, briefly, other pressure-driven membrane filtration processes. It also focused in particular on the membranes used in nanofiltration and their fouling mechanisms. The experimental part focused on the recovery of carbohydrates and amino acids from three biomass-derived solutions. The study covered ultrafiltration, clarification of the ultrafiltration concentrate, and fractionation of the ultrafiltration permeate by nanofiltration. Particular attention was also paid to membrane fouling and cleanability, and to the usability of the membranes after cleaning. In ultrafiltration, the yield of the studied carbohydrates into the permeate was good, about 90%, for all three solutions, and no significant fouling was observed for the membranes used. In the clarification experiments on the ultrafiltration concentrates, the turbidity-causing components were removed from all solutions with an efficiency of over 94%. In nanofiltration, the monosaccharides were separated from the larger carbohydrate components either completely or almost completely (97–100%). Of the membranes used in nanofiltration, considerable fouling was observed only for membrane 2. Based on the results, nanofiltration can be said to be an effective way to separate small monosaccharides from larger carbohydrate compounds.
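For context, separation figures like those above are commonly quantified by the observed retention (rejection) of a component, a standard membrane-performance metric rather than a definition specific to this thesis:

    R = 1 - \frac{c_p}{c_f}

where c_p and c_f are the component's concentrations in the permeate and the feed, respectively.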
Abstract:
The erosion processes resulting from the flow of fluids (gas-solid or liquid-solid) are encountered in nature and in many industrial processes. The common feature of these erosion processes is the interaction of the fluid (particle) with its boundary, resulting in the loss of material from the surface. This type of erosion is detrimental to the equipment used in pneumatic conveying systems. The puncture of pneumatic conveyor bends in industry causes several problems: (1) escape of the conveyed product, causing health and dust hazards; (2) repairing and cleaning up after punctures necessitates shutting down conveyors, which affects the operation of the plant and thus reduces profitability. The most common process failure in pneumatic conveying systems occurs when pipe sections at the bends wear away and puncture. The reason is that particles of varying speed, shape, size and material properties strike the bend wall with greater intensity than in straight sections of the pipe. Currently available models for predicting the lifetime of bends are inaccurate (over-predicting by about 80%). An accurate predictive method would lead to improvements in planned maintenance programmes, reducing unplanned shutdowns and ultimately the downtime costs associated with them. This is the main motivation behind the current research. The paper reports on two aspects of the first phase of the study undertaken for the current project: (1) development and implementation, and (2) testing of the modelling environment. The model framework encompasses Computational Fluid Dynamics (CFD) engineering tools, based on Eulerian (gas) and Lagrangian (particle) approaches to represent the two distinct conveyed phases, to predict the lifetime of conveyor bends. The method attempts to account for the effect of erosion on the pipe wall via particle impacts, taking into account the angle of attack, impact velocity, shape/size and material properties of the wall and conveyed material, within a CFD framework. Only a handful of researchers use CFD as the basis for predicting particle motion, see for example [1-4]. It is hoped that this will lead to more realistic predictions of the wear profile. Results for two three-dimensional test cases using the commercially available CFD code PHOENICS are presented. These are reported in relation to the impact intensity and sensitivity to the inlet particle distributions.
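Erosion models of the kind embedded in such CFD frameworks typically relate wall wear to particle impacts through a correlation of the general form below; this is a generic illustration, not the specific wear model used in the paper:

    \dot{E} = K \, \dot{m}_p \, v_p^{\,n} \, f(\alpha)

where \dot{m}_p is the particle mass flux striking the wall, v_p the impact velocity, \alpha the impact angle, f(\alpha) an angle-dependence function, and K and n material-dependent constants.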