976 results for Industry applications


Relevance: 30.00%

Abstract:

The concept of quality of life first appeared in 1920, through the English economist Arthur Cecil Pigou, who used the term to describe the impact of government action on the lives of the most disadvantaged. With the onset of the industrial era and the end of the Second World War, society changed its paradigm and began an incessant search for ways to improve its quality of life. This concept developed alongside the concepts of education, health, housing, transport, work and leisure, as well as indicators such as increased life expectancy and reductions in infant mortality and pollution levels. Technological progress played a fundamental role in the evolution of these concepts, as did Design in the search for solutions for applying those technologies. In the specific case of the textile industry, the trend is the development of smart textiles, involving electronic engineering in their conceptualization and manufacturing processes. So-called wearable technology opens new horizons for the creation of innovative solutions, opening new market niches with high added value. Several products currently on the market have a functionality and usefulness that has kept them unchanged over the years, untouched by the current trend. That is the case of narrow fabrics, whose functionality could acquire new capabilities and be used in different textile components across a wide range of areas. These capabilities can be added by incorporating light-emitting materials (LEDs and L-Wire) into their structures. This study carried out the design of products with new functionalities, adapting the technologies developed so far into new solutions and/or new product recreations.

Relevance: 30.00%

Abstract:

A novel approach for tissue engineering applications based on the use of magnetoelectric materials is presented. This work proves that magnetoelectric Terfenol-D/poly(vinylidene fluoride-co-trifluoroethylene) composites are able to provide mechanical and electrical stimuli to MC3T3-E1 pre-osteoblast cells and that those stimuli can be remotely triggered by an applied magnetic field. Cell proliferation is enhanced by up to 25% when cells are cultured under mechanical (up to 110 ppm) and electrical (up to 0.115 mV) stimulation, showing that magnetoelectric cell stimulation is a novel and suitable approach for tissue engineering, combining magnetic, mechanical and electrical stimuli.

Relevance: 30.00%

Abstract:

Bacteriophages (phages), natural enemies of bacteria, can encode enzymes able to degrade polymeric substances. These substances can be found on the bacterial cell surface, such as polysaccharides, or are produced by bacteria when they live in biofilm communities, the most common bacterial lifestyle. Consequently, phages with depolymerase activity have facilitated access to the host receptors, by degrading the capsular polysaccharides, and are believed to perform better against bacterial biofilms, since the degradation of extracellular polymeric substances by depolymerases might facilitate the access of phages to the cells within different biofilm layers. Since the diversity of phage depolymerases is not yet fully explored, this is the first review gathering information about all the depolymerases encoded by fully sequenced phages. Overall, in this study, 160 putative depolymerases, including sialidases, levanases, xylosidases, dextranases, hyaluronidases and peptidases, as well as pectate/pectin lyases, were found in 143 phages (43 Myoviridae, 47 Siphoviridae, 37 Podoviridae, and 16 unclassified) infecting 24 genera of bacteria. We further provide information about the main applications of phage depolymerases, which span areas as diverse as medicine, chemistry, and the food-processing industry.

Relevance: 30.00%

Abstract:

The impact of agro-industrial organic wastes on the environment can be reduced when they are used in agriculture. From the standpoint of soil fertility, residue applications can increase the organic matter content and provide nutrients for plants. This study evaluated the effect of biological sludge from the gelatin industry on the chemical properties of two Ultisols (loamy sand and sandy clay) and an Oxisol (clay). The experiment lasted 120 days and was carried out in the laboratory in a completely randomized design with factorial arrangement, combining the three soils and six biological sludge rates (0, 100, 200, 300, 400, and 500 m³ ha⁻¹), with three replications. Biological sludge rates of up to 500 m³ ha⁻¹ decreased soil acidity and increased the effective cation exchange capacity (CEC) and N, Ca, Mg, and P availability, without exceeding the tolerance limit for Na. The increase in exchangeable base content, greater than the effective CEC, indicates that the major part of the cations added by the sludge remains in solution and can be lost by leaching.
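The completely randomized factorial layout described above (three soils × six sludge rates × three replications) can be enumerated in a few lines. The sketch below uses illustrative labels only and simply reproduces the 54 experimental units implied by the design.

```python
from itertools import product

# Factorial treatment structure from the abstract: 3 soils x 6
# biological-sludge rates x 3 replications = 54 experimental units.
# Labels are illustrative, not the study's own identifiers.
soils = ["Ultisol (loamy sand)", "Ultisol (sandy clay)", "Oxisol (clay)"]
rates_m3_per_ha = [0, 100, 200, 300, 400, 500]
replications = 3

units = [
    {"soil": s, "rate": r, "rep": k}
    for s, r in product(soils, rates_m3_per_ha)
    for k in range(1, replications + 1)
]
print(len(units))  # 3 * 6 * 3 = 54 units
```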

Relevance: 30.00%

Abstract:

Abstract: The occupational health risk involved in handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and of the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entry pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protection means in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies that were believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed based on the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to study the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy when measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), a portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 clients of SUVA (the Swiss Accident Insurance Fund).
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval: 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods therefore focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS.
- The agreement between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better, but depending on the concentration, size or type of the powders measured, the differences could still reach an order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes handling the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them.
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
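As a plausibility check, the extrapolated figures quoted in the abstract are mutually consistent. The short sketch below derives the implied sector totals from the reported counts and shares; the totals themselves are not stated in the text and are computed here for illustration only.

```python
# Reported survey extrapolation: 1'309 potentially exposed workers
# (0.08% of the production-sector workforce) in 586 companies
# (0.6% of production-sector companies).
exposed_workers = 1309
worker_share = 0.0008      # 0.08%
exposed_companies = 586
company_share = 0.006      # 0.6%

# Implied totals (derived, not reported in the abstract).
workers_total = exposed_workers / worker_share
companies_total = exposed_companies / company_share
print(round(workers_total))    # implied workforce of ~1.64 million
print(round(companies_total))  # implied ~98'000 production companies
```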

Relevance: 30.00%

Abstract:

Surfactants are used as additives in topical pharmaceuticals and drug delivery systems. The biocompatibility of amino acid-based surfactants makes them highly suitable for use in these fields, but tests are needed to evaluate their potential toxicity. Here we addressed the sensitivity of tumor (HeLa, MCF-7) and non-tumor (3T3, 3T6, HaCaT, NCTC 2544) cell lines to the toxic effects of lysine-based surfactants by means of two in vitro endpoints (MTT and NRU). This comparative assay may serve as a reliable approach for predictive toxicity screening of chemicals prior to pharmaceutical applications. After 24 h of cell exposure to surfactants, differing toxic responses were observed. The NCTC 2544 and 3T6 cell lines were the most sensitive, while both tumor cell lines and 3T3 fibroblasts were more resistant to the cytotoxic effects of the surfactants. IC50 values revealed that cytotoxicity was detected earlier by the MTT assay than by the NRU assay, regardless of the compound or cell line. The overall results showed that surfactants with organic counterions were less cytotoxic than those with inorganic counterions. Our findings highlight the relevance of the correct choice and combination of cell lines and bioassays in toxicity studies for a safe and reliable screen of chemicals with potential interest in the pharmaceutical industry.
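IC50 values of the kind compared above are reduced from dose-response measurements. The sketch below estimates an IC50 by log-linear interpolation between the two doses that bracket 50% viability, using invented example data; real analyses typically fit a full Hill-type dose-response model instead.

```python
import math

def ic50_from_dose_response(doses, viability):
    """Estimate IC50 by log-linear interpolation between the two
    measured doses bracketing 50% viability. A minimal sketch of the
    endpoint reduction used with MTT/NRU-style data; assumes doses
    are sorted ascending and viability is monotonically decreasing."""
    points = list(zip(doses, viability))
    for (d0, v0), (d1, v1) in zip(points, points[1:]):
        if v0 >= 50 >= v1:
            # Interpolate on a log10(dose) scale.
            frac = (v0 - 50) / (v0 - v1)
            return 10 ** (math.log10(d0) + frac * (math.log10(d1) - math.log10(d0)))
    raise ValueError("50% viability not bracketed by the tested doses")

# Hypothetical surfactant dose-response (dose in ug/mL, % viability).
doses = [1, 10, 100, 1000]
viability = [98, 85, 40, 5]
print(round(ic50_from_dose_response(doses, viability), 1))
```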

Relevance: 30.00%

Abstract:

A large number of applications using manufactured nanoparticles of less than 100 nm are currently being introduced into industrial processes. There is an urgent need to evaluate the risks of these novel particles to ensure their safe production, handling, use, and disposal. However, today we lack even rudimentary knowledge about the type and quantity of industrially used manufactured nanoparticles and the level of exposure in Swiss industry. The goal of this study was to evaluate the use of nanoparticles, the currently implemented safety measures, and the number of potentially exposed workers in all types of industry. To this end, a targeted telephone survey was conducted among health and safety representatives from 197 Swiss companies. The survey showed that nanoparticles are already used in many industrial sectors; not only in companies in the new field of nanotechnology, but also in more traditional sectors, such as paints. Forty-three companies declared that they use or produce nanoparticles, and 11 that they import and trade prepackaged goods containing nanoparticles. The following nanoparticles were found to be used in considerable quantities (> 1000 kg/year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO. The median reported quantity of handled nanoparticles was 100 kg/year. The production of cosmetics, food, paints, and powders, and the treatment of surfaces, used the largest quantities of these nanoparticles. Generally, the safety measures were found to be stricter in powder-based than in liquid-based applications. However, the respondents had many open questions about best practices, which points to the need for rapid development of guidelines and protection strategies.

Relevance: 30.00%

Abstract:

"Metric Training For The Highway Industry", HR-376 was designed to produce training materials for the various divisions of the Iowa DOT, local government and the highway construction industry. The project materials were to be used to introduce the highway industry in Iowa to metric measurements in their daily activities. Five modules were developed and used in training over 1,000 DOT, county, city, consultant and contractor staff in the use of metric measurements. The training modules developed deal with the planning through operation areas of highway transportation. The materials and selection of modules were developed with the aid of an advisory personnel from the highway industry. Each module is design as a four hour block of instruction and a stand along module for specific types of personnel. Each module is subdivided into four chapters with chapter one and four covering general topics common to all subjects. Chapters two and three are aimed at hands on experience for a specific group and subject. This module includes: Module 2 - Construction and Maintenance Operations and Reporting. This module provides hands on examples of applications of metric measurements in the construction and maintenance field operations.

Relevance: 30.00%

Abstract:

"Metric Training For The Highway Industry", HR-376 was designed to produce training materials for the various divisions of the Iowa DOT, local government and the highway construction industry. The project materials were to be used to introduce the highway industry in Iowa to metric measurements in their daily activities. Five modules were developed and used in training over 1,000 DOT, county, city, consultant and contractor staff in the use of metric measurements. The training modules developed deal with the planning through operation areas of highway transportation. The materials and selection of modules were developed with the aid of an advisory personnel from the highway industry. Each module is design as a four hour block of instruction and a stand along module for specific types of personnel. Each module is subdivided into four chapters with chapter one and four covering general topics common to all subjects. Chapters two and three are aimed at hands on experience for a specific group and subject. This module includes: Module 4 - Transportation Planning and Traffic Monitoring. Hands on examples of applications of metric measurements in the development of planning reports and traffic data collection are included in this module.

Relevance: 30.00%

Abstract:

Ionising radiation (IR) applications are quite common across several areas of knowledge, medicine and industry. Medical X-rays, nuclear medicine, X-rays used in non-destructive testing, and applications in research are a few examples. These radiations originate from radioactive materials or radiation-emitting devices. Radiation protection education and training (E&T) is of paramount importance for working safely in areas that involve the use of IR. The Technical Unit for Radiation Protection at the University of Barcelona has extensive expertise in basic, initial and refresher training, in general or specific areas, as well as in courses validated by the Spanish Nuclear Safety Council or designed to satisfy specific needs as bespoke courses. These specific customer needs are evaluated, and on-site courses can also be carried out.

Relevance: 30.00%

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture design is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet market was formed, with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies, both in software and in hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply systems based on vertically integrated architectures consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production is dependent on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second by researching process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

It is generally accepted that high-density polyethylene (HDPE) pipe performs well under live loads with shallow cover, provided the backfill is well compacted. Although industry standards require carefully compacted backfill, poor inspection and/or faulty construction may result in soils that provide inadequate restraint at the springlines of the pipes, thereby causing failure. The objectives of this study were: 1) to experimentally define a lower limit of compaction under which the pipes perform satisfactorily; 2) to quantify the increase in soil support as compaction effort increases; 3) to evaluate pipe response for loads applied near the ends of the buried pipes; and 4) to determine minimum depths of cover for a variety of pipes and soil conditions by analytically extending the experimental results through the use of the finite element program CANDE.

The test procedures used here are conservative, especially for low-density fills loaded to high contact stresses. The failures observed in these tests were the combined effect of soil bearing capacity at the soil surface and localized wall bending of the pipes. Under a pavement system, the pipes' performance would be expected to be considerably better. With those caveats, the following conclusions are drawn from this study. Glacial till compacted to 50% and 80% provides insufficient support; pipe failure occurs at surface contact stresses lower than those induced by highway trucks. On the other hand, sand backfill compacted to more than 110 pcf (17.3 kN/m^3) is satisfactory. The failure mode for all pipes with all backfills is localized wall bending. At moderate tire pressures, i.e. contact stresses, deflections are reduced significantly when backfill density is increased from about 50 pcf (7.9 kN/m^3) to 90 pcf (14.1 kN/m^3); above that unit weight, little improvement in the soil-pipe system is observed.

Although pipe stiffness may vary by as much as 16%, analyses show that backfill density is more important than pipe stiffness in controlling both deflections at low pipe stresses and the ultimate capacity of the soil-pipe system. The ultimate strength of the system increases nearly linearly with increasing backfill density. When loads equivalent to moderate tire pressures are applied near the ends of the pipes, pipe deflections are slightly higher than when loaded at the center. Except for low-density glacial till, the deflections near the ends are not excessive and the pipes perform satisfactorily. For contact stresses near the upper limit of truck tire pressures and when loaded near the end, pipes fail by localized wall bending. With flowable fill backfill, the ultimate capacity of the pipes is nearly doubled, and at the upper limit of highway truck tire pressures deflections are negligible. All pipe specimens tested at ambient laboratory room temperature satisfied AASHTO minimum pipe stiffness requirements at 5% deflection. However, nearly all specimens tested at elevated pipe surface temperatures, approximately 122°F (50°C), failed to meet these requirements. Some HDPE pipe installations may therefore not meet AASHTO minimum pipe stiffness requirements when installed in the summer months (i.e. if pipe surface temperatures are allowed to reach levels similar to those tested here). Heating of any portion of the pipe circumference reduced the load-carrying capacity of specimens. The minimum soil cover depths, determined from the CANDE analysis, are controlled by the 5% deflection criterion. The minimum soil cover height is 12 in. (305 mm). Pipes with the poor silt and clay backfills at less than 85% compaction require a minimum soil cover height of 24 in. (610 mm). For sand at 80% compaction, the A36 HDPE pipe, which has the lowest moment of inertia, requires a minimum of 24 in. (610 mm) of soil cover; the C48 HDPE pipe, with the largest moment of inertia, and all other pipes require a 12 in. (305 mm) minimum soil cover.
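The abstract quotes backfill unit weights in both pcf and kN/m^3. As a quick arithmetic check (not part of the study itself), the conversions can be verified from the definitions of the pound-force and the foot:

```python
# Unit-conversion check for the backfill unit weights quoted above.
# 1 lbf = 4.4482216152605 N and 1 ft = 0.3048 m, so
# 1 pcf = 4.4482216152605 / 0.3048**3 N/m^3 ≈ 0.15709 kN/m^3.
PCF_TO_KN_M3 = 4.4482216152605 / 0.3048**3 / 1000.0

for pcf, quoted in [(110, 17.3), (90, 14.1), (50, 7.9)]:
    kn_m3 = pcf * PCF_TO_KN_M3
    print(f"{pcf} pcf = {kn_m3:.1f} kN/m^3 (quoted: {quoted} kN/m^3)")
```

All three quoted pairs (110 pcf = 17.3, 90 pcf = 14.1, 50 pcf = 7.9 kN/m^3) agree with the conversion to one decimal place.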

Relevância:

30.00% 30.00%

Publicador:

Resumo:

3 Summary

3.1 English

The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis a new docking software aiming at this goal, EADock, is presented. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.

3.2 French

The recent difficulties of the pharmaceutical industry seem solvable only through the optimization of its drug development process. This optimization increasingly involves so-called high-throughput techniques, which are particularly effective when coupled with computational tools able to manage the mass of data they produce.
In silico approaches such as virtual screening or the rational design of new molecules are now commonly used. Both rely on the ability to predict the details of the molecular interaction between a drug-like molecule and a target protein of therapeutic interest. Benchmarks of the software tackling this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate descriptions of the binding mode. This accuracy is essential for computing the binding free energy, which is directly linked to the affinity of the drug candidate for the target protein and indirectly linked to its biological activity. An accurate prediction is of particular importance for the discovery and optimization of new active molecules. This thesis presents a new software package, EADock, which puts forward such accuracy. This hybrid evolutionary algorithm uses two selection pressures combined with sophisticated diversity management. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was performed on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and unlike the usual benchmarks, the algorithm started from optimized solutions with an RMSD of up to 10 Å from the crystal structure. This validation highlighted the efficiency of our search heuristic, as binding modes with an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes.
When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors can be attributed to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It has also made it possible to describe the interaction of common pollutants with PPARγ, as well as the influence of the metabolization of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptidic ligands of PPARα and of the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
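The validation above scores a pose as correct when its RMSD to the crystal structure is below 2 Å. For readers unfamiliar with the criterion, a minimal sketch of how such an RMSD is evaluated follows; this is illustrative only (EADock itself delegates coordinate handling to CHARMM), and the coordinates are hypothetical:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equally sized lists
    of (x, y, z) atom coordinates in the same units (Å here).
    No superposition is performed: in pose ranking both poses
    already share the crystal-structure reference frame."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical 3-atom ligand; a pose shifted rigidly by 1 Å along x.
crystal = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 1.5, 0.0)]
pose = [(x + 1.0, y, z) for (x, y, z) in crystal]
print(rmsd(crystal, pose))        # 1.0
print(rmsd(crystal, pose) < 2.0)  # True: meets the 2 Å criterion
```

A rigid 1 Å translation of every atom gives an RMSD of exactly 1 Å, so this pose would count as a correct binding mode under the benchmark's definition.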

Relevância:

30.00% 30.00%

Publicador:

Resumo:

A power electronic device is a control and regulation system that converts electricity from its available form into a desired new form while managing the flow of electric power from the source to the point of use. This differs from signal electronics, where electricity is typically used to transfer information by exploiting different states. Power electronic devices are usually compared in terms of reliability, size, efficiency, control accuracy and, of course, price. Typical power electronic devices include frequency converters, UPS (Uninterruptible Power Supply) units, welding machines, induction heaters and various power supplies. Traditionally, the control of these devices has been implemented using microprocessors, ASICs (Application Specific Integrated Circuits), ICs (Integrated Circuits) and analog controllers. This study analyzes the suitability of FPGAs (Field Programmable Gate Arrays) for the control of power electronics. The structure of an FPGA consists of various logic elements and the interconnects between them. The logic elements are gate circuits and flip-flops. The interconnects and logic elements are fixed in the chip; their composition and number cannot be changed afterwards. Programmability arises from the connections between the elements: the chip contains numerous, even millions of, switches whose state can be set, so that a countless number of different functional configurations can be built from the chip's basic elements. FPGAs have long been used in communications products, and their development has therefore been rapid in recent years, while prices have dropped. As a result, FPGAs have become an interesting alternative for the control of power electronic devices as well. In this doctoral work, the suitability of FPGAs was studied using two demanding and different practical power electronic devices: a frequency converter and a welding machine.
Suitable prototypes of both test cases were built together with Finnish industrial companies in the field, and their control electronics were converted to an FPGA-based implementation. In addition, new types of control methods exploiting this new technology were developed. The operation of the prototypes was compared with corresponding commercial products controlled by conventional methods, and the benefits brought by the parallel computation enabled by FPGAs were observed in the operation of both power electronic devices. The work also presents new methods and tools for the development and testing of FPGA-based control systems; with the presented methods, product development can be made as fast and efficient as possible. Furthermore, the work develops an FPGA-internal control and communication bus structure serving the control applications of power electronic devices. The new communication structure also promotes the reusability of already implemented subsystems in future applications and product generations.
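The abstract credits FPGA parallelism for the improved behavior of both prototypes. A typical per-clock-tick workload in a frequency converter is carrier-comparison PWM, where each phase leg's comparator evaluates simultaneously in hardware. The sketch below models one tick of such logic in Python purely for illustration; the carrier period and duty values are hypothetical, and the thesis's actual control methods are not described at this level of detail:

```python
def triangle_carrier(count, period):
    """Normalized (0.0..1.0) value of a triangle carrier with the
    given period in clock ticks, at tick number `count`."""
    half = period // 2
    phase = count % period
    level = phase if phase < half else period - phase
    return level / half

def pwm_outputs(count, duties, period=1000):
    """One 'clock tick' of carrier-comparison PWM: each phase leg
    drives high while its duty reference exceeds the shared
    triangle carrier. In FPGA fabric all comparisons evaluate in
    parallel every tick; here they are a list comprehension."""
    carrier = triangle_carrier(count, period)
    return [duty > carrier for duty in duties]

# Three-phase duty references (hypothetical values).
duties = [0.25, 0.50, 0.75]
print(pwm_outputs(0, duties))    # carrier at 0.0 -> [True, True, True]
print(pwm_outputs(250, duties))  # carrier at 0.5 -> [False, False, True]
```

On an FPGA this comparison network runs every clock cycle with deterministic latency, which is the kind of parallel, tightly timed computation the thesis found advantageous over sequential microprocessor control.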