957 results for VeriStand, Custom devices, Hardware in the loop, LabView, FPGA, ECU


Relevância:

100.00%

Publicador:

Resumo:

In recent decades the automotive sector has undergone a technological revolution, driven mainly by increasingly restrictive regulation, by newly introduced technologies and, lastly, by the dwindling fossil-fuel resources remaining on Earth. Promising solutions for vehicle propulsion are alternative architectures and energy sources, for example fuel-cell and pure electric vehicles. The automotive transition to new, green vehicles is passing through the development of hybrid vehicles, which usually combine the positive aspects of each technology. To fully exploit the potential of hybrid vehicles, however, it is important to manage the powertrain's degrees of freedom in the smartest way possible; otherwise hybridization would be worthless. To this end, this dissertation focuses on the development of energy management strategies and predictive control functions. Such algorithms aim to increase the powertrain's overall efficiency while also increasing driver safety. These control algorithms have been applied to an axle-split Plug-in Hybrid Electric Vehicle with a complex architecture that allows more than one driving mode, including a pure electric one. Three energy management strategies are investigated: the vehicle's baseline heuristic controller, referred to in the following as the rule-based controller; a sub-optimal controller that can also include predictive functionalities, referred to as the Equivalent Consumption Minimization Strategy; and a global-optimum control technique, called Dynamic Programming, which also includes the high-voltage battery thermal management. During this project, different modelling approaches were applied to the powertrain, including Hardware-in-the-loop, and several high-level powertrain controllers were developed and implemented, increasing in complexity at each step.
The potential of sophisticated powertrain control techniques has been demonstrated, and the attainable fuel-economy benefits are shown to be largely influenced by the chosen energy management strategy, even for the powerful vehicle investigated.
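Of the three strategies above, the Equivalent Consumption Minimization Strategy lends itself to a compact illustration: at each instant, battery power is charged with an equivalence factor s and the power split minimizing the total equivalent fuel rate is selected. The sketch below is a minimal toy version; the affine fuel map, the candidate grid and the equivalence factors are invented for illustration, not taken from the dissertation's vehicle model.

```python
def ecms_split(p_dem, s_eq, p_batt_candidates, engine_fuel_rate):
    """Return the (battery power, cost) pair minimizing equivalent fuel use.

    p_dem             -- driver power demand [kW]
    s_eq              -- equivalence factor converting battery power to fuel [g/s per kW]
    p_batt_candidates -- discrete battery powers to evaluate [kW]
    engine_fuel_rate  -- engine fuel consumption [g/s] as a function of engine power [kW]
    """
    best = None
    for p_batt in p_batt_candidates:
        p_eng = p_dem - p_batt          # the engine covers the rest of the demand
        if p_eng < 0:
            continue                    # skip infeasible splits in this simple sketch
        cost = engine_fuel_rate(p_eng) + s_eq * p_batt
        if best is None or cost < best[1]:
            best = (p_batt, cost)
    return best

# Toy affine fuel map with an idle penalty; a switched-off engine burns nothing.
fuel = lambda p: 0.0 if p <= 0 else 0.1 * p + 1.0

# A low equivalence factor favours electric driving, a high one favours recharging.
print(ecms_split(30.0, 0.05, [-10, 0, 10, 20, 30], fuel))  # pure-electric split wins
print(ecms_split(30.0, 0.20, [-10, 0, 10, 20, 30], fuel))  # load-point raising wins
```

Tuning s_eq online (e.g. from a state-of-charge feedback or a predicted route) is what turns this instantaneous rule into the predictive variant mentioned in the abstract.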


The development of an all-optical communications infrastructure requires appropriate optical switching devices and supporting hardware. This thesis presents several novel fibre lasers which are useful pulse sources for high-speed optical data processing and communications. They share several attributes: flexibility, stability and low-jitter output. They all produce short (picosecond) pulses and are suitable as sources for soliton systems. The lasers are all-fibre systems using erbium-doped fibre for gain, and are actively mode-locked using a dual-wavelength nonlinear optical loop mirror (NOLM) as a modulator. Control over the operating wavelength and intra-cavity dispersion is obtained using a chirped in-fibre Bragg grating. Systems operating both at 76 MHz and at gigahertz frequencies are presented, the latter using a semiconductor laser amplifier to enhance nonlinear action in the loop mirror. A novel dual-wavelength system in which two linear cavities share a common modulator is presented, with results showing that the jitter between the two wavelengths is low enough for use in switching experiments at data rates of up to 130 Gbit/s.


Cystic Fibrosis (CF) is a degenerative disease that causes deterioration of the lungs due to impaired mucociliary clearance (MCC). The volume of the airway surface liquid (ASL) covering lung cells is essential for mucus clearance and for fighting infections. Extracellular nucleotides play an important role in airway MCC by modifying the volume of the pulmonary ASL. However, the mechanisms of ATP release and its movement across the ASL remain unknown. Previous studies have shown that mechanosensitive, Ca2+-dependent ATP exocytosis in A549 cells is amplified by synergistic autocrine/paracrine actions of neighbouring cells. Our goals were to confirm the presence of the purinergic loop in several epithelial cell models and to develop a system allowing us to observe the ASL directly. We demonstrated that the purinergic loop is functional in the epithelial cell models examined, except for Calu-3 cells. Using modulators of purinergic signalling, we observed that ATP release, as well as the increase in [Ca2+]i following a hypotonic stress, is modulated through this purinergic loop and through P2Y receptors. In addition, we developed a microscopy system that allows ASL volume changes to be observed in real time. Our system controls the temperature and humidity of the cells' environment, reproducing the environment of the human lung. We demonstrated that our system can detect even small changes in ASL volume.


Owing to the significant increase in population and its natural desire to improve its standard of living, the use of energy extracted from world commodities, especially in the form of electricity, has increased intensely over the last decades. This raises a challenge with no easy solution: how to guarantee that there will be enough energy to satisfy the energy demand of the world population. Among the possible mitigations, one is almost mandatory: rationalizing energy use, so that waste is minimized and the available energy can be leveraged over a longer period of time. One way to achieve this is to improve the power distribution grid so that it can react more efficiently to common issues, such as energy demand peaks or inaccurate electricity consumption forecasts. However, implementing this improvement requires technologies from the ICT (Information and Communication Technologies) sphere that often present challenges in key areas: advanced metering infrastructure integration, interoperability and interconnectivity of devices, interfaces offered to applications, design of security measures, etc. All these challenges may slow the adoption of the smart grid as a system to prolong the lifespan and utilization of the available energy. This Master's Thesis puts forward a proposal for an intermediation architecture that makes it possible to solve these challenges. An implementation, together with the tests carried out to assess the performance of the presented concepts, is also included, showing that the challenges set out by the smart grid can be resolved.
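One recurring interoperability challenge named above, heterogeneous metering devices speaking different protocols, is commonly handled by an intermediation layer that normalizes every device behind a single interface. The sketch below shows a plain adapter pattern as a generic illustration of that idea; it is not the thesis's architecture, and every class, method and conversion factor is hypothetical.

```python
class MeterAdapter:
    """Uniform interface the intermediation layer expects from any meter."""
    def read_watts(self):
        raise NotImplementedError

class LegacyPulseMeter:
    """Hypothetical device reporting pulses per interval."""
    def pulses(self):
        return 45          # canned value standing in for real device I/O

class ModbusMeter:
    """Hypothetical device reporting deciwatts via a holding register."""
    def holding_register(self):
        return 12500       # canned value standing in for real device I/O

class PulseAdapter(MeterAdapter):
    WATTS_PER_PULSE = 10.0   # hypothetical calibration constant
    def __init__(self, dev):
        self.dev = dev
    def read_watts(self):
        return self.dev.pulses() * self.WATTS_PER_PULSE

class ModbusAdapter(MeterAdapter):
    def __init__(self, dev):
        self.dev = dev
    def read_watts(self):
        return self.dev.holding_register() / 10.0   # deciwatts -> watts

def grid_snapshot(adapters):
    """The intermediation layer aggregates without knowing device details."""
    return sum(a.read_watts() for a in adapters)

fleet = [PulseAdapter(LegacyPulseMeter()), ModbusAdapter(ModbusMeter())]
print(grid_snapshot(fleet))  # 450.0 + 1250.0 = 1700.0
```

New device families then only require a new adapter, not changes to the applications consuming the normalized interface.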


This paper proposes an optimal sensitivity approach applied in the tertiary loop of automatic generation control. The approach is based on the theorem of non-linear perturbation: from an optimal operating point obtained by an optimal power flow, a new optimal operating point is determined directly after a perturbation, i.e., without the need for an iterative process. This new operating point satisfies the constraints of the problem for small perturbations in the loads. The participation factors and the voltage set points of the generators' automatic voltage regulators (AVR) are determined by the optimal sensitivity technique, considering the minimization of active power losses and the network constraints. The participation factors and voltage set points of the generators are supplied directly to a computational program for dynamic simulation of automatic generation control, referred to as the power sensitivity mode. Test results are presented to show the good performance of this approach.
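The core idea, obtaining the post-perturbation optimum directly instead of re-running the optimization, can be illustrated on an equality-constrained dispatch problem, where the KKT system that gives the optimum also gives its sensitivity to a load change. The quadratic costs below are invented for illustration and are not the paper's test system.

```python
import numpy as np

def dispatch(Q, a, d):
    """Minimize 0.5 x^T Q x subject to a^T x = d by solving the KKT system."""
    n = len(a)
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = Q
    K[:n, n] = a
    K[n, :n] = a
    rhs = np.zeros(n + 1)
    rhs[n] = d
    return np.linalg.solve(K, rhs)[:n]

Q = np.diag([2.0, 4.0, 8.0])      # hypothetical quadratic generation costs
a = np.ones(3)                    # power balance: x1 + x2 + x3 = d
d0, delta = 90.0, 5.0             # base load and load perturbation

x0 = dispatch(Q, a, d0)           # base optimal operating point

# Optimal sensitivity: dx*/dd solves the same KKT matrix with rhs (0, ..., 0, 1),
# so the new optimal point follows from one linear solve, no iterative re-dispatch.
n = len(a)
K = np.zeros((n + 1, n + 1))
K[:n, :n] = Q
K[:n, n] = a
K[n, :n] = a
e = np.zeros(n + 1)
e[n] = 1.0
sens = np.linalg.solve(K, e)[:n]

x_pred = x0 + sens * delta        # sensitivity-based update
x_true = dispatch(Q, a, d0 + delta)
print(np.allclose(x_pred, x_true))
```

For this quadratic/linear model the sensitivity update is exact; for the non-linear power-flow setting of the paper it holds to first order in the perturbation, which is why small load perturbations are assumed.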


In this work we studied the mixture of poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS), a commercial polymer, with monobasic potassium phosphate (KDP), a piezoelectric salt, as a possible novel material for the fabrication of a low-cost, easy-to-make, flexible pressure-sensing device. The mixture of KDP and PEDOT:PSS was painted onto a flexible polyester substrate and dried. Afterwards, I-V curves were measured. The samples containing KDP showed higher currents at lower voltages than PEDOT:PSS without KDP, which may indicate a change in the chain arrangement. Further results showed that the material responds to pressure applied directly to the sample, which can be useful for sensor fabrication.


Mutations in the extracellular M2-M3 loop of the glycine receptor (GlyR) alpha1 subunit have been shown previously to affect channel gating. In this study, the substituted cysteine accessibility method was used to investigate whether a structural rearrangement of the M2-M3 loop accompanies GlyR activation. All residues from R271C to V277C were covalently modified by both positively charged methanethiosulfonate ethyltrimethylammonium (MTSET) and negatively charged methanethiosulfonate ethylsulfonate (MTSES), implying that these residues form an irregular surface loop. The MTSET modification rate of all residues from R271C to K276C was faster in the glycine-bound state than in the unliganded state. MTSES modification of A272C, L274C, and V277C was also faster in the glycine-bound state. These results demonstrate that the surface accessibility of the M2-M3 loop is increased as the channel transitions from the closed to the open state, implying that either the loop itself or an overlying domain moves during channel activation.


Recent evidence has emerged that peroxisome proliferator-activated receptor alpha (PPARalpha), which is largely involved in lipid metabolism, can play an important role in connecting circadian biology and metabolism. In the present study, we investigated the mechanisms by which PPARalpha influences the pacemakers acting in the central clock, located in the suprachiasmatic nucleus, and in the peripheral oscillator of the liver. We demonstrate that PPARalpha plays a specific role in peripheral circadian control because it is required to maintain the circadian rhythm of the master clock gene brain and muscle Arnt-like protein 1 (bmal1) in vivo. This regulation occurs via direct binding of PPARalpha to a potential PPARalpha response element located in the bmal1 promoter. Conversely, BMAL1 is an upstream regulator of PPARalpha gene expression. We further demonstrate that fenofibrate induces circadian rhythms of clock gene expression in cell culture and up-regulates hepatic bmal1 in vivo. Together, these results provide evidence for an additional regulatory feedback loop involving BMAL1 and PPARalpha in peripheral clocks.


Glucose control is the cornerstone of Diabetes Mellitus (DM) treatment. Although self-regulation using capillary glycemia (SRCG) still remains the best procedure in clinical practice, continuous glucose monitoring (CGM) systems offer the possibility of continuous and dynamic assessment of interstitial glucose concentration. CGM systems have the potential to improve glycemic control while decreasing the incidence of hypoglycemia, but their efficacy compared with SRCG is still debated. CGM systems have the greatest potential value in patients with hypoglycemia unawareness and in controlling daily fluctuations in blood glucose. The implementation of continuous monitoring in the standard clinical setting has not yet been established, but a new generation of open- and closed-loop subcutaneous insulin infusion devices is emerging, making insulin treatment and glycemic control more reliable.
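The closed-loop devices mentioned above couple a CGM sensor to an infusion pump through a feedback controller. As a toy illustration only, real artificial-pancreas controllers are far more sophisticated and every gain, target and unit below is invented, a single proportional-integral update of the infusion rate might look like this:

```python
def pi_insulin_rate(glucose, target, integral, kp, ki, dt, basal):
    """One PI controller step: return (infusion rate, updated integral term).

    All parameters are hypothetical illustration values, not clinical settings.
    """
    error = glucose - target            # positive when glucose is too high
    integral += error * dt              # accumulate error over the time step
    rate = basal + kp * error + ki * integral
    return max(0.0, rate), integral     # a pump cannot infuse negative insulin

# Hypothetical numbers: a 180 mg/dL reading against a 110 mg/dL target.
rate, i_term = pi_insulin_rate(180.0, 110.0, 0.0, kp=0.01, ki=0.001, dt=5.0, basal=1.0)
print(rate)   # well above the 1.0 basal while glucose stays high
```

The clipping at zero is one reason practical controllers need anti-windup logic: the integral term keeps growing while the actuator is saturated.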


Abstract: The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: this is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and of the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not collected systematically, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industry. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at a better understanding of nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entry pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the application of specific protection means in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the pilot-study results.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on methods to measure nanoparticles. Several pre-studies were conducted on the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy when measuring nanoparticle agglomerates and because, at the same time, the two surveys revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), a portable aerosol spectrometer (PAS, a device to estimate the aerodynamic diameter), as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 SUVA clients.
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. Extrapolating from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval: 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were found between diffusion size classifiers and CPC/SMPS.
- The comparison between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better but, depending on the concentration, size or type of the powders measured, the differences were still of a high order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes handling the background concentration difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of process-related exposure.
Conclusion: the surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them. The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important, because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods for measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin.
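The extrapolation step reported above, scaling a surveyed count up to the whole Swiss production sector with a 95% confidence interval, follows the usual normal approximation for a proportion. A generic sketch with made-up inputs (the survey's raw counts are not given in this abstract, so the numbers below are purely illustrative):

```python
import math

def extrapolate(k, n, population, z=1.96):
    """Scale k affected out of n surveyed up to a population, with a Wald 95% CI."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)      # standard error of the sample proportion
    lo = max(0.0, p - z * se)            # clamp: a proportion cannot go below 0
    hi = min(1.0, p + z * se)
    return p * population, lo * population, hi * population

# Hypothetical: 12 nanoparticle-using firms among 1626 respondents,
# extrapolated to a sector of 100000 firms.
est, lo, hi = extrapolate(12, 1626, 100_000)
print(round(est), round(lo), round(hi))
```

For counts this small the Wald interval is only a rough approximation; an exact binomial or Wilson interval would be the more careful choice.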


To compare the prediction of hip fracture risk by several quantitative bone ultrasound (QUS) devices, 7062 Swiss women ≥70 years of age were measured with three QUS devices (two of the heel, one of the phalanges). Both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS was not. INTRODUCTION: As the number of hip fractures is expected to increase during the next decades, it is important to develop strategies to detect subjects at risk. Quantitative bone ultrasound (QUS), an ionizing-radiation-free and transportable method, could be of interest for this purpose. MATERIALS AND METHODS: The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk (SEMOF) study is a multicenter cohort study that compared three QUS devices for the assessment of hip fracture risk in a sample of 7609 elderly ambulatory women ≥70 years of age. Two QUS devices measured the heel (Achilles+, GE-Lunar; and Sahara, Hologic), and one measured the phalanges (DBM Sonic 1200, IGEA). Cox proportional hazards regression was used to estimate the hazard of a first hip fracture, adjusted for age, BMI, and center, and the areas under the ROC curves were calculated to compare the devices and their parameters. RESULTS: Of the 7609 women included in the study, 7062 women aged 75.2 +/- 3.1 (SD) years were prospectively followed for 2.9 +/- 0.8 years. Eighty women reported a hip fracture. A decrease of 1 SD in the QUS variables corresponded to an increase in hip fracture risk from 2.3 (95% CI, 1.7, 3.1) to 2.6 (95% CI, 1.9, 3.4) for the three variables of Achilles+ and from 2.2 (95% CI, 1.7, 3.0) to 2.4 (95% CI, 1.8, 3.2) for the three variables of Sahara. Risk gradients did not differ significantly among the variables of the two heel QUS devices. On the other hand, the phalanges QUS device (DBM Sonic 1200) was not predictive of hip fracture risk, with an adjusted hazard ratio of 1.2 (95% CI, 0.9, 1.5), even after reanalysis of the digitized data using different cut-off levels (1700 or 1570 m/s). CONCLUSIONS: In this population of elderly women, both heel QUS devices were predictive of hip fracture risk, whereas the phalanges QUS device was not.
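The risk gradients quoted above are hazard ratios per 1-SD decrease of a QUS variable, i.e. the Cox model coefficient rescaled by the variable's standard deviation. A small sketch of that conversion follows; the coefficient, standard error and SD are invented for illustration and are not the SEMOF estimates.

```python
import math

def hr_per_sd_decrease(beta, se, sd, z=1.96):
    """Hazard ratio (with 95% CI) for a one-SD *decrease* of a Cox predictor.

    beta -- fitted log-hazard coefficient per unit of the predictor
    se   -- its standard error
    sd   -- standard deviation of the predictor in the cohort
    """
    hr = math.exp(-beta * sd)                 # sign flipped: risk per SD *lost*
    lo = math.exp((-beta - z * se) * sd)
    hi = math.exp((-beta + z * se) * sd)
    return hr, min(lo, hi), max(lo, hi)

# Invented example: beta = -0.0833 per unit (lower QUS value -> higher hazard),
# SE = 0.015, cohort SD = 10 units.
hr, lo, hi = hr_per_sd_decrease(-0.0833, 0.015, 10.0)
print(round(hr, 2))  # each SD lost multiplies the hazard by roughly 2.3
```

The min/max on the interval bounds keeps the CI ordered regardless of the coefficient's sign.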


It is necessary to use highly specialized robots in ITER (International Thermonuclear Experimental Reactor), both in the manufacturing and in the maintenance of the reactor, due to the demanding environment. The sectors of the ITER vacuum vessel (VV) require more stringent tolerances than normally expected for a structure of this size. The VV consists of nine sectors that are to be welded together and has a toroidal chamber structure. The task of the designed robot is to carry the welding apparatus along a path with a stringent tolerance during the assembly operation. In addition to the initial vacuum vessel assembly, sectors need to be replaced for repair after a limited running period. Mechanisms with closed-loop kinematic chains are used in the design of the robots in this work: one version is a purely parallel manipulator, and another is a hybrid manipulator in which parallel and serial structures are combined. Traditional industrial robots, which generally have their links actuated in series, are inherently not very rigid and have poor dynamic performance under high-speed and high dynamic loading conditions. Compared with open-chain manipulators, parallel manipulators have high stiffness, high accuracy and a high force/torque capacity in a reduced workspace. Parallel manipulators have a mechanical architecture in which all of the links are connected both to the base and to the end-effector of the robot. The purpose of this thesis is to develop special parallel robots for the assembly, machining and repair of the VV of ITER, a process that requires a special robot. By studying the structure of the vacuum vessel, two novel parallel robots were designed and built, with six and ten degrees of freedom, driven by hydraulic cylinders and electrical servo motors. Kinematic models for the proposed robots were defined and two prototypes built.
Machining and laser-welding experiments with the 6-DOF robot were carried out. It was demonstrated that the parallel robots are capable of holding all the necessary machining tools and welding end-effectors accurately and stably in all positions inside the vacuum vessel sector. The kinematic models proved to be complex, especially in the case of the 10-DOF robot because of its redundant structure. Multibody dynamics simulations were carried out, ensuring sufficient stiffness during robot motion. The entire design and testing process of the robots was a complex task due to the highly specialized manufacturing technology needed in the ITER reactor, but the results demonstrate the applicability of the proposed solutions quite well. The results offer not only devices but also a methodology for the assembly and repair of ITER by means of parallel robots.
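Unlike serial arms, a parallel manipulator's inverse kinematics is typically closed-form: given the platform pose, each actuator length is simply the distance between its base anchor and its platform anchor. A minimal planar sketch of that computation follows; the anchor geometry is invented and has nothing to do with the ITER robots' actual dimensions.

```python
import math

def leg_lengths(pose, base_pts, plat_pts):
    """Inverse kinematics of a planar parallel mechanism.

    pose     -- platform pose (x, y, theta) in the base frame
    base_pts -- fixed anchor point of each leg, in the base frame
    plat_pts -- anchor point of each leg, in the platform frame
    """
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    lengths = []
    for (bx, by), (px, py) in zip(base_pts, plat_pts):
        wx = x + c * px - s * py        # platform anchor expressed in the base frame
        wy = y + s * px + c * py
        lengths.append(math.hypot(wx - bx, wy - by))
    return lengths

base = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]     # hypothetical fixed anchors
plat = [(-0.5, -0.5), (0.5, -0.5), (0.0, 0.5)]  # hypothetical platform anchors
print(leg_lengths((2.0, 1.5, 0.0), base, plat))  # actuator strokes for one pose
```

The forward problem (pose from leg lengths) is the hard direction for parallel mechanisms, which is one reason the kinematic models of the redundant 10-DOF robot were reported as complex.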


Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.


The focus of this work is to provide authentication and confidentiality of messages in a swift and cost-effective manner, to suit fast-growing Internet applications. A nested hash function with low computational and storage demands is designed to provide authentication, while the message as well as the hash code are encrypted using a fast stream cipher, MAJE4, with a variable key size of 128 or 256 bits to achieve confidentiality. Both the nested hash function and the MAJE4 stream cipher algorithm use primitive computational operators commonly found in microprocessors, which makes the method simple and fast to implement in both hardware and software. Since the memory requirement is low, it can also be used on handheld devices for security purposes.
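The abstract does not specify MAJE4's internals or the exact nested hash design, so the sketch below only illustrates the overall shape of such a scheme: a keystream cipher for confidentiality plus a nested-hash tag for authentication. A SHA-256 counter-mode keystream stands in for MAJE4, and Python's HMAC (itself a standard nested hash construction) stands in for the paper's hash; keys and nonces are toy values.

```python
import hashlib
import hmac

def keystream(key, nonce, length):
    """Counter-mode keystream from SHA-256 (a stand-in for the MAJE4 cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(enc_key, mac_key, nonce, message):
    """Encrypt by XOR with the keystream, then tag the ciphertext with HMAC."""
    ct = bytes(m ^ k for m, k in zip(message, keystream(enc_key, nonce, len(message))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def open_sealed(enc_key, mac_key, nonce, ct, tag):
    """Verify the tag in constant time, then decrypt; raise on tampering."""
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

ct, tag = seal(b"enc-key", b"mac-key", b"nonce-01", b"attack at dawn")
print(open_sealed(b"enc-key", b"mac-key", b"nonce-01", ct, tag))  # b'attack at dawn'
```

Authenticating the ciphertext rather than the plaintext (encrypt-then-MAC) lets the receiver reject tampered messages before any decryption work, which matches the low-cost goal stated above.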