Abstract:
The main objective of this work was to study the possibilities of implementing laser cutting in a paper-making machine. Laser cutting technology was considered as a replacement for the conventional methods used in paper-making machines for longitudinal cutting, such as edge trimming at different stages of the paper-making process and tambour roll slitting. Laser cutting of paper was first tested in the 1970s; since then, laser cutting and processing have been applied to paper materials with varying degrees of industrial success. Laser cutting can be employed for longitudinal cutting of the paper web in the machine direction. The most common conventional methods are water jet cutting and rotating slitter blades. Cutting with a CO2 laser fulfils the basic requirements for cut quality, applicability to the material and cutting speed in all locations where longitudinal cutting is needed. A literature review described the advantages, disadvantages and challenges of laser technology applied to cutting paper material, with particular attention to cutting a moving paper web. Based on the studied laser cutting capabilities and on the problems of the conventional cutting technologies, a preliminary selection of the most promising application area was carried out. Laser cutting (trimming) of the paper web edges at the wet end was estimated to be the most promising area for implementation. This assessment was based on the rate of web breaks: up to 64% of all web breaks were found to occur at the wet end, particularly at the so-called open draws, where the paper web is transferred unsupported by wire or felt. The distribution of web breaks in the machine cross direction revealed that defects at the paper web edge were the main cause of tear initiation and the consequent web break. It was assumed that laser cutting could improve the tensile strength of the cut edge thanks to the high cutting quality and the sealing effect on the edge after cutting; studies of laser ablation of cellulose support this claim. The linear energy needed for cutting was calculated with regard to the paper web properties at the intended cutting location, and the calculated values were verified with a series of laser cutting trials. The laser energy found to be needed in practice deviated from the calculated values, which can be explained by differences in radiative heat transfer during laser cutting and by the different absorption characteristics of dry and moist paper. Laser-cut samples (both dry and moist, with a dry matter content of about 25-40%) were tested for strength properties. Tensile strength and strain at break of the laser-cut samples were shown to be similar to the corresponding values of non-laser-cut samples. The chosen method, however, did not address the tensile strength of the laser-cut edge in particular, so the assumption that laser cutting improves strength properties was not fully proved. The effect of laser cutting on possible contamination of mill broke (when the trimmed edge is recycled) was also examined: laser-cut samples, both dry and moist, were tested for dirt particle content. The tests revealed that dust particles can accumulate on the surface of moist samples, which has to be taken into account to prevent contamination of the pulp suspension when trim waste is recycled. Material loss due to evaporation during laser cutting and the amount of solid residue after cutting were evaluated.
Edge trimming with laser would produce about 0.25 kg/h of solid residue and 2.5 kg/h of material lost to evaporation. Implementation schemes and the required laser equipment were discussed. In general, a laser cutting system would require two laser sources (one per cutting zone), a set of beam-delivery and focusing optics, and cutting heads. To increase system reliability, it was suggested that each laser source have double capacity, which would allow cutting to continue with a single laser source working at full capacity for both cutting zones. Laser technology is already at the required level and needs no additional development; moreover, the headroom for speed increases is large thanks to the availability of high-power laser sources, which supports the trend of increasing paper machine speeds. The laser cutting system would require a special roll to support cutting; a scheme for such a roll was proposed, together with its integration into the paper-making machine. Laser cutting can be performed at the central roll of the press section, before the so-called open draw where many web breaks occur, where it has the potential to improve the runnability of the paper-making machine. The economic performance of laser cutting was assessed by comparing a laser cutting system with a water jet cutting system working under the same conditions. Laser cutting turned out to be roughly twice as expensive as water jet cutting, mainly because of the high investment cost of the laser equipment and the poor energy efficiency of CO2 lasers; another factor is that laser cutting causes material loss through evaporation, whereas water jet cutting causes almost none. Despite the difficulties of implementing laser cutting in a paper-making machine, its implementation can still be beneficial, above all because of the possibility of improving the strength properties of the cut edge and consequently reducing the number of web breaks. The ability of laser cutting to sustain cutting speeds exceeding the current speeds of paper-making machines is another argument for considering laser cutting technology in the design of new high-speed paper-making machines.
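The linear-energy and evaporation-loss estimates mentioned above can be illustrated with a short sketch. The thesis' own formulas are not reproduced here; the sketch uses the standard laser-cutting relation that linear energy (J/m) is laser power divided by cutting speed, and all numerical inputs (laser power, web speed, kerf width, moist basis weight, evaporated fraction) are assumed illustrative values, not figures from the work.

```python
# Illustrative sketch only; the thesis' actual calculation is not reproduced here.
# Standard relation: linear energy E' (J/m) = laser power P (W) / cutting speed v (m/s).

def linear_cutting_energy(power_w: float, speed_m_s: float) -> float:
    """Linear energy input in J/m for a laser of given power at a given web speed."""
    return power_w / speed_m_s

def evaporation_loss_kg_h(kerf_m: float, speed_m_s: float,
                          basis_weight_kg_m2: float, evaporated_fraction: float) -> float:
    """Mass removed from the kerf per hour, assuming a fraction of it evaporates."""
    kerf_area_m2_per_s = kerf_m * speed_m_s            # web area removed per second
    mass_per_s = kerf_area_m2_per_s * basis_weight_kg_m2
    return mass_per_s * evaporated_fraction * 3600.0

if __name__ == "__main__":
    P = 400.0   # W, assumed laser power
    v = 25.0    # m/s, assumed web speed (1500 m/min)
    print(f"linear energy: {linear_cutting_energy(P, v):.1f} J/m")
    # two trim edges; assumed 0.3 mm kerf, 50 g/m2 moist basis weight, 90% evaporated
    loss = 2 * evaporation_loss_kg_h(0.3e-3, v, 0.050, 0.9)
    print(f"evaporation loss: {loss:.2f} kg/h")
```

With these assumed inputs the loss evaluates to roughly 2.4 kg/h for two trim edges, the same order of magnitude as the 2.5 kg/h quoted above; the agreement is illustrative, not a validation.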
Abstract:
This work is a literature review of the past, present and future of digital high-definition resolution. It also surveys the terms found in various media and seeks to clarify what these terms mean to the reader. The work was carried out for the Department of Information Technology at Lappeenranta University of Technology. The history of high-definition television begins as early as the 1960s, and the first high-definition resolutions were developed in the 1970s. As resolution grows, the size of the image file and the required storage space grow as well, which creates new challenges for transmission, compression and receiver technologies, among others. 4K resolutions are already here, but what will become of 16K resolution? Is a very high resolution useful, for example, in Virtual Reality applications?
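To make the storage growth mentioned above concrete, here is a small sketch (not from the thesis) that computes raw, uncompressed frame sizes for common high-definition resolutions. The pixel dimensions are the usual consumer/UHD standards, and 24-bit colour is assumed; real broadcast chains use compression, so these are upper bounds.

```python
# Raw (uncompressed) frame size = width * height * bytes per pixel.
# Assumes 24-bit colour (3 bytes/pixel).

RESOLUTIONS = {
    "Full HD (1080p)": (1920, 1080),
    "4K UHD":          (3840, 2160),
    "8K UHD":          (7680, 4320),
    "16K":             (15360, 8640),
}

for name, (w, h) in RESOLUTIONS.items():
    megabytes = w * h * 3 / 1e6
    print(f"{name:>16}: {w * h / 1e6:6.1f} Mpx, {megabytes:8.1f} MB per raw frame")
```

Each step quadruples the pixel count, so a raw 16K frame (about 398 MB) is 64 times larger than a Full HD frame (about 6 MB), which is why transmission, compression and receiver technologies face new challenges.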
Abstract:
A new solid-phase extraction (SPE) method coupled with an ultra-fast analysis technique was developed for the simultaneous determination of nine emerging contaminants (atrazine, desethylatrazine, 17β-estradiol, ethinylestradiol, norethindrone, caffeine, carbamazepine, diclofenac and sulfamethoxazole) belonging to different therapeutic classes and present in wastewater. Sample pre-concentration and clean-up were performed with a mixed-mode SPE cartridge (Strata ABW) having both cation- and anion-exchange properties, followed by analysis by laser diode thermal desorption/atmospheric pressure chemical ionization coupled to tandem mass spectrometry (LDTD-APCI-MS/MS). LDTD is a new sample-introduction method that reduces the total analysis time to less than 15 seconds, compared with several minutes for traditional liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS). Several SPE parameters were evaluated in order to optimize recovery during extraction of the analytes from wastewater, such as the nature of the stationary phase, the loading flow rate, the extraction pH, the volume and composition of the washing solution, and the initial sample volume. This new method was successfully applied to real wastewater samples from a primary settling tank. Recovery of the target compounds from wastewater ranged from 78 to 106%, detection limits were 30 to 122 ng L⁻¹, and quantification limits were 88 to 370 ng L⁻¹. Calibration curves in wastewater matrices showed good linearity (R² > 0.991) for the target analytes, as well as good precision, with coefficients of variation below 15%.
Abstract:
The literature addressing the socio-ethical and regulatory issues associated with drugs is relatively abundant, which is not the case for medical devices (MDs). The latter sector covers a very wide diversity of products serving multiple applications: diagnosis, treatment, management of the symptoms of certain physical or psychiatric conditions, restoration of a debilitated function, surgery, and so on. It is wrongly assumed that MDs are regulated in the same way as drugs, whether in terms of the requirements for market approval or of post-market surveillance practices. In recent years, however, their expanded use, their impact on healthcare costs, and the major recalls some have undergone have begun to worry the medical community and many researchers, who are calling on regulatory authorities to exercise greater vigilance, both in the pre-market evaluation of new high-risk MDs and in post-market surveillance practices. A more rigorous strategy for evaluating new MDs would make it possible to better monitor the risks associated with their use, to grasp the scope of the various socio-ethical issues arising from the use of certain MDs, and to preserve public trust. It should be noted from the outset that national authorities have no mandate to assess the scope of socio-ethical issues, or the costs of MDs submitted for market approval; their evaluation is essentially based on an analysis of the risk-benefit balance generated by the use of the MD for a given indication. The evaluation of socio-ethical impacts and cost-benefit analysis fall to Health Technology Assessment (HTA) agencies. Our research shows that not only are MDs infrequently evaluated by HTA agencies, but the examination of socio-ethical issues is still too often incomplete; in fact, the recommendations of HTA reports are based mainly on cost-benefit analysis. Yet the high-risk MD sector is particularly problematic. Many such devices not only carry risks for patients, but their expanded use also has major impacts on healthcare systems. We believe that Principlism, which is central to biomedical ethics in both research ethics and clinical ethics, constitutes a tool to facilitate the recognition and examination, particularly by HTA agencies, of the socio-ethical issues at stake with high-risk MDs. Likewise, the Precautionary Principle could serve as a tool, particularly within national regulatory agencies, to better identify, recognize, analyze and manage the risks associated with the evaluation and use of this type of MD. Principlism and the Precautionary Principle could serve as reference points 1) to define the measures needed to eliminate the shortcomings observed in regulatory practices, and 2) to better identify and document the socio-ethical issues specific to high-risk MDs.
Abstract:
The aim of the investigation is to develop new high-performance adhesive systems based on neoprene-phenolic blends. Initially, the effect of adding the various candidate ingredients, such as fillers, adhesion promoters and curing agents, to the neoprene solution is investigated, along with their optimum compositions. The phenolic resin used is a phenol-cardanol-formaldehyde copolymer prepared in the laboratory. The optimum ratio between phenol and cardanol giving the maximum bond strength in metal-metal, rubber-rubber and rubber-metal specimens has been identified, and the ratio between total phenols and formaldehyde has also been optimized. The adhesive system is further modified by the addition of epoxidized phenolic novolacs; for this purpose, phenolic novolac resins were prepared at different stoichiometric ratios and subsequently epoxidized. The effectiveness of the adhesive for bonding different metal and rubber substrates forms another part of the study. To study ageing behaviour, different bonded specimens were exposed to high temperature, hot water and salt water, and their adhesive properties evaluated. The synthesized resins have been characterized by FTIR and ¹H NMR spectroscopy, and their molecular weights obtained by GPC. Thermogravimetric analysis and differential scanning calorimetry were used to study the thermal properties, and fractured surfaces were examined by scanning electron microscopy. The study has brought to light, among other things, the influence of the phenol/formaldehyde stoichiometric ratio, the effect of adding cardanol (a renewable resource) and adhesion promoters, the suitability of the adhesive for different substrates, and the age resistance of the adhesive joints.
Abstract:
The rapid growth in high-data-rate communication systems has introduced new, highly spectrally efficient modulation techniques and standards such as LTE-A (Long Term Evolution-Advanced) for 4G (4th generation) systems. These techniques provide broader bandwidth but introduce a high peak-to-average power ratio (PAR) problem at the high-power amplifier (HPA) of the communication system's base transceiver station (BTS). To avoid spectral spreading due to high PAR, a stringent linearity requirement must be met, which forces the HPA to operate at a large power back-off at the expense of power efficiency. Consequently, high-power devices offering both high linearity and high efficiency are fundamental for HPAs. Recent developments in wide-bandgap power devices, in particular the AlGaN/GaN HEMT, offer higher power levels with a superior linearity-efficiency trade-off for microwave communications. For a cost-effective HPA design-to-production cycle, rigorous computer-aided design (CAD) models of the AlGaN/GaN HEMT are essential to reflect the real device response as power level and channel temperature increase. A large-signal electrothermal modeling procedure for large-size AlGaN/GaN HEMTs is therefore proposed. The HEMT structure analysis, characterization, data processing, model extraction and model implementation phases are covered in this thesis, including the trapping and self-heating dispersion that accounts for nonlinear drain current collapse. The small-signal model is extracted using the 22-element modeling procedure developed in our department. The intrinsic large-signal model is investigated in depth in conjunction with linearity prediction. The accuracy of the nonlinear drain current model has been enhanced by addressing several issues, such as trapping and self-heating characterization. The thermal profile of the HEMT structure has also been investigated, and the corresponding thermal resistance extracted through thermal simulation together with chuck-controlled-temperature pulsed I(V) and static DC measurements. A higher-order equivalent thermal model is extracted and implemented in the HEMT large-signal model to accurately estimate the instantaneous channel temperature. Moreover, trapping and self-heating transients have been characterized through transient measurements; the extracted time constants are represented by equivalent sub-circuits and integrated into the nonlinear drain current implementation to capture the dynamics of complex communication signals. Verification of this table-based, large-size, large-signal electrothermal model shows high accuracy in terms of output power, gain, efficiency and nonlinearity prediction for standard large-signal test signals.
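The peak-to-average power ratio that drives the back-off requirement discussed above is a standard quantity: peak instantaneous power divided by mean power, usually in dB. The sketch below (not from the thesis) computes it for a random-phase, OFDM-like multicarrier waveform used here as an assumed stand-in for an LTE-A signal; the subcarrier count and oversampling factor are illustrative choices.

```python
import numpy as np

def par_db(x: np.ndarray) -> float:
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
n_subcarriers = 1200                  # assumed; roughly a 20 MHz LTE subcarrier grid
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n_subcarriers)  # QPSK
waveform = np.fft.ifft(symbols, n=4 * n_subcarriers)  # 4x-oversampled OFDM symbol
print(f"PAR = {par_db(waveform):.1f} dB")             # typically around 10-12 dB
```

A PAR of ~11 dB means the HPA must tolerate peaks more than ten times the average power without clipping, which is exactly why large back-off, and hence efficiency loss, is the default without linear, high-power devices.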
Abstract:
Since the opening of TransMilenio's Portal Suba, physical and spatial changes have been evident in the area where it was implemented, materializing in new high-density residential projects, shopping centres, large-format supermarkets, public spaces, and local and main roads, in an area that before the portal's arrival was characterized mainly as an agro-industrial zone devoted to flower growing. These interventions, however, appear to be disconnected from one another. For example, the shopping centre and the supermarket follow a closed construction pattern, without open interaction with the surrounding public space and without establishing other kinds of urban dynamics. Moreover, it can be argued that, as a consequence of the portal's implementation, the main ecological structure has suffered significant deterioration, observable in the marked reduction of green areas as well as in the neglect of the wetlands located in this district. It is therefore important to emphasize the relationship between transport systems and urban development: when well planned and executed, they act as agents that transform the environment, generate development and social welfare, and catalyze better-designed, friendlier public spaces; otherwise, they can have counterproductive effects on the physical articulation of the city, accessibility, social segregation and the environment.
Abstract:
Human butyrylcholinesterase (BChE; EC 3.1.1.8) is a polymorphic enzyme synthesized in the liver and in adipose tissue, widely distributed throughout the body and responsible for hydrolyzing certain choline esters such as procaine, aliphatic esters such as acetylsalicylic acid, drugs such as methylprednisolone, mivacurium and succinylcholine, and drugs of use and/or abuse such as heroin and cocaine. It is encoded by the BCHE gene (OMIM 177400), for which more than 100 variants have been identified, some not yet fully studied, in addition to the most frequent form, called usual or wild-type. Different polymorphisms of the BCHE gene have been linked to the synthesis of enzymes with varying levels of catalytic activity. The molecular bases of some of these genetic variants have been reported, among them the Atypical (A), fluoride-resistant types 1 and 2 (F-1 and F-2), silent (S), Kalow (K), James (J) and Hammersmith (H) variants. In this study, the validated Lifetime Severity Index for Cocaine Use Disorder (LSI-C) instrument was applied to a group of patients to assess the lifetime severity of cocaine use. In addition, Single Nucleotide Polymorphisms (SNPs) in the BCHE gene known to be responsible for adverse reactions in cocaine-using patients were determined by gene sequencing, and the effect of the SNPs on protein function and structure was predicted using bioinformatics tools. The LSI-C instrument yielded results along four dimensions: lifetime use, recent use, psychological dependence and attempts to quit. Molecular analysis revealed two non-synonymous coding SNPs (cSNPs) in 27.3% of the sample, c.293A>G (p.Asp98Gly) and c.1699G>A (p.Ala567Thr), located in exons 2 and 4, which correspond, functionally, to the Atypical (A) variant [dbSNP: rs1799807] and the Kalow (K) variant [dbSNP: rs1803274] of the BChE enzyme, respectively. In silico prediction classified the SNP p.Asp98Gly as pathogenic, whereas the SNP p.Ala567Thr showed neutral behaviour. Analysis of the results suggests a relationship between polymorphisms or genetic variants responsible for low catalytic activity and/or low plasma concentration of the BChE enzyme and some of the adverse reactions observed in cocaine-using patients.
Abstract:
Political and economic relations between South Korea and Japan were going through their best moment in the early years of the twenty-first century, when the territorial dispute over the Dokdo islands, a group of islets located in the Sea of Japan that for decades have symbolized the end of the Japanese occupation of Korean territory, caused new and significant tensions between the two countries. This phenomenon is suggested to be fundamental to understanding the new bilateral relations between the two actors, and it is the analytical focus of this monograph. The document presents a descriptive analysis of the territorial dispute over the islands and of its effects on relations between the two countries in the political, social and economic spheres.
Abstract:
Solar irradiance measurements from a new high-density urban network in London are presented. Annual averages demonstrate that central London receives 30 ± 10 W m⁻² less solar irradiance than outer London at midday, equivalent to 9 ± 3% less than the London average. Particulate matter and AERONET measurements combined with radiative transfer modeling suggest that the direct aerosol radiative effect could explain 33 to 40% of the inner London deficit, and a further 27 to 50% could be explained by increased cloud optical depth due to the aerosol indirect effect. These results have implications for solar power generation and urban energy balance models. A new technique using 'Langley flux gradients' to infer aerosol column concentrations over clear periods of three hours has been developed and applied to three case studies. Comparisons with particulate matter measurements across London have been performed and demonstrate that the solar irradiance measurement network is able to detect the aerosol distribution across London and the transport of a pollution plume out of London.
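For context, the classical Langley method that the 'Langley flux gradient' technique builds on regresses the log of direct irradiance against airmass over a clear period to recover an optical depth. The sketch below (not from the paper; its network-based flux-gradient variant is more involved and is not reproduced) illustrates the standard regression on synthetic clear-sky data with an assumed true optical depth.

```python
import numpy as np

def langley_tau(airmass: np.ndarray, irradiance: np.ndarray) -> tuple[float, float]:
    """Fit ln(I) = ln(I0) - tau * m over a clear period; return (tau, I0)."""
    slope, intercept = np.polyfit(airmass, np.log(irradiance), 1)
    return -slope, float(np.exp(intercept))

# Synthetic clear-sky morning: airmass falling from 3.0 to 1.2, tau = 0.25 assumed
m = np.linspace(3.0, 1.2, 50)
I0_true, tau_true = 1361.0, 0.25
noise = 1 + 0.01 * np.random.default_rng(1).standard_normal(m.size)
I = I0_true * np.exp(-tau_true * m) * noise

tau, I0 = langley_tau(m, I)
print(f"retrieved tau = {tau:.3f}, I0 = {I0:.0f} W/m^2")
```

The retrieved optical depth tracks the assumed 0.25 closely; in practice the retrieval is only valid over genuinely clear, stable periods, which is why the paper restricts the technique to clear three-hour windows.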
Abstract:
From 2001, the construction of flats and high-density developments increased in England and the building of houses declined. Does this indicate a change in taste or is it a result of government planning policies? In this paper, an analysis is made of the long-term effects of the policy of constraint which has existed for the past 50 years but the increase in density is identified as occurring primarily after new, revised, planning guidance was issued in England in 2000 which discouraged low-density development. To substantiate this, it is pointed out that the change which occurred in England did not occur in Scotland where guidance was not changed to encourage high-density residential development. The conclusion that the change is the result of planning policies and not of a change in taste is confirmed by surveys of the occupants of new high-rise developments in Leeds. The new flat-dwellers were predominantly young and childless and expressed the intention, in the near future, when they could, of moving out of the city centre and into houses. From recent changes in guidance by the new coalition government, it is expected that the construction of flats in England will fall back to earlier levels over the next few years.
Abstract:
This paper investigates the Mesolithic-Neolithic transition in the Channel Islands. It presents a new synthesis of all known evidence from the islands c. 5000-4300 BC, including several new excavations as well as find spot sites that have not previously been collated. It also summarises, in English, a large body of contemporary material from north-west France. The paper presents a new high-resolution sea level model for the region, shedding light on the formation of the Channel Islands from 9000-4000 BC. Through comparison with contemporary sites in mainland France, an argument is made that incoming migrants from the mainland and the small indigenous population of the islands were both involved in the transition. It is also argued that, because the Channel Islands witnessed a very different trajectory of change from that seen in Britain and Ireland c. 5000-3500 BC, this small group of islands has a great deal to tell us about the arrival of the Neolithic more widely.
Abstract:
Using a sample of daily flows into equity, multi-market and fixed-income investment funds in Brazil, and a methodology based on the direction of the net inflows of a large number of investment funds, aggregated into investor groups according to the average size of their investment (wealthy and poor), strong evidence was found of herding that is heterogeneous across investor groups: the intensity of the herding effect varies with investor size, fund type and time period. A heuristic bias was also tested: price anchoring, which assumes that after a new historical high or low in stock prices there will be abnormal investor flows, driven by the belief that this event is an indicator of future prices. Evidence was found that this phenomenon occurs in different types of investment funds, not only equity funds, and that it has a greater impact when there is a new low than when there is a record high in the Ibovespa index. However, the explanatory power of this bias over the herding effect is small, and there is a series of as yet unexplored variables with greater explanatory power over herding. This study thus found evidence that the behavioural-finance assumptions that investors' information and expectations are not homogeneous, and that investors are influenced by the decisions of other investors, are correct, but only weak evidence that the price-anchoring heuristic bias plays a relevant role in investor behaviour.
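A methodology "based on the direction of net inflows of a large number of funds" is in the spirit of the classic LSV herding measure (Lakonishok, Shleifer and Vishny, 1992); whether the thesis uses exactly this statistic is an assumption here, made only to show the mechanics. A minimal sketch:

```python
import numpy as np
from scipy.stats import binom

def lsv_herding(buyers: np.ndarray, sellers: np.ndarray) -> np.ndarray:
    """LSV measure per asset/period: |p_i - p_bar| minus its expected value
    under independent binomial trading at the overall buy probability p_bar."""
    n = buyers + sellers
    p = buyers / n
    p_bar = buyers.sum() / n.sum()
    # adjustment factor: E|p_i - p_bar| when buys ~ Binomial(n_i, p_bar)
    af = np.array([
        sum(binom.pmf(k, ni, p_bar) * abs(k / ni - p_bar) for k in range(ni + 1))
        for ni in n
    ])
    return np.abs(p - p_bar) - af

# Toy example: counts of funds with net inflows (buyers) vs net outflows (sellers)
buyers = np.array([70, 55, 90, 40])
sellers = np.array([30, 45, 10, 60])
print(lsv_herding(buyers, sellers))  # positive values indicate herding
```

The adjustment factor subtracts the imbalance that pure chance would produce with a finite number of funds, so only systematic same-direction flows register as herding; splitting the fund sample by investor size, as the thesis does, then yields group-specific herding intensities.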
Abstract:
In this study I discuss the challenges of the new upper secondary education (ensino médio) from the perspective of science teachers, especially those concerning the proposal of teaching based on interdisciplinarity and contextualization. To that end, the study was designed to answer how teachers are dealing, in the context of their practice, with the new principles of upper secondary education, and what challenges need to be faced to implement interdisciplinarity and contextualization in teaching, as perceived by science teachers. To address the guiding questions and achieve the proposed objectives, I drew on documentary research, reading the legislation that underpinned the reform of upper secondary education, and on bibliographic research to address the central concepts of this study. In addition, I interviewed teachers of Chemistry, Physics and Biology using structured interviews. The results of the investigation showed that most interviewees were relatively unaware of the reform principles set out in the official documents. Nevertheless, the teachers were not indifferent to the discussions related to interdisciplinarity and contextualization, which does not mean that they display complex ways of dealing with these concepts or that they implement consciously designed actions aimed at interdisciplinary and contextualized teaching. An instrumental view of interdisciplinarity and contextualization predominates in the teachers' conceptions, in line with the conception presented in the official texts. As for the challenges the teachers perceive for implementing interdisciplinarity and contextualization in teaching, these were confined to their contextual dimension, that is, the structural problems of public schools and the limits of space and time. The conceptual dimension of the terms in question, as well as the limits arising from teacher education, were only peripherally problematized.