944 results for: end-to-end testing, javascript, application web, single-page application
Abstract:
In today's increasingly competitive and demanding business environment, a company's ability to reach and improve the levels of satisfaction demanded by its customers is a key success factor. To identify which improvements to implement, companies must be able to monitor and control all of their activities and processes. Monitoring activities delegated to external companies, such as the transport of goods, becomes difficult when the service providers lack supporting tools that make the necessary information available. The need to overcome this difficulty in collecting information during the distribution of an order at Caetano Parts, a reseller of automotive spare parts, led to the development of a tool that tracks an order through all of its phases, allowing the operations manager to follow the state of the order from the moment it is placed, through its processing inside the facilities, until its delivery to the customer. The system consists of two components, the front-end and the back-end. The front-end comprises a web application and an Android application for mobile devices. The web application provides database management, order status tracking and analysis of operations. The Android application is made available to the companies responsible for transporting the orders and enables online updating of the information about the delivery process. The back-end comprises the information storage and processing unit; it is hosted on an Internet-connected server and exposes an interface to the mobile client in the form of a web service. The design, development and description of the tool's functionality are covered throughout this work. The tests performed during development validated the correct operation of the tool, which is now ready for a pilot test.
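The web-service interface between the Android courier app and the back-end suggests a simple request/response contract. The sketch below is only an illustration of such a contract; the endpoint URL, field names and status values are assumptions, not details taken from the thesis.

// Hypothetical status-update call from the courier's mobile client.
// Endpoint, payload shape and status values are illustrative assumptions.
interface DeliveryUpdate {
  orderId: string;                                  // order being tracked
  status: "PLACED" | "PROCESSING" | "IN_TRANSIT" | "DELIVERED";
  timestamp: string;                                // ISO 8601 event time
  location?: string;                                // optional courier location
}

async function reportDeliveryStatus(update: DeliveryUpdate): Promise<void> {
  const res = await fetch("https://example.com/api/orders/status", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(update),
  });
  if (!res.ok) throw new Error(`Status update failed: HTTP ${res.status}`);
}

An update posted at each stage (placement, in-house processing, delivery) is what lets the operations manager follow the order end to end.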
Abstract:
Dissertation submitted to obtain the degree of Master in Biomedical Engineering
Abstract:
Nowadays the sale of products through the possibilities offered by the Internet is growing rapidly. This project aims to put into operation a website dedicated to selling fruit, specifically kiwis. For some time now, people have become aware of the imbalance between the producer and the commercial intermediary. As in other sectors, the producer sells at a price far below the one later charged to the final buyer; in the case of fruit, the customer ends up buying a more expensive and usually lower-quality product. The main objective of this project is to promote online sales of quality merchandise at a lower price, achieving a greater benefit for both the producer and the customer.
Abstract:
JMForm is a web application with one area for creating the forms that users can fill in, and another where the results can be viewed in tabular and graphical form. In this way it facilitates group work, both for the people who administer the forms and for the users. The application is built as a component that can be used within a Joomla! 1.5 environment.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application in which peers contribute resources in exchange for their ability to use the application. For example, in a P2P file-sharing application, while the user is downloading a file, the application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or part of it; this view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the constraints of the underlying system, the proposed broadcast solutions route messages through tree overlays so as to maximize the broadcast reliability. Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled as quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at each process into account by limiting the view it has to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that converges toward the global tree overlay and adapts to constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
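To make the tree-overlay idea concrete: if link failures are assumed independent, the reliability of a path is the product of its link reliabilities, so maximizing broadcast reliability can be approached as a maximum-reliability spanning tree problem. The sketch below is a generic illustration under that assumption, not the protocols proposed in the thesis (which additionally handle message quotas and partial views).

// Greedy (Prim-style) construction of a broadcast tree that favours reliable
// links; maximizing the product of link probabilities equals maximizing the
// sum of their logarithms. Generic sketch, not the thesis' protocol.
type Link = { a: number; b: number; p: number };    // p = delivery probability

function reliableBroadcastTree(nodes: number, links: Link[], root = 0): Link[] {
  const inTree = new Set<number>([root]);
  const chosen: Link[] = [];
  while (inTree.size < nodes) {
    let best: Link | undefined;
    for (const l of links) {
      const crosses = inTree.has(l.a) !== inTree.has(l.b);
      if (crosses && (!best || Math.log(l.p) > Math.log(best.p))) best = l;
    }
    if (!best) break;                               // graph is disconnected
    chosen.push(best);
    inTree.add(inTree.has(best.a) ? best.b : best.a);
  }
  return chosen;
}

// Four peers with link reliabilities taken from an approximated system view:
console.log(reliableBroadcastTree(4, [
  { a: 0, b: 1, p: 0.99 }, { a: 0, b: 2, p: 0.60 },
  { a: 1, b: 2, p: 0.95 }, { a: 2, b: 3, p: 0.90 },
])); // picks 0-1, 1-2, 2-3 and avoids the weak 0-2 link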
Abstract:
Background: The wish to die has mainly been studied in terminally ill young adults. In elderly persons, the factors associated with the wish to die are likely to differ from those observed in younger people. Since the most frequently used scale, "The Schedule of Attitudes Toward Hastened Death" (SAHD, Rosenfeld et al., 2000), was previously used in terminally ill cancer or AIDS patients, its use in elderly people suffering from multiple comorbidities is problematic. The objectives of this study were 1) to adapt the SAHD for use in elderly people, 2) to develop a new instrument to assess patients' attitudes towards death, and 3) to test the relevance and acceptability of these instruments. Methods: A version of the SAHD adapted to the elderly population (SAHD-OLD) was obtained by analyzing all items of the instrument in an interdisciplinary group of experts in geriatric care; items were modified according to their relevance in the elderly population. An instrument to assess patients' attitudes towards death was built on previous qualitative work performed by Schroepfer. These two instruments were subjected to cognitive testing in a convenience sample of 11 community-dwelling people (median age = 82 years; range 76-91). Results: The SAHD-OLD was obtained by modifying the items addressing palliative care issues (e.g., irreversible consequences of stopping treatment) and systematically replacing "illness/disease" with "health problems". We expressed as statements the six categories identified by Schroepfer and created instructions asking respondents to describe their current attitude towards death (Adapted Schroepfer). During cognitive testing, our sample assessed the SAHD-OLD and the Adapted Schroepfer as relevant for elderly people. Respondents judged the two instruments acceptable and appreciated the direct manner in which they addressed end-of-life issues; the opportunity to speak openly on this topic was welcomed. Conclusions: The SAHD-OLD and the Adapted Schroepfer seem promising instruments to assess the wish to die in elderly people suffering from multiple comorbidities. Preliminary results show good comprehension, high relevance and acceptability. The psychometric properties of the SAHD-OLD are currently being tested in a large sample of patients.
Abstract:
Cell therapy has emerged as a strategy for the treatment of various human diseases. Cells can be transplanted, in view of their morphological and functional properties, to repair tissue damage, as exemplified by blood transfusion and by bone marrow or pancreatic islet cell transplantation. With the advent of gene therapy, cells have also been used as biological supports for the production of therapeutic molecules that can act either locally or at a distance. This strategy is the basis of ex vivo gene therapy, characterized by the removal of cells from an organism, their genetic modification, and their implantation into the same or another individual in a physiologically suitable location. The damaged tissue or biological function dictates the type of cells chosen for implantation and the function required of the implanted cells. The general aim of this work was to develop an ex vivo gene therapy approach for the secretion of erythropoietin (Epo) in patients suffering from Epo-responsive anemia, thus extending to humans studies previously performed with mouse cells transplanted into mice and rats. Considering the potential clinical application, allogeneic primary human cells were chosen for practical and safety reasons. In contrast to autologous cells, the use of allogeneic cells makes it possible to characterize a cell lineage that can then be transplanted into many individuals; furthermore, allogeneic cells avoid the potential risk of zoonosis encountered with xenogeneic cells. Accordingly, the immune reaction against this allogeneic source was prevented by cell macro-encapsulation, which prevents cell-to-cell contact with the host immune system and allows easy retrieval of the implanted device. The first step consisted in testing the survival of various encapsulated human primary cells implanted for one month in the subcutaneous tissue of immunocompetent and naturally or therapeutically immunodepressed mice, on the assumption that xenogeneic applications constitute a stringent and representative screen before human transplantation. A fibroblast lineage from the foreskin of a young donor, DARC 3.1 cells, showed the highest mean survival score. We then performed studies to optimize the manufacturing procedures of the encapsulation device for successful engraftment. The development of calcifications on the polyvinyl alcohol (PVA) matrix serving as a scaffold for the cells enclosed in the hollow-fiber devices was observed after one month in vivo. Various parameters, including matrix rinsing solutions, batches of PVA and cell lineages, were assessed for their respective roles in the development of this phenomenon. We observed that the calcifications could be totally prevented by using ultra-pure sterile water instead of phosphate-buffered saline to rinse the PVA matrix. Moreover, higher lactate dehydrogenase activity of the cells was found to decrease calcium deposition, because the resulting more acidic microenvironment inhibits calcium precipitation. After selection of the appropriate cell lineage and optimization of the encapsulation conditions, a retroviral approach was applied to DARC 3.1 fibroblasts for the transduction of the human Epo cDNA. Various modifications of the retroviral vector and of the infection conditions were made in order to obtain clinically relevant levels of human Epo.
The insertion of a post-transcriptional regulatory element from the woodchuck hepatitis virus, as well as of a Kozak consensus sequence, led to a 7.5-fold increase in transgene expression. Human Epo production was further optimized by increasing the multiplicity of infection and by selecting high-producer cells, reaching 200 IU hEpo/10^6 cells/day. These modified cells were encapsulated and implanted in vivo under the same conditions as described above. All the mouse strains showed a sustained increase in hematocrit, and a high proportion of viable cells was observed after retrieval of the capsules. Finally, in the perspective of human application, a syngeneic model using encapsulated murine myoblasts transplanted into mice was set up to investigate the roles of both the host immune response and the cells' metabolic requirements. Various loading densities as well as anti-inflammatory and immunosuppressive drugs were studied. The results showed that an immune process is responsible for cell death in capsules loaded at high cell density; a supporting PVA matrix was shown to limit the cell density and to avoid early metabolic cell death, thereby preventing the immune reaction. This study led to the development of encapsulated cells of human origin producing clinically relevant amounts of human Epo. This work also resulted in the optimization of the technical parameters of cell encapsulation, allowing a clinical application in end-stage renal failure patients to begin.
Abstract:
WebGraphEd is open-source software for graph visualization and manipulation, designed specifically for the web platform and used through a web browser. The application is written in JavaScript and subsequently minified, which makes it very lightweight. No additional software is needed; the only requirement is an HTML5-compliant browser. WebGraphEd works with scalable vector graphics (SVG), which makes it possible to create lossless graph drawings.
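For illustration, the following sketch shows the kind of DOM calls a browser-based SVG graph editor relies on; it is a generic example, not WebGraphEd's actual code, and the styling values are arbitrary.

// Drawing a labelled node and an edge with the standard SVG DOM API.
// Because SVG is vector-based, the resulting drawing scales losslessly.
const SVG_NS = "http://www.w3.org/2000/svg";

function drawNode(svg: SVGSVGElement, x: number, y: number, label: string): void {
  const circle = document.createElementNS(SVG_NS, "circle");
  circle.setAttribute("cx", String(x));
  circle.setAttribute("cy", String(y));
  circle.setAttribute("r", "16");
  circle.setAttribute("fill", "#cde");
  svg.appendChild(circle);

  const text = document.createElementNS(SVG_NS, "text");
  text.setAttribute("x", String(x));
  text.setAttribute("y", String(y + 4));
  text.setAttribute("text-anchor", "middle");
  text.textContent = label;
  svg.appendChild(text);
}

function drawEdge(svg: SVGSVGElement, x1: number, y1: number, x2: number, y2: number): void {
  const line = document.createElementNS(SVG_NS, "line");
  line.setAttribute("x1", String(x1));
  line.setAttribute("y1", String(y1));
  line.setAttribute("x2", String(x2));
  line.setAttribute("y2", String(y2));
  line.setAttribute("stroke", "#333");
  svg.appendChild(line);
}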
Abstract:
The goal of this work was to design a global application architecture to guide application development in an industrial maintenance company. An application architecture describes the functionality of software from the end users' point of view, and producing one is part of strategic information systems planning. The purpose of the architecture is to ensure that information systems are designed as a whole to support the organization's operations. The work was guided by the principles and models of strategic information systems planning; the techniques were the same as in project-level information systems design. Work on the application architecture began by reviewing the company's current situation and its business and IT strategies, focusing mainly on the business processes and the applications in use; this review was carried out chiefly through interviews and by studying documents. Requirements for future applications were then derived from the interviews and from the material of the previous phase, and the requirements most important to the business were selected to be met by the architecture. Producing the actual architecture consisted mainly of selecting applications and defining the relationships between them. Development projects were then defined on the basis of the architecture.
Abstract:
Many software companies have begun to pay increasing attention to the quality of their software products, which has led most of them to adopt software testing as the means of improving that quality. Testing should not be limited to the software product itself but should cover the whole software development process. Validation testing focuses on making sure that the end product meets its requirements, whereas verification testing is preventive testing that aims to remove defects before they ever reach the source code. The work on which this thesis is based was carried out in early spring and summer 2003, commissioned by Necsom Oy, a small Finnish software company whose research and development unit operates in Lappeenranta. This thesis first introduces software testing and different ways of organizing it, and then gives general guidelines for writing the test plans and test cases that successful and efficient testing requires. After this theory, the implementation of internal software testing at Necsom is presented as an example. Finally, the conclusions reached after following the testing process in practice are presented, together with suggestions for further action.
Abstract:
Taking maximum advantage of technological innovations, and of the investment in them, is of key importance for businesses. The IT industry offers a wide range of innovative high-technology solutions for managing information processing and distribution, but making informed decisions in this area is challenging for end-user businesses. The aim of this research is to identify the key differences between the principal solutions and to establish what the selection criteria should be for those involved. Existing methodologies for software development are classified, and key criteria are described to help IT system developers and users determine the most important factors in system selection, development and deployment. Statistical data are researched and analysed, a theoretical basis is developed and reviewed, and key issues from case studies are identified, generalized and presented along with the conclusions of the study. The results give a good basis for corporate consideration and provide overall support for the key decisions in developing web-based software. The conclusion is that stakeholders should consider new web developments as an evolution of existing business systems, while paying particular attention to the new advantages that web-based software offers in terms of standardised interfaces and procedures, universal deployment opportunities, and the range of other benefits the study highlights.
Abstract:
Software is a key component of many of the devices and products we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work; testing, debugging and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is therefore reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate that a piece of software is functioning correctly; many other aspects of the software, such as performance, security, scalability and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing and present approaches to automatically generate tests from behavioral models to address some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or none at all. The second contribution of this thesis is therefore proper tool support for the proposed approach, integrated with leading industry tools: we offer standalone tools, tools integrated with other industry-leading tools, and complete tool chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context; to demonstrate the applicability of our approach, we apply our research to several systems, including industrial ones.
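The core idea of generating tests from a behavioral model can be shown on a toy example. The thesis works from UML models with industrial tool support; the sketch below only illustrates the underlying principle on a plain transition system, and the model, state names and inputs are invented for the example.

// Minimal model-based-testing sketch: derive one test per transition
// (transition coverage) from a behavioral model given as a transition list.
type Transition = { from: string; input: string; to: string };

// Breadth-first search yields, for every reachable state, a shortest input
// sequence that drives the system there from the initial state.
function transitionCoverageTests(initial: string, model: Transition[]): string[][] {
  const pathTo = new Map<string, string[]>([[initial, []]]);
  const queue = [initial];
  while (queue.length > 0) {
    const state = queue.shift()!;
    for (const t of model.filter((tr) => tr.from === state)) {
      if (!pathTo.has(t.to)) {
        pathTo.set(t.to, [...pathTo.get(state)!, t.input]);
        queue.push(t.to);
      }
    }
  }
  // One test per transition: reach its source state, then fire its input.
  return model
    .filter((t) => pathTo.has(t.from))
    .map((t) => [...pathTo.get(t.from)!, t.input]);
}

// Example: a tiny login model.
console.log(transitionCoverageTests("LoggedOut", [
  { from: "LoggedOut", input: "login(ok)", to: "LoggedIn" },
  { from: "LoggedOut", input: "login(bad)", to: "LoggedOut" },
  { from: "LoggedIn", input: "logout", to: "LoggedOut" },
]));
// -> [["login(ok)"], ["login(bad)"], ["login(ok)", "logout"]]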
Abstract:
The main objective of this work was to study the possibilities of implementing laser cutting in a paper making machine. Laser cutting was considered as a replacement for the conventional methods used for longitudinal cutting in paper making machines, such as edge trimming at different stages of the paper making process and tambour roll slitting. Laser cutting of paper was first tested in the 1970s, and since then laser cutting and processing have been applied to paper materials in industry with varying degrees of success. Laser cutting can be employed for longitudinal cutting of the paper web in the machine direction; the most common conventional methods are water jet cutting and rotating slitter blades. Cutting with a CO2 laser fulfils the basic requirements for cutting quality, applicability to the material and cutting speed in all locations where longitudinal cutting is needed. The literature review describes the advantages, disadvantages and challenges of laser technology applied to cutting paper material, with particular attention to cutting a moving paper web. Based on the laser cutting capabilities studied and the problems identified in the conventional cutting technologies, a preliminary selection of the most promising application area was carried out. Laser trimming of the paper web edges in the wet end was estimated to be the most promising area of implementation. This assumption was based on the rate of occurrence of web breaks: up to 64 % of all web breaks were found to occur in the wet end, particularly at the so-called open draws where the paper web is transferred unsupported by wire or felt. The distribution of web breaks in the machine cross direction revealed that defects at the paper web edge were the main cause of tear initiation and the consequent web breaks. It was assumed that laser cutting could improve the tensile strength of the cut edge thanks to its high cutting quality and the sealing effect on the edge after cutting; studies of laser ablation of cellulose support this claim. The linear energy needed for cutting was calculated from the paper web properties at the intended laser cutting location and verified with a series of laser cutting trials. The laser energy needed in practice deviated from the calculated values, which can be explained by differences in radiative heat transfer during laser cutting and by the different absorption characteristics of dry and moist paper. Laser-cut samples, both dry and moist (dry matter content about 25-40 %), were tested for strength properties. The tensile strength and strain at break of the laser-cut samples were similar to the corresponding values of conventionally cut samples. The chosen method, however, did not address the tensile strength of the laser-cut edge in particular, so the assumption that laser cutting improves strength properties was not fully proved. The possible contamination of mill broke (when the trimmed edge is recycled) by laser cutting was also examined: laser-cut samples, both dry and moist, were tested for dirt particle content. The tests revealed that dust particles can accumulate on the surface of moist samples, which has to be taken into account to prevent contamination of the pulp suspension when trim waste is recycled. Material loss due to evaporation during laser cutting and the amount of solid residue after cutting were also evaluated.
Edge trimming with a laser would produce about 0.25 kg/h of solid residue and 2.5 kg/h of material lost to evaporation. Schemes for implementing laser cutting and the required laser equipment are discussed. In general, a laser cutting system would require two laser sources (one for each cutting zone), a set of beam transfer and focusing optics, and cutting heads. To increase the reliability of the system, it was suggested that each laser source have double capacity, which would allow cutting to continue with a single laser source working at full capacity for both cutting zones. Laser technology is already at the required level and needs no additional development; moreover, the headroom for speed increases is high thanks to the availability of high-power laser sources, which supports the trend of increasing paper machine speeds. The laser cutting system would require a special roll to support cutting; a scheme for such a roll is proposed, along with its integration into the paper making machine. Laser cutting can be performed at the central roll in the press section, before the so-called open draw where many web breaks occur, where it has the potential to improve the runnability of the paper making machine. The economic performance of laser cutting was assessed by comparing a laser cutting system with water jet cutting operating under the same conditions. Laser cutting turned out to be roughly twice as expensive as water jet cutting, mainly because of the high investment cost of the laser equipment and the poor energy efficiency of CO2 lasers; another factor is that laser cutting causes material loss through evaporation, whereas water jet cutting causes almost none. Despite the difficulties of implementing laser cutting in a paper making machine, its implementation can be beneficial: crucially, it offers the possibility to improve the strength properties of the cut edge and consequently reduce the number of web breaks. The ability of laser cutting to sustain cutting speeds exceeding the current speeds of paper making machines is another argument for considering laser cutting technology in the design of new high-speed paper making machines.
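For orientation, the linear (line) energy mentioned above relates laser power to cutting speed. The numbers below are illustrative values chosen for the example, not figures from the study:

\[
E' = \frac{P}{v}, \qquad \text{e.g.} \quad E' = \frac{400\,\mathrm{W}}{20\,\mathrm{m/s}} = 20\,\mathrm{J/m},
\]

so doubling the web speed at constant power halves the energy delivered per metre of cut, which is why the availability of high-power sources matters as machine speeds increase.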
Abstract:
"Mémoire présenté à la Faculté des études supérieures En vue de l'obtention du grade de maîtrise en droit"
Abstract:
This thesis is entitled "Marine actinomycetes as a source of antimicrobial compounds and as probiotics and single cell protein for application in penaeid prawn culture systems". The ocean harbours more than 80 % of all life on Earth and remains our greatest untapped natural resource. The study revealed the potential of marine actinomycetes as a source of antimicrobial compounds. The selected streptomycetes were found to be capable of inhibiting most of the pathogenic vibrios, which are a major problem both in hatcheries and in grow-out systems. The bioactive principle can be incorporated into commercial feeds and applied as a medicated diet for the control of vibrios in culture systems. Their hydrolytic potential, inhibitory activity against pathogens and non-pathogenicity to penaeid prawns make the selected Streptomyces spp. effective probiotics in aquaculture. Since there is considerably less inhibition of the natural pond ecosystem, microbial diversity, and thereby water quality, is maintained. Actinomycetes were also found to be a good source of single cell protein as an ingredient in aquaculture feed formulations. Large amounts of mycelial waste (actinomycete biomass) are produced by the antibiotic industry, and this nutrient-rich waste can be effectively used as a protein source in aquaculture feeds. This study reveals the importance of marine actinomycetes as a source of antimicrobial compounds and as probiotics and single cell protein for aquaculture applications.