Abstract:
This thesis had two main objectives: first, the production, characterisation, and in vitro degradation and release studies of double-walled microspheres for controlled drug release; second, and most challenging, the production of double-walled nanospheres, also for controlled drug delivery. The spheres were produced using two polymers, poly(L-lactide) (PLLA) and poly(L-lactide-co-glycolide) (PLGA). A model drug, Meloxicam, an anti-inflammatory, was then encapsulated into the particles. Micro- and nanospheres were produced by the solvent extraction/evaporation method, yielding perfectly spherical particles. By varying the PLLA/PLGA mass ratio, different core and shell compositions, as well as different shell and core thicknesses, were obtained. In particles with a PLLA/PLGA mass ratio of 1:1, the shell is composed of PLLA and the core of PLGA. It was also verified that Meloxicam tends to distribute into the PLGA layer. Micro- and nanoparticles were characterised in terms of morphology, size, polymer crystallinity and drug distribution. Particle degradation studies were performed in which the particles, in a PVA solution at pH 7.4, were kept in an incubator for approximately 40 days at 120 rpm and 37 ºC, simulating the human body environment as closely as possible. These studies showed that particles with a PLGA shell and a PLLA core degrade more rapidly, because PLLA is more hydrophobic than PLGA. The controlled drug release studies, also run for 40 to 50 days, showed that microspheres with a PLLA shell release more slowly than those with a PLGA shell. This result was expected, since the drug is solubilised in the PLGA polymer, so in that case the PLLA shell acts as a barrier between the drug and the outer medium. Another positive finding of this study is the lower initial burst effect obtained with double-walled particles, which is one of their advantages. In the second part of this investigation, the main goal was the production of the nanospheres, which had not yet been accomplished by other authors. After several studies of the speed, time and type of agitation, as well as the concentration and volume of the first aqueous poly(vinyl alcohol) (PVA) solution used during the solvent extraction/evaporation process, it was possible to obtain double-walled nanospheres. (...)
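As an illustrative aside (the abstract does not name a release model), profiles like these are often summarised with the semi-empirical Korsmeyer-Peppas power law,

    \[ \frac{M_t}{M_\infty} = k\,t^{n}, \]

where M_t/M_∞ is the fraction of drug released at time t, k is a kinetic constant and n is the release exponent (n ≈ 0.43 for Fickian diffusion from spheres). Under this description, a PLLA shell acting as a diffusion barrier lowers k, which is consistent with the slower release and the reduced initial burst reported above.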
Abstract:
A ready-mixed mortar and several laboratory-formulated mortars were produced and tested in the fresh state and after hardening, simulating a masonry plaster for indoor application. All the mortars used a clayish earth from the same region and different compositions of aggregates, in some cases including fibres and a phase change material (PCM). All the formulated mortars were composed of 1:3 volumetric proportions of earth and aggregate. Tests were carried out for consistency, fresh bulk density, thermal conductivity, capillary absorption and drying, water vapour permeability and sorption-desorption. The use of PCM drastically changed the workability of the mortars and increased their capillary absorption. The use of fibres and variations in the particle size distribution of the sand mixtures had no significant influence on the tested properties. Particularly noteworthy, however, were the good workability of these mortars and their high sorption-desorption capacity. With this capacity, plasters made with these mortars can adsorb water vapour from the indoor atmosphere when relative humidity is high and release it when the indoor atmosphere becomes too dry, enabling them to contribute passively to a healthier indoor environment. The technical, ecological and environmental advantages of applying plasters made with this type of mortar are emphasized, with the aim of contributing to their increased use in new or existing housing.
Abstract:
Due to their exposure to environmental conditions, outer coatings composed of render and paint systems are usually the first construction elements to deteriorate and require intervention. Correct conservation and rehabilitation of these materials is fundamental, since they protect the other façade materials. It is known that old renders were essentially air lime based mortars. To maintain the integrity of the wall-render assembly and the image of the building, and to avoid accelerated degradation, conservation and rehabilitation must be carried out with compatible mortars; lime-based mortars are therefore preferable. The incorporation of ceramic residues was also common in ancient renders, and such residues are nowadays an abundant material, especially in the Central Region of Portugal. The reuse of these materials is highly relevant, since their landfilling causes serious environmental problems. In an attempt to combine the environmental and technical advantages of using ceramic waste in the production of mortars for rehabilitation purposes, research has been carried out at the University of Coimbra, in cooperation with Nova University of Lisbon, on the long-term behaviour of air lime mortars with ceramic residues. In this paper, the most significant results, up to one year, of an experimental campaign on air lime mortars with 1:3 and 1:2 volumetric proportions and ceramic residues are presented.
Abstract:
Conventionally, the problem of finding the best path in a network is posed as the shortest path problem. However, for the vast majority of today's networks this solution has limitations which directly affect their proper functioning and lead to inefficient use of their capabilities. Problems at the level of large networks, where highly complex graphs are common, as well as the emergence of new services and their respective requirements, are intrinsically related to the inadequacy of this solution. To address the needs of these networks, a new approach to the best-path problem must be explored. One solution that has attracted interest in the scientific community is the use of multiple paths between two network nodes, any of which may be considered a best path between those nodes. Routing thus no longer minimizes a single metric, with only one path chosen between nodes; instead, one of several paths is selected, exploiting a greater diversity of the available paths (when the network permits it). Establishing multi-path routing in a given network has several operational advantages: it may improve the distribution of network traffic, improve recovery time after failures, and offer the administrator greater control over the network. These factors are even more relevant when networks are large and highly complex, such as the Internet, where multiple networks managed by different entities are interconnected. A large part of the growing need for multi-path protocols is associated with policy-based routing, whereby paths with different characteristics can be considered with an equal level of preference and thus be part of the solution to the best-path problem. Performing multi-path routing with protocols based only on the destination address has some limitations, but it is possible. Concepts from graph theory and algebraic structures can be used to describe how routes are computed and ranked, enabling the routing problem to be modelled. This thesis studies and analyzes multi-path routing protocols from the literature and derives a new algebraic condition that allows the correct operation of these protocols without any restriction on the network. It also develops a set of software tools for planning and verifying/validating new protocol models in accordance with the study performed.
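As a minimal sketch of the algebraic view mentioned above (the graph, the link bandwidths and the choice of a widest-path algebra are illustrative assumptions, not the protocols analysed in the thesis), route computation can be modelled as a fixpoint over an algebra (S, ⊕, ⊗), here with ⊕ = max to prefer wider paths and ⊗ = min because a path is only as wide as its narrowest link:

    import math

    # Directed graph: adjacency[u] = {v: link bandwidth} (made-up values)
    adjacency = {
        "A": {"B": 100, "C": 10},
        "B": {"C": 100, "D": 20},
        "C": {"D": 100},
        "D": {},
    }

    def widest_paths(source):
        """Bellman-Ford-style fixpoint over the (max, min) widest-path algebra."""
        best = {n: 0 for n in adjacency}     # 0 = "no path" (the algebra's zero)
        best[source] = math.inf              # inf = "empty path" (the algebra's one)
        for _ in range(len(adjacency) - 1):  # enough rounds to converge
            for u, links in adjacency.items():
                for v, bw in links.items():
                    # extend the best path to u across link (u, v), then choose
                    best[v] = max(best[v], min(best[u], bw))
        return best

    print(widest_paths("A"))  # {'A': inf, 'B': 100, 'C': 100, 'D': 100}

Changing the pair of operators (e.g. ⊕ = min, ⊗ = + recovers shortest paths) changes the routing policy without changing the computation, which is what makes algebraic conditions on (⊕, ⊗) a natural way to state when a multi-path protocol converges.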
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is in the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may heal cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing doses during the treatment of the other breast, placing it at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformation of the target volumes in breast radiotherapy? What happens to the target volumes and the surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions largely rely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left breast cancer using different RT techniques, f-IMRT (forwardly-planned intensity-modulated), inversely-planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc (DCART) RT, and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate MC model of the linear accelerator used (a Trilogy from VARIAN Medical Systems) was built with the EGSnrc MC code to accurately determine the doses that reach the CLB. For this purpose it was necessary to model the new High-Definition multileaf collimator, which had never been simulated before. The model developed was then included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, the OAR and also in the pattern of dose spreading into normal tissues. IMRT5 and DCART spread low doses over greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distributions in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted. The MC algorithms presented similar results (differences within 2%). The PBC algorithm was considered inaccurate in determining the dose in heterogeneous media and in build-up regions. Therefore, a major effort is being made at the clinic to acquire data to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when using non-tangential techniques. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
Abstract:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized; Google Docs and ShareLaTeX are examples of such systems. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since visual code enables a natural flow of knowledge between developers about the code being developed, and communication and coordination are simplified. This proposal explores collaboration over a very structured and rigid model, where collaboration currently follows the copy-modify-merge paradigm: a developer gets a private copy of the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications produced concurrently by other developers. To this end, we designed and implemented an extension to the OutSystems Platform that enables real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
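As a minimal illustration of the copy-modify-merge step described above (plain dictionaries stand in for model elements; this is not the OutSystems Platform's actual model or API):

    def three_way_merge(base, mine, theirs):
        """Merge two divergent copies against their common ancestor `base`.
        Returns (merged, conflicts)."""
        merged, conflicts = {}, []
        for key in sorted(base.keys() | mine.keys() | theirs.keys()):
            b, m, t = base.get(key), mine.get(key), theirs.get(key)
            if m == t:            # both sides agree (or neither touched it)
                value = m
            elif m == b:          # only "theirs" changed this element
                value = t
            elif t == b:          # only "mine" changed this element
                value = m
            else:                 # concurrent, different edits: conflict
                conflicts.append(key)
                value = m         # arbitrary policy: keep the local change
            if value is not None:
                merged[key] = value
        return merged, conflicts

    base   = {"Button1.Label": "OK", "Screen1.Title": "Home"}
    mine   = {"Button1.Label": "Submit", "Screen1.Title": "Home"}
    theirs = {"Button1.Label": "OK", "Screen1.Title": "Dashboard"}
    print(three_way_merge(base, mine, theirs))
    # ({'Button1.Label': 'Submit', 'Screen1.Title': 'Dashboard'}, [])

Real-time collaboration replaces this deferred, conflict-prone merge with the continuous propagation of fine-grained changes, which is precisely the shift the thesis proposes.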
Abstract:
In the last few years we have observed an exponential increase in information systems, and parking information is one more example. Reliable and up-to-date information on parking slot availability is very important to the goal of traffic reduction, and parking slot prediction is a new topic that has already started to be applied; San Francisco in the United States and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking slot prediction and their integration in a web application, where all kinds of users can see the current parking status as well as future status according to the predictions of the parking models. The source of the data is ancillary in this work, but it still needs to be understood in order to understand parking behaviour. There are many modelling techniques used for this purpose, such as time series analysis, decision trees, neural networks and clustering. In this work, the author describes the techniques best suited to the task, analyses the results and points out the advantages and disadvantages of each one. The model learns the periodic and seasonal patterns of the parking status behaviour, and with this knowledge it can predict future status values for a given date. The data comes from the Smart Park Ontinyent project and consists of parking occupancy status together with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed for the model implementations. The first test used a boosting ensemble classifier, employed over a set of decision trees created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work provides error measurements that indicate the reliability of the predictions. The second test used function fitting with the seasonal exponential smoothing TBATS model. The last test tried a model that is a combination of the previous two, to assess the result of this combination; a sketch of such a combination is given below. The results were quite good for all of them, with average errors of 6.2, 6.6 and 5.4 vacancies in the predictions of the three models, respectively; for a car park of 47 places, this means roughly a 10% average error in parking slot predictions. The results could be even better with a longer data record. In order to make this kind of information visible and reachable by anyone with an internet-connected device, a web application was built. Besides displaying the data, this application also offers several functions to improve the task of searching for parking. The new functions, apart from parking prediction, were:
- Park distances from the user's location: provides the distances from the user's current location to the different car parks in the city.
- Geocoding: the service for matching a literal description or an address to a concrete location.
- Geolocation: the service for positioning the user.
- Parking list panel: neither a service nor a function, just a better visualization and handling of the information.
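As a minimal sketch of the kind of model combination described above (the hourly data, the daily period of 24 and the simple seasonal forecasters are made up; the thesis used C5.0 boosting and a TBATS model):

    import math

    PERIOD = 24  # assume hourly samples with a daily seasonal cycle

    def seasonal_naive(history):
        """Predict the value observed exactly one season ago."""
        return history[-PERIOD]

    def seasonal_smoothing(history, alpha=0.3):
        """Exponentially weighted average of past values one season apart."""
        values = history[-PERIOD::-PERIOD]   # same hour of day, newest first
        estimate = values[-1]                # start from the oldest observation
        for v in reversed(values[:-1]):      # fold forward in time
            estimate = alpha * v + (1 - alpha) * estimate
        return estimate

    def combined(history):
        """Average the two forecasters, mirroring the combined model above."""
        return 0.5 * seasonal_naive(history) + 0.5 * seasonal_smoothing(history)

    # Made-up history: 7 days of hourly free-slot counts for a 47-place park
    history = [round(23 + 15 * math.sin(2 * math.pi * h / PERIOD))
               for h in range(7 * PERIOD)]
    print(combined(history))  # forecast of free slots for the next hour

Comparing each forecaster's average absolute error on held-out days, as the thesis does, is what justifies preferring the combined model.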
Abstract:
With the recent advances in technology and the miniaturization of devices such as GPS and IMU units, Unmanned Aerial Vehicles (UAVs) have become a feasible platform for Remote Sensing applications. Compared to conventional aerial platforms, UAVs provide a set of advantages such as higher spatial resolution of the derived products. UAV-based imagery obtained with consumer-grade cameras introduces a set of problems which have to be solved, e.g. rotational or angular differences and unknown or insufficiently precise interior (IO) and exterior (EO) camera orientation parameters. In this work, UAV-based imagery of RGB and CIR type was processed using two different workflows, based on the PhotoScan and VisualSfM software packages, resulting in DSM and orthophoto products. The influence of the feature detection and matching parameters on result quality and processing time was examined, and the optimal parameter setup is presented. The products of the two workflows were compared in terms of quality and spatial accuracy, and the workflows themselves in terms of processing time and result quality. Finally, the obtained products were used to demonstrate vegetation classification, and the contribution of IHS transformations to classification accuracy was examined.
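As one common formulation of the IHS transform mentioned above (the arccos-based variant and the sample pixel are assumptions; the exact variant used in this work is not stated in the abstract):

    import math

    def rgb_to_ihs(r, g, b):
        """Classic RGB -> intensity/hue/saturation, with RGB in [0, 1]."""
        i = (r + g + b) / 3.0
        if i == 0:
            return 0.0, 0.0, 0.0              # black: hue/saturation undefined
        s = 1.0 - min(r, g, b) / i
        num = 0.5 * ((r - g) + (r - b))
        den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
        if den == 0:
            h = 0.0                           # grey: hue undefined
        else:
            theta = math.acos(max(-1.0, min(1.0, num / den)))
            h = theta if b <= g else 2 * math.pi - theta
        return i, h, s

    # Example pixel from a normalised RGB orthophoto (illustrative values)
    print(rgb_to_ihs(0.6, 0.3, 0.2))

Separating intensity from hue and saturation lets the spectral information of one image be combined with the spatial detail of another, which is why IHS transforms are examined as a pre-processing step for classification.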
Abstract:
New emerging contaminants may represent a danger to the environment and humanity, with repercussions not yet known. Antimicrobial products are among the major worldwide pharmaceutical and personal care products, and triclosan is an antimicrobial agent present in most of them. Despite the high removal rate of triclosan in wastewater treatment, triclosan levels in the environment are rising through the disposal of wastewater effluent and the use of sewage sludge in land application. Regulated in EC/1272/2008 (Annex VI, Table 3.1), this compound is considered very toxic to aquatic life, and it has been reported that photochemical transformation of triclosan produces dioxins. The current work defined three objectives: determining the most efficient process for triclosan degradation, using photochemical degradation methods and comparing different light sources; identifying the main by-products formed during degradation; and studying the influence of the Fenton and photo-Fenton reactions. Photochemical degradation methods such as photocatalysis under fluorescent light (UV), under visible light (sunlight) and under LEDs, as well as the photo-Fenton and Fenton reactions, were compared in this work. The degradation of triclosan was monitored by gas chromatography/mass spectrometry (GC/MS). In this study, the photo-Fenton reaction successfully oxidized triclosan to H2O and CO2, without any by-products, within 2 hours. Photocatalysis by titanium dioxide (TiO2) under LEDs was possible, with a degradation rate of 53% in an 8-hour assay. The Fenton reaction, UV light and sunlight showed degradation between 90% and 95%. The results are reported as observed, without statistical support, since this was not possible within the work period. Hydroquinone and the by-product 2,4-dichlorophenol were identified in the first hour of photocatalysis under UV. A common compound, tentatively identified as C7O4H, was present in the degradation by UV, sunlight and LEDs and was concluded to be a contaminant. In the future, more studies on the use of LEDs should be undertaken, given their long lifetime and low energy consumption, and given that fluorescent lamps, due to their negative environmental impact, are progressively being withdrawn by governments, requiring new solutions to be found. The Fenton and photo-Fenton reactions can also be costly processes, given the expensive reagents used.
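As an illustrative aside (not from the thesis), photodegradation time courses like these are commonly analysed with pseudo-first-order kinetics, C(t) = C0·exp(-kt); fitting ln(C/C0) against t yields the rate constant k. The concentration data below are hypothetical:

    import math

    times = [0, 1, 2, 4, 8]                  # hours
    conc  = [1.00, 0.74, 0.55, 0.30, 0.09]   # C/C0, hypothetical GC/MS ratios

    ys = [math.log(c) for c in conc]

    # Least-squares slope of ln(C/C0) vs t (slope = -k)
    n = len(times)
    mx, my = sum(times) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    k = -slope
    print(f"k = {k:.3f} per hour, half-life = {math.log(2) / k:.2f} h")

A larger k (shorter half-life) is what distinguishes the faster processes (photo-Fenton, UV, sunlight) from the slower LED photocatalysis reported above.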
Abstract:
The potential of human adenovirus vectors as vehicles for gene transfer, with clinical applications in vaccination, cancer treatment and many monogenic and acquired diseases, has been demonstrated in several studies and clinical trials. However, the clinical use of these vectors can be limited by pre-existing humoral and cellular anti-capsid immunity. One way to circumvent this bottleneck while keeping the advantages of adenovirus vectors is to use non-human viruses such as Canine Adenovirus type 2 (CAV-2). Moreover, CAV-2 vectors present attractive features for developing potential treatments for neurodegenerative and ocular disorders. As the interest in CAV-2 vectors increases, scalable and robust production processes are required to meet the needs of preclinical and possibly clinical use. (...)
Abstract:
In this work project we discuss the advantages and disadvantages of social media as a marketing tool. Four international cases were analyzed to provide anecdotal evidence of how social and viral marketing have been used by four firms in very different industries. We reviewed empirical evidence on the topic to discuss the main components of viral marketing, and concluded that positive (electronic) word of mouth, short response times and seeding through high-network-value customers are the main drivers of a successful viral marketing campaign. We also conducted a study of the Portuguese telecommunications industry, in particular the mobile segment, and found that the three main players in this market have successfully used social media as a marketing tool in a strategic approach to the 14-25-year-old segment.
Abstract:
This thesis focuses on the measurement and accounting of contributions received by nonprofit organizations (NPOs), as these are nowadays a significant component of revenues. A survey was developed and forwarded to 38 different NPOs, with the goal of understanding their motivations and the advantages and disadvantages they believe would result from measuring and accounting for all kinds of contributions. They identified many advantages in this practice; however, some NPOs are not adopting it due to the difficulty of valuing contributions with no market value, which would require a heavier workload and divert resources and time from other important activities.
Abstract:
Polymeric nanoparticles (PNPs) have attracted considerable interest over the last few years due to the unique properties and behaviors provided by their small size. Such materials can be used in a wide range of applications, such as diagnostics and drug delivery. Advantages of PNPs include controlled release, protection of drug molecules and their specific targeting, with a concomitant increase in the therapeutic index. In this work, novel sucrose and cholic acid based PNPs were prepared from different polymers, namely polyethylene glycol (PEG), poly(D,L-lactic-co-glycolic acid) (PLGA) and a PLGA-co-PEG copolymer. In these PNP carriers, the cholic acid acts as a drug incorporation site and the carbohydrate as the targeting moiety. The uptake of nanoparticles into cells usually involves endocytotic processes, which depend primarily on particle size and surface characteristics. These properties can be tuned by the nanoparticle preparation method; therefore, both the nanoprecipitation and the emulsion-solvent evaporation methods were applied to prepare the PNPs. The influence of various parameters, such as the concentration of the starting solution, the evaporation method and the solvent properties, on nanoparticle size, size distribution and morphology was studied. The PNPs were characterized by atomic force microscopy (AFM), scanning electron microscopy (SEM) and dynamic light scattering (DLS) to assess their size distribution and morphology. The PNPs obtained by nanoprecipitation ranged in size between 90 nm and 130 nm, with a very low polydispersity index (PDI < 0.3). In contrast, the PNPs produced by the emulsion-solvent evaporation method showed particle sizes around 300 nm with a high PDI value. More detailed information was obtained from the AFM and SEM images, which showed that all these PNPs were regularly spherical. ζ-potential measurements were satisfactory and evidenced the importance of the sucrose moiety in the polymeric system, which was responsible for the negative surface charge obtained, providing colloidal stability. The results of this study show that sucrose and cholic acid based polymeric conjugates can be successfully used to prepare PNPs with tunable physicochemical characteristics. In addition, the study provides novel information about the materials used and the methods applied. It is hoped that this work will be useful for the development of novel carbohydrate-based nanoparticles for biomedical applications, specifically for targeted drug delivery.
Abstract:
The uneven spatial distribution of start-ups and their respective survival may reflect comparative advantages arising from the local institutional background. For the first time, we explore this idea using Data Envelopment Analysis (DEA) to assess the relative efficiency of Portuguese municipalities in this specific context. We depart from the related literature, where expenditure is perceived as a desirable input, by choosing a measure of fiscal responsibility and infrastructural variables in the first stage. Comparing results for 2006 and 2010, we find that mean performance decreased substantially 1) with the effects of the Global Financial Crisis, 2) as municipal population increases and 3) as financial independence decreases. A second stage is then performed, employing a double-bootstrap procedure to evaluate how the regional context outside the control of local authorities (e.g. demographic characteristics and political preferences) impacts efficiency.
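As a minimal sketch of the first-stage DEA computation (made-up data, not the paper's dataset): the input-oriented CCR efficiency of a municipality (DMU) is the smallest factor θ by which its inputs can be scaled while a convex combination of all DMUs still matches its outputs:

    import numpy as np
    from scipy.optimize import linprog

    # rows = DMUs (municipalities); illustrative inputs and outputs
    X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])  # inputs
    Y = np.array([[10.0], [8.0], [12.0], [9.0]])                    # outputs

    def ccr_efficiency(o):
        """Solve min theta s.t. lam'X <= theta*x_o, lam'Y >= y_o, lam >= 0."""
        n_dmu, n_in = X.shape
        n_out = Y.shape[1]
        c = np.r_[1.0, np.zeros(n_dmu)]                  # minimise theta
        A_in  = np.hstack([-X[o].reshape(-1, 1), X.T])   # input constraints
        A_out = np.hstack([np.zeros((n_out, 1)), -Y.T])  # output constraints
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(n_in), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n_dmu))
        return res.fun                                   # theta in (0, 1]

    for o in range(X.shape[0]):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")

The second-stage double bootstrap then regresses these efficiency scores on contextual variables while correcting for the bias DEA scores inherit from being estimated relative to the sample.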
Abstract:
Public consultation is a methodology for interaction between the bodies responsible for drafting legislation and the parties likely to be affected by, or interested in, the normative acts in question. This work seeks to encourage the use of public consultation in the elaboration of Brazilian law. To that end, some aspects of the field of knowledge called the Science of Legislation are addressed, with attention to the concept of “quality of the law” and to the public consultation tool. We present the advantages of holding public consultations, mainly for proposals that impose costs or benefits relevant to the economic agents involved, or that promote major changes in the distribution of resources in society. Finally, the Brazilian legislative procedure is discussed, along with what Brazilian law requires of legislative projects forwarded to the National Congress, and a synthesis is built of the existing tools and possibilities for participation in the Brazilian context of norm elaboration.