919 results for Advent.
Abstract:
Internship report submitted to the Escola Superior de Teatro e Cinema in fulfilment of the requirements for the degree of Master in Theatre, specialization in Stage Design.
Abstract:
Master's dissertation presented to the Instituto Superior de Contabilidade e Administração do Porto for the degree of Master in Digital Marketing, under the supervision of Prof. Paulo Alexandre Pires.
Abstract:
Recent technological advancements and market trends are driving a convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being demanded by markets that need huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly required to deliver higher performance in real time, challenging the capabilities of current architectures. The advent of next-generation many-core embedded platforms offers the chance to meet this converging need for predictable high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures that integrate general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, and efficient mapping and scheduling algorithms will be proposed, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.
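To illustrate the kind of timing and schedulability analysis the abstract refers to, the sketch below implements the classic fixed-priority response-time test (a standard textbook analysis chosen here for illustration, not P-SOCRATES' own method): each task's worst-case response time is the fixed point of R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j, and the task set is schedulable if every fixed point is within its deadline.

```python
# Minimal sketch of classic fixed-priority response-time analysis, shown only
# to illustrate the kind of schedulability test mentioned in the abstract;
# it is not the analysis developed by P-SOCRATES.
import math

def response_times(tasks):
    """tasks: list of (C, T, D) tuples sorted by decreasing priority.
    Returns each task's worst-case response time, or None if the
    recurrence exceeds the deadline (task not schedulable)."""
    results = []
    for i, (C, T, D) in enumerate(tasks):
        R = C
        while True:
            interference = sum(math.ceil(R / Tj) * Cj
                               for Cj, Tj, _ in tasks[:i])
            R_next = C + interference
            if R_next > D:
                results.append(None)   # misses its deadline
                break
            if R_next == R:
                results.append(R)      # fixed point reached
                break
            R = R_next
    return results

# Example: three periodic tasks (C, T, D), highest priority first.
print(response_times([(1, 4, 4), (2, 6, 6), (3, 12, 12)]))  # [1, 3, 10]
```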
Abstract:
Nowadays, Information Technologies (IT) are increasingly vital within organizations. IT is the engine that supports the business. For most organizations, the operation and development of IT rely on dedicated infrastructures (internal or external) called Data Centers (DC). These infrastructures concentrate an organization's data processing and storage equipment and are therefore increasingly challenged with respect to factors such as scalability, availability, fault tolerance, performance, available or provisioned resources, security, energy efficiency and, inevitably, the associated costs. With the emergence of technologies based on cloud computing and virtualization, a whole range of new ways to address the challenges described above opens up. Given this new paradigm, new opportunities for DC consolidation arise, which may in turn pose new challenges for DC managers. It is therefore unrealistic, to say the least, for organizations simply to eliminate their DCs or to transform them according to the highest quality standards. Organizations must optimize their DCs; however, an efficient project of this nature, capable of supporting the demands imposed by the market, the needs of the business and the speed of technological evolution, requires solutions that are complex and costly both to implement and to manage. It is in this context that the present work arises. With the aim of studying DCs, a study of this subject is first carried out, detailing their concept, historical evolution, topology, architecture and the existing standards that govern them. The study then details some of the main trends shaping the future of DCs. Building on the theoretical knowledge resulting from this study, a DC evaluation methodology based on decision criteria is developed. The study culminates with an analysis of a new technological solution and the evaluation of three possible implementation scenarios: the first based on maintaining the current DC; the second based on implementing the new solution in another DC under an external hosting model; and finally the third based on an implementation under an IaaS model.
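The abstract mentions a data-centre evaluation methodology based on decision criteria; the sketch below shows one common form such an evaluation can take, a weighted-sum multi-criteria score over the three scenarios. The criteria, weights and scores are hypothetical placeholders, not values taken from the dissertation.

```python
# Illustrative weighted-sum scoring of the three scenarios mentioned in the
# abstract (keep current DC, external hosting, IaaS). Criteria, weights and
# scores are hypothetical placeholders, not the dissertation's actual values.
criteria_weights = {
    "scalability": 0.20, "availability": 0.20, "performance": 0.15,
    "security": 0.15, "energy_efficiency": 0.10, "cost": 0.20,
}

# Scores on a 1-5 scale for each scenario (hypothetical).
scenarios = {
    "keep_current_dc":  {"scalability": 2, "availability": 3, "performance": 3,
                         "security": 4, "energy_efficiency": 2, "cost": 3},
    "external_hosting": {"scalability": 4, "availability": 4, "performance": 4,
                         "security": 3, "energy_efficiency": 4, "cost": 2},
    "iaas":             {"scalability": 5, "availability": 4, "performance": 3,
                         "security": 3, "energy_efficiency": 5, "cost": 3},
}

def weighted_score(scores, weights):
    """Weighted sum of criterion scores for one scenario."""
    return sum(weights[c] * s for c, s in scores.items())

for name, scores in scenarios.items():
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")
```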
Abstract:
Software development is a discipline almost as old as the history of computers. With the advent of the Internet and all of its related technologies, software development has been in high demand. But, especially in SMEs (small and medium enterprises), this has not been accompanied by a comparable effort to develop a set of sustainable and standardized project management activities, which has led to increasing inefficiencies and costs. Given the current economic situation, it makes sense to engage in an effort to reduce these inefficiencies and rising costs. To that end, this work analyzes the current state of software development project management processes in a Portuguese SME, along with their problems and inefficiencies, in an effort to create a standardized model for managing software development, with special attention given to critical success factors in an agile software development environment, while using best practices in process modeling. This work also aims to create guidelines to correctly integrate these changes into the company's existing IS structure.
Abstract:
The advent of bioconjugation has deeply impacted the worlds of science and technology. New biomolecules have been found, biological processes understood, and novel methodologies developed thanks to the fast expansion of this area. The possibility of creating new, effective therapies for diseases like cancer is one of the major applications of this now large field of study. Off-target toxicity has always been the problem of potent small molecules with high activity towards specific tumour targets. However, chemotherapy can now be made selective thanks to powerful linkers that connect cytotoxic drugs to targeting molecules with affinity for biological receptors of interest. These linkers must have very specific properties, such as high stability in plasma, no toxicity, and no interference with ligand affinity or drug potency, while at the same time being able to cleave once inside the target cell to release the therapeutic warhead. The contrasting conditions of the tumour intracellular and extracellular media are usually exploited by these linkers to accomplish this goal. The work done in this thesis explores a new model for that same task, specific cancer drug delivery. Iminoboronates were studied owing to their remarkable, selective stability across a wide pH range and towards endogenous molecules. A fluorescence probe was designed to validate this model by creating an Off/On system and determining the payload release location in situ. A process was optimized to synthesize the probe 8-(1-aminoethyl)-7-hydroxy-coumarin (1) through a reductive amination reaction in a microwave reactor in 61 % yield. A method to conjugate this probe to ABBA was also optimized, giving the iminoboronate in good yields under mild conditions. The iminoboronate model was studied with regard to its stability in several simulated biological environments and each half-life was determined, showing that the conjugate is stable in most cases except in tumour intracellular systems. The construction of a folate-ABBA-coumarin bioconjugate has been undertaken to complete this evaluation. Uptake by a cancer cell through endocytosis and delivery of the coumarin fluorescence payload are the two features hoped for in this construct.
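The abstract reports half-life determinations for the conjugate in simulated biological environments; the sketch below shows how a half-life can be estimated from such stability data, assuming simple first-order (exponential) decay of the intact conjugate. The time points and fractions remaining are hypothetical, not the thesis data.

```python
# Minimal sketch: estimating a half-life from stability data, assuming
# first-order decay of the intact conjugate. The time points and fractions
# remaining below are hypothetical, not the thesis data.
import math

def half_life(times_h, fraction_remaining):
    """Least-squares fit of ln(fraction) = -k*t; returns t1/2 = ln(2)/k in hours."""
    xs = times_h
    ys = [math.log(f) for f in fraction_remaining]
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    k = -slope                      # first-order rate constant (1/h)
    return math.log(2) / k          # half-life in hours

# Hypothetical time course (hours) and fraction of conjugate remaining.
print(f"t1/2 = {half_life([0, 2, 4, 8], [1.00, 0.71, 0.50, 0.25]):.1f} h")  # ~4.0 h
```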
Abstract:
Crowdfunding, as we know it today, is a very recent activity that was born almost accidentally at the end of the 1990s. Thanks to the advent of the internet and social networks, entrepreneurs are now able to promote their projects to a very large community. Whether it is composed of family, friends, acquaintances or simply people interested in the same topic or sharing the same passion, the community is able to fund new ventures by individually investing modest amounts of money. In return, the entrepreneur can offer symbolic rewards, shares or other financial returns. New crowdfunding platforms are born almost every day all over the world, offering entrepreneurs a new way of raising capital for their projects and investors a new way to put their money into innovative ventures. Although crowdfunding is still finding its place in financial services, successful cases such as Kickstarter demonstrate the power of the crowd in boosting creativity and productivity, financing thousands of projects by raising millions of dollars from thousands of investors. Due to regulatory restrictions, the most prominent model for now is reward-based crowdfunding, where investors are rewarded with symbolic returns or privileged access to the products or services offered by the entrepreneurs. Other models, such as peer-to-peer lending, are also surging, allowing borrowers access to capital at a lower cost than so-called traditional financial institutions, and offering lenders a higher rate of return. But when it comes to offering shares to investors, i.e. using equity-based crowdfunding, entrepreneurs face regulatory obstacles in almost every country, where legislation was passed decades ago with the objective of promoting financially capable ventures and protecting investors. Access to capital has become more difficult since the global economic recession of 2008, and for most countries it will not get easier in the near future, leaving start-ups and small enterprises with few options to start or expand their operations. In this study we attempt to answer the following research questions: how has equity-based crowdfunding evolved since its creation? Where and how has equity-based crowdfunding been implemented so far? What are the constraints and opportunities for implementing equity crowdfunding in the world, and more particularly in Portugal? Finally, we discuss the risks of crowdfunding and reflect on the future of this industry.
Abstract:
It is well known that, for an organized society to develop politically and legally, the existence of a formal document of mandatory observance is indispensable, one capable of defining public competences and delimiting the powers of the State while safeguarding fundamental rights from possible abuses by political entities. This document is the Constitution, which has been present in States throughout history, though initially not in written form. This gave rise to constitutionalism, a movement that defended the need to draft written constitutions, endowed with normativity and supremacy over all other types of norms, aimed at organizing the separation of state powers and declaring individual rights and liberties. However, the enactment of a Supreme Law would be of no use without defense mechanisms designed to ward off any threat to legal certainty and social stability arising from a law or normative act contrary to the precepts established in the Constitution. Constitutional review, a pillar of the rule of law, consists in verifying the compatibility between a law or any infra-constitutional normative act and the Supreme Law; where there is a conflict, the flawed law or act must be expunged from the legal order so that constitutional unity is restored. In Brazil, constitutional review was established under strong influence of the North American model and received different treatments across the Brazilian constitutions; the system of constitutional review, however, reached its apex with the advent of the current Federal Constitution, promulgated on 5 October 1988, which created innovative procedural instruments for verifying the constitutionality of laws and normative acts. In addition, the 1988 Charter of the Republic, unlike its predecessors, strengthened the role of the Judiciary in the political context, granting greater autonomy to judges in deciding cases of great national repercussion and resulting in the judicial protagonism seen today. In this context, the Supremo Tribunal Federal, the highest organ of the national Judiciary and guardian of the Constitution, has stood out on the national scene, especially in the defense of the fundamental rights and guarantees enshrined in the Fundamental Law. An analysis of the Court's case law is therefore necessary to verify whether there has in fact been an evolution in constitutional review in Brazil over recent years and, if so, under what circumstances this has occurred.
Abstract:
The advent of effective combination antiretroviral therapy (ART) in 1996 resulted in fewer patients experiencing clinical events, so that some prognostic analyses of individual cohort studies of human immunodeficiency virus-infected individuals had low statistical power. Because of this, the Antiretroviral Therapy Cohort Collaboration (ART-CC) of HIV cohort studies in Europe and North America was established in 2000, with the aim of studying the prognosis for clinical events in acquired immune deficiency syndrome (AIDS) and the mortality of adult patients treated for HIV-1 infection. In 2002, the ART-CC collected data on more than 12,000 patients in 13 cohorts who had begun combination ART between 1995 and 2001. Subsequent updates took place in 2004, 2006, 2008, and 2010. The ART-CC database now includes data on more than 70,000 patients participating in 19 cohorts who began treatment before the end of 2009. Data are collected on patient demographics (e.g. sex, age, assumed transmission group, race/ethnicity, geographical origin), HIV biomarkers (e.g. CD4 cell count, plasma viral load of HIV-1), ART regimen, dates and types of AIDS events, and dates and causes of death. In recent years, additional data on co-infections such as hepatitis C; risk factors such as smoking, alcohol and drug use; non-HIV biomarkers such as haemoglobin and liver enzymes; and adherence to ART have been collected whenever available. The data remain the property of the contributing cohorts, whose representatives manage the ART-CC via the steering committee of the Collaboration. External collaboration is welcomed. Details of contacts are given on the ART-CC website (www.art-cohort-collaboration.org).
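As an illustration of the variables listed above, the sketch below represents one patient record as a simple data structure; the field names and types are chosen here for illustration only and are not the ART-CC data specification.

```python
# Illustrative record structure for the variables listed in the abstract.
# Field names and types are hypothetical, not the ART-CC data specification.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class PatientRecord:
    # Demographics
    sex: str
    age_at_art_start: int
    transmission_group: str
    race_ethnicity: Optional[str] = None
    geographical_origin: Optional[str] = None
    # HIV biomarkers at ART initiation
    cd4_cells_per_ul: Optional[int] = None
    viral_load_copies_per_ml: Optional[int] = None
    # Treatment and outcomes
    art_regimen: str = ""
    aids_events: list = field(default_factory=list)   # (date, event type) pairs
    date_of_death: Optional[date] = None
    cause_of_death: Optional[str] = None
```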
Abstract:
The advent of retrievable caval filters was a game changer in the sense that the previously irreversible act of implanting a medical device into the main venous blood stream of the body, which required careful evaluation of the pros and cons prior to execution, suddenly became a "reversible" procedure in which potential hazards in the patient's late future lost most of their weight at the time of decision making. This review was designed to assess the rate of success of late retrieval of so-called retrievable caval filters, in order to obtain some indication of reasonable implant duration with respect to relatively "easy" implant removal with conventional means, i.e., catheters, hooks and lassos. A PubMed search (www.pubmed.gov) was performed with the search term "cava filter retrieval after 30 days clinical", and 20 reports published between 1994 and 2013 dealing with late retrieval of caval filters were identified, covering approximately 7,000 devices with 600 removed filters. The maximal implant duration reported is 2,599 days, and the maximal implant duration of removed filters is also 2,599 days. The maximal duration reported with standard retrieval techniques, i.e., catheter, hook and/or lasso, is 475 days, whereas retrievals after this period required more sophisticated techniques, including lasers. The maximal implant duration for series with 100% retrieval is 84 days, which is equivalent to 12 weeks or almost 3 months. We conclude that retrievable caval filters often become permanent despite the initial decision of temporary use. However, such "forgotten" retrievable devices can still be removed with a great chance of success up to three months after implantation. Conventional percutaneous removal techniques may be sufficient up to sixteen months after implantation, whereas more sophisticated catheter techniques have been shown to be successful up to 83 months, or more than seven years, of implant duration. Tilting, migrating or misplaced devices should be removed early on and replaced, if indicated, with a device that is both efficient and retrievable.
Abstract:
Bioterrorism literally means using microorganisms or infected samples to cause terror and panic in populations. Bioterrorism had already started 14 centuries before Christ, when the Hittites sent infected rams to their enemies. However, apart from some rare, well-documented events, it is often very difficult for historians and microbiologists to differentiate natural epidemics from alleged biological attacks, because: (i) little information is available for times before the advent of modern microbiology; (ii) the truth may be manipulated for political reasons, especially for a topic as sensitive as a biological attack; and (iii) the passage of time may also have distorted the reality of the past. Nevertheless, we have tried to provide clinical microbiologists with an overview of some likely episodes of biological warfare that occurred before the 18th century and that included the intentional spread of epidemic diseases such as tularaemia, plague, malaria, smallpox, yellow fever, and leprosy. We also summarize the main events that occurred during the modern microbiology era, from World War I to the recent 'anthrax letters' that followed the World Trade Center attack of September 2001. Again, the political polemic surrounding the use of infectious agents as a weapon may distort the truth. This is nicely exemplified by the Sverdlovsk accident, which was initially attributed by the authorities to a natural foodborne outbreak and was officially recognized as having a military cause only 13 years later.
Abstract:
The majority of transcatheter aortic valve implantations, structural heart procedures and the newly developed transcatheter mitral valve repair and replacement are traditionally performed either through a transfemoral or a transapical access site, depending on the presence of severe peripheral vascular disease or anatomic limitations. The transapical approach, which carries specific advantages related to its antegrade nature and the short distance between the introduction site and the cardiac target, is traditionally performed through a left anterolateral mini-thoracotomy and requires rib retractors, soft tissue retractors and reinforced apical sutures to secure, at first, the left ventricular apex for the introduction of the stent-valve delivery systems and then to seal the access site at the end of the procedure. However, despite the advent of low-profile apical sheaths and newly designed delivery systems, the apical approach represents a challenge for the surgeon, as it has the risk of apical tear, life-threatening apical bleeding, myocardial damage, coronary damage and infections. Last but not least, the use of large-calibre stent-valve delivery systems and devices through standard mini-thoracotomies compromises any attempt to perform transapical transcatheter structural heart procedures entirely percutaneously, as happens with the transfemoral access site, or via a thoracoscopic or a miniaturised video-assisted percutaneous technique. During the past few years, prototypes of apical access and closure devices for transapical heart valve procedures have been developed and tested to make this standardised successful procedure easier. Some of them represent an important step towards the development of truly percutaneous transcatheter transapical heart valve procedures in the clinical setting.
Abstract:
Beginning with: «... siecle des siecles. Les tiens benefices et biens faicts ordinaires, desquels nous exillez... » and ending with: «... conduiz nous au pays et region qui donne eternelle joye. Amen. Fin du premier livre du Pain cotidien en la reffection de l'ame. Et contient ce premier livre depuys le premier dismanche de l'advent Nostre Seigneur jusques au jour de l'Epiphaine, qui est la feste des Roys ». Incomplete at the beginning.
Abstract:
Linear alkylbenzenes (LAB), formed by the AlCl3- or HF-catalyzed alkylation of benzene, are common raw materials for surfactant manufacture. Normally they are sulphonated using SO3 or oleum to give the corresponding linear alkylbenzene sulphonates in >95 % yield. As concern has grown about the environmental impact of surfactants, questions have been raised about the trace levels of unreacted raw materials, linear alkylbenzenes, and the minor impurities present in them. With the advent of modern analytical instruments and techniques, namely GC/MS, the opportunity has arisen to identify the exact nature of these impurities and to determine the actual levels present in commercial linear alkylbenzenes. The object of the proposed study was to separate, identify and quantify major and minor components (1-10%) in commercial linear alkylbenzenes. The focus of this study was on the structure elucidation and determination of impurities and on their qualitative determination in all analyzed linear alkylbenzene samples. A gas chromatography/mass spectrometry (GC/MS) study was performed on five samples from the same manufacturer (different production dates), followed by the analyses of ten commercial linear alkylbenzenes from four different suppliers. All the major components, namely linear alkylbenzene isomers, followed the same elution pattern, with the 2-phenyl isomer eluting last. The individual isomers were identified by interpretation of their electron impact and chemical ionization mass spectra. The percent isomer distribution was found to differ from sample to sample. Average molecular weights were calculated using two methods, GC and GC/MS, and compared with the results reported on the Certificates of Analysis (C.O.A.) provided by the manufacturers of commercial linear alkylbenzenes. The GC results in most cases agreed with the reported values, whereas the GC/MS results were significantly lower, by between 0.41 and 3.29 amu. The minor components, impurities such as branched alkylbenzenes and dialkyltetralins, eluted according to their molecular weights. Their fragmentation patterns were studied using electron impact ionization mode and their molecular ions were confirmed by a 'soft ionization' technique, chemical ionization. The level of impurities present in the analyzed commercial linear alkylbenzenes was expressed as a percentage of the total sample weight, as well as in mg/g. The percentage of impurities was observed to vary between 4.5 % and 16.8 %, with the highest being in sample "I". Quantitation (mg/g) of impurities such as branched alkylbenzenes and dialkyltetralins was done using cis/trans-1,4,6,7-tetramethyltetralin as an internal standard. Samples were analyzed using a GC/MS system operating under full-scan and single-ion-monitoring data acquisition modes. The latter data acquisition mode, which offers higher sensitivity, was used to analyze all samples under investigation for the presence of linear dialkyltetralins. Dialkyltetralins were reported quantitatively, whereas branched alkylbenzenes were reported semi-quantitatively. The GC/MS method that was developed during the course of this study allowed identification of some other trace impurities present in commercial LABs. Compounds such as non-linear dialkyltetralins, dialkylindanes, diphenylalkanes and alkylnaphthalenes were identified, but their detailed structure elucidation and quantitation were beyond the scope of this study.
However, further investigation of these compounds will be the subject of a future study.
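The abstract describes quantitation of impurities in mg/g against an internal standard; the sketch below shows a simple single-point internal-standard calculation of that kind. The peak areas, spike mass and response factor are hypothetical, not values from this study.

```python
# Illustrative internal-standard quantitation of an impurity by GC/MS.
# All peak areas, masses and the response factor are hypothetical values,
# shown only to sketch the calculation; they are not the thesis data.

def impurity_mg_per_g(area_analyte, area_istd, mass_istd_mg,
                      mass_sample_g, response_factor=1.0):
    """Single-point internal-standard quantitation:
    concentration (mg/g) = (A_analyte / A_istd) * m_istd / RF / m_sample
    """
    return (area_analyte / area_istd) * mass_istd_mg / response_factor / mass_sample_g

# Hypothetical example: a dialkyltetralin peak versus the tetramethyltetralin
# internal standard spiked at 2.0 mg into a 0.50 g LAB sample.
print(f"{impurity_mg_per_g(15400, 52000, 2.0, 0.50, response_factor=0.92):.2f} mg/g")
```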
Abstract:
The Dudding group is interested in the application of Density Functional Theory (DFT) to the development of asymmetric methodologies, and thus the focus of this dissertation is on the integration of these approaches. Several interrelated subsets of computer-aided design and implementation in catalysis have been addressed during the course of these studies. The first aim rested upon the advancement of methodologies for the synthesis of biologically active C(1)-chiral 3-methylene-indan-1-ols, which in practice led to the use of a sequential asymmetric Yamamoto-Sakurai-Hosomi allylation/Mizoroki-Heck reaction sequence. An important aspect of this work was the use of ortho-substituted arylaldehyde reagents, which are known to be a problematic class of substrates for existing asymmetric allylation approaches. The second phase of my research programme led to the further development of asymmetric allylation methods using o-arylaldehyde substrates for the synthesis of chiral C(3)-substituted phthalides. Apart from the de novo design of these chemistries in silico, which notably utilized water-tolerant, inexpensive, and relatively environmentally benign indium metal, this work represented the first computational study of a stereoselective indium-mediated process. Following from these discoveries was the advent of a related, yet catalytic, Ag(I)-catalyzed approach for preparing C(3)-substituted phthalides that, from a practical standpoint, was complementary in many ways. Not only did this new methodology build upon my earlier work with the integrated (experimental/computational) use of Ag(I)-catalyzed asymmetric methods in synthesis, it also provided fundamental insight, arrived at through DFT calculations, into the Yamamoto-Sakurai-Hosomi allylation. The development of ligands for unprecedented asymmetric Lewis base catalysis, especially asymmetric allylations using silver and indium metals, followed as a natural extension of these earlier discoveries. To this end, there followed the advancement of a family of disubstituted (N-cyclopropenium guanidine/N-imidazoliumyl substituted cyclopropenylimine) nitrogen adducts that has provided fundamental insight into chemical bonding and offered an unprecedented class of phase-transfer catalysts (PTC) with far-reaching potential. Salient features of these disubstituted nitrogen species are the unprecedented finding of a cyclopropenium-based C-H•••π(aryl) interaction and the presence of a highly dissociated anion, which projected them to serve as catalysts promoting fluorination reactions. Attracted by the timely development of these disubstituted nitrogen adducts, my last studies as a PhD scholar addressed the utility of one of the synthesized disubstituted nitrogen adducts as a valuable catalyst for the benzylation of the Schiff base N-(diphenylmethylene)glycine ethyl ester. Additionally, the catalyst was applied to benzylic fluorination; emerging from this exploration was the successful fluorination of benzyl bromide and its derivatives in high yields. A notable feature of this protocol is column-free purification of the product and recovery of the catalyst for use in a further reaction sequence.