85 results for NONLINEAR SIGMA-MODELS
Abstract:
In this thesis, a semi-automated cell analysis system based on image processing is described. To achieve this, an image processing algorithm was studied in order to segment cells in a semi-automatic way. The main goal of this analysis is to increase the performance of the cell image segmentation process without significantly affecting the results. Although a fully manual system can produce the best results, it has the disadvantage of being slow and repetitive when a large number of images needs to be processed. An active contour algorithm was tested on a sequence of images taken with a microscope. This algorithm, more commonly known as snakes, allows the user to define an initial region in which the cell is contained. The algorithm then runs several iterations, making the contour of the initial region converge to the cell boundaries. With the final contour, it was possible to extract region properties and produce statistical data. These data showed that the algorithm produces results similar to those of a purely manual system, but at a faster rate. On the other hand, it is slower than a fully automatic approach, but it allows the user to adjust the contour, making it more versatile and tolerant to image variations.
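The abstract does not name a specific implementation; as a minimal sketch of the snakes idea, scikit-image's active_contour (an assumed library choice, not the one used in the thesis) can make a user-drawn initial circle converge to an object boundary in a grayscale image:

```python
import numpy as np
from skimage import data, filters, segmentation

# Sample grayscale image standing in for a microscopy frame.
image = data.coins()

# User-defined initial circle roughly enclosing the object of interest.
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 40 * np.sin(theta),   # row coordinates
                        100 + 40 * np.cos(theta)])  # column coordinates

# Iteratively deform the initial contour until it locks onto the boundary.
snake = segmentation.active_contour(filters.gaussian(image, sigma=3),
                                    init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)  # (200, 2): final contour points
```

Region properties (area, perimeter, intensity statistics) could then be extracted from the mask enclosed by the final contour, for example with skimage.measure.regionprops.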
Abstract:
Theoretical epidemiology aims to understand the dynamics of diseases in populations and communities. Biological and behavioral processes are abstracted into mathematical formulations which aim to reproduce epidemiological observations. In this thesis a new system for the self-reporting of syndromic data — Influenzanet — is introduced and assessed. The system is currently being extended to address greater challenges of monitoring the health and well-being of tropical communities.(...)
Abstract:
"Amyotrophic Lateral Sclerosis (ALS) is the most severe and common adult onset disorder that affects motor neurons in the spinal cord, brainstem and cortex, resulting in progressive weakness and death from respiratory failure within two to five years of symptoms onset(...)
Abstract:
In the current period of economic crisis, the need to stand out from the competition is essential; this competitive advantage can be the path to a company's success, and one way to achieve it is by reducing waste through process improvement. It is at this point that the Six Sigma methodology becomes a highly important tool, allowing a process to be optimized to a level of 3.4 defects per million opportunities. In this project, the Define, Measure, Analyze, Improve, Control (DMAIC) method was used to reduce waste by 25% in a flexible packaging factory, specifically in the rotogravure printing process. During the project, the main problems were identified, and several improvement measures were suggested, implemented and analysed to reduce waste and consequently increase the Sigma level. In the Define phase, the project requirements are determined and a global understanding of the problem under study is established. In the Measure phase, the performance of the process is calculated. In the Analyse phase, the main causes of the problem are identified. It is in the Improve phase that the improvement actions are implemented, and finally, in the Control phase, control actions are put in place over the process to allow intervention whenever it deviates from normal performance.
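The 3.4 defects-per-million-opportunities figure mentioned above follows from the conventional 1.5-sigma-shift definition of the sigma level; a minimal sketch of that conversion in Python (the function name and sample numbers are illustrative only):

```python
from scipy.stats import norm

def sigma_level(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Convert observed defect counts to a sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    return norm.ppf(1 - dpmo / 1_000_000) + 1.5

# Example: 3.4 DPMO corresponds to roughly 6 sigma.
print(round(sigma_level(defects=34, units=100_000, opportunities_per_unit=100), 2))
```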
Abstract:
The globalization of industry leads to increasingly competitive industrial environments, in which waste reduction and the improvement of production and logistics systems become critical factors for the success and sustainability of an organization. Accordingly, this dissertation addresses continuous improvement in production logistics systems through a Lean Six Sigma perspective (a hybrid model), tested and implemented at Visteon Portuguesa Lda. In the proposed model, improvement opportunities are identified in the production logistics flow, in the storage system and in the production process, together with their implementation. These are identified from the Value Stream Mapping (VSM) of the production process of the B299 High cluster. Lean and Six Sigma methods and tools are then applied to reach the objective set for each one. Regarding the production logistics flow, Systematic Layout Planning (SLP) was introduced to study the existing layout and its constraints, and to assess whether a rearrangement would be advantageous in reducing the time spent transporting materials between areas of the layout. In order to minimize material handling, a new transport system was developed and proposed. To improve the storage system, separate warehouses were created depending on the type of material stored, and an ABC analysis was carried out to identify the products with the highest number of movements and thus define the best configuration of rack locations. For material supply, it also became necessary to design and implement an order-picking system, with the aim of streamlining the storage system. Finally, a DMAIC study was developed to improve the production process, with the aim of increasing its performance by reducing scrap.
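The abstract does not detail how the ABC analysis was computed; a minimal sketch of the usual cumulative-share classification, with hypothetical movement counts per SKU, could look like this:

```python
import pandas as pd

def abc_classify(movements: pd.Series,
                 a_cut: float = 0.80, b_cut: float = 0.95) -> pd.Series:
    """Classify items into A/B/C classes by cumulative share of total movements."""
    share = movements.sort_values(ascending=False).cumsum() / movements.sum()
    return share.apply(lambda s: "A" if s <= a_cut else ("B" if s <= b_cut else "C"))

# Hypothetical picks per month for five SKUs.
picks = pd.Series({"SKU-01": 950, "SKU-02": 410, "SKU-03": 120,
                   "SKU-04": 60, "SKU-05": 15})
print(abc_classify(picks))  # A items drive most movements and get the best rack locations
```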
Abstract:
Nowadays, a significant increase in the demand for interoperable systems for exchanging data in collaborative business environments has been noticed. Consequently, cooperation agreements between the involved enterprises have been brought to light. However, even within the same community or domain there is a wide variety of knowledge representations that are not semantically coincident, which gives rise to interoperability problems in the enterprises' information systems that need to be addressed. Moreover, in relation to this, most organizations face other problems with their information systems, such as: 1) domain knowledge not being easily accessible by all the stakeholders (even intra-enterprise); 2) domain knowledge not being represented in a standard format; 3) and, even when it is available in a standard format, it not being supported by semantic annotations or described using a common and understandable lexicon. This dissertation proposes an approach for the establishment of an enterprise reference lexicon from business models. It addresses the automation of the mapping between information models for the construction of the reference lexicon. It aggregates a formal and conceptual representation of the business domain with a clear definition of the lexicon used, to facilitate an overall understanding by all the involved stakeholders, including non-IT personnel.
Abstract:
Computational power is increasing day by day. Despite that, there are some tasks that are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is an area still under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation on a large scale. Crowdsourcing is a novel approach to collect labels in a fast and cheap manner, by sourcing the labels from the crowd. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area in which annotation models must be created or adapted to handle such weakly-labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness. Other specific aspects are also considered by noisy-label analysis techniques. The main contribution of this thesis is the process to collect reliable crowdsourcing labels for a facial expression dataset. This process consists of two steps: first, we design our crowdsourcing tasks to collect annotator labels; next, we infer the true label from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset is created, containing 40,000 images and the respective labels. At the end, we publish the resulting dataset.
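The abstract does not specify which aggregation algorithms were applied; the usual baseline before probabilistic models such as Dawid-Skene is a simple majority vote over the redundant crowd labels, sketched below with hypothetical image names and expression labels:

```python
from collections import Counter

def majority_vote(labels_per_item: dict[str, list[str]]) -> dict[str, str]:
    """Infer a consensus label for each item as the most frequent crowd label."""
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in labels_per_item.items()}

# Hypothetical crowd annotations for two face images.
crowd = {"img_001": ["happy", "happy", "neutral"],
         "img_002": ["sad", "sad", "sad", "angry"]}
print(majority_vote(crowd))  # {'img_001': 'happy', 'img_002': 'sad'}
```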
Abstract:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself very well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the code being developed. Furthermore, communication and coordination are simplified. This proposal explores the field of collaboration on a very structured and rigid model, where collaboration is achieved through the copy-modify-merge paradigm, in which a developer gets their own private copy from the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform in order to enable real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve a much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
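For readers unfamiliar with the copy-modify-merge baseline mentioned above, a minimal three-way merge over a flat key-value model (a hypothetical stand-in, not the OutSystems model representation) illustrates how concurrent changes are combined and conflicts detected:

```python
def three_way_merge(base: dict, mine: dict, theirs: dict) -> tuple[dict, set]:
    """Merge two copies derived from a common base; report conflicting keys."""
    merged, conflicts = {}, set()
    for key in base.keys() | mine.keys() | theirs.keys():
        b, m, t = base.get(key), mine.get(key), theirs.get(key)
        if m == t:        # both sides agree (possibly both unchanged)
            value = m
        elif m == b:      # only the other developer changed it
            value = t
        elif t == b:      # only this developer changed it
            value = m
        else:             # concurrent, diverging changes
            conflicts.add(key)
            value = m     # keep the local value, flag for manual resolution
        if value is not None:
            merged[key] = value
    return merged, conflicts

# Hypothetical edits to two properties of a visual element.
base   = {"Button.Label": "Save"}
mine   = {"Button.Label": "Save", "Button.Width": "80"}
theirs = {"Button.Label": "Submit"}
print(three_way_merge(base, mine, theirs))  # both changes merge, no conflict
```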
Abstract:
With the growing competitiveness of markets, partly due to the current economic situation, organizations have been forced to adopt management strategies that allow them to reduce costs and increase the effectiveness and efficiency of their processes. As such, more and more companies invest in continuous improvement practices in order to address, in a structured way, the waste generated along the supply chain as well as the variability of their processes. One of the approaches frequently used in industrial companies is Lean Six Sigma. The DMAIC cycle is a sequential method that, through the Define, Measure, Analyse, Improve and Control phases, supports the structured implementation of continuous improvement initiatives such as Lean Six Sigma projects. The objective of the case study in this dissertation is, through the creation of safety stocks and the optimization of finished-product stock levels, to achieve a reduction of the financial value tied up in the finished-product warehouse of a company in the glass industry. To this end, several Lean and Six Sigma tools were applied in order to collect valid data, analyse the root causes of the problem and subsequently propose the appropriate improvements. It was found that the absence of safety stocks, as well as the lack of a classification of stocks according to the average daily shipments, negatively affects the organization of a company's stock levels, leading to low inventory turnover. Consequently, the application of the DMAIC cycle, with particular emphasis on ABC Analysis as a stock management and control tool, generates benefits that can be transferred to other organizations in terms of increased quality and customer satisfaction.
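The abstract does not state which safety-stock model was adopted; for reference, the textbook expression for demand variability under a fixed lead time, assuming normally distributed daily demand, is:

```latex
% Safety stock for a target cycle service level \alpha:
%   z_\alpha  - standard normal quantile for the service level
%   \sigma_d  - standard deviation of daily demand
%   L         - replenishment lead time, in days
SS = z_{\alpha}\,\sigma_{d}\,\sqrt{L}
```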
Abstract:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires the development of advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized towards the improvement of hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stability of drug-metabolism enzymes and higher viability in co-cultures. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug-metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models which are reproducible, scalable and compatible with screening platforms. The phenotypic and functional characterization of the in vitro systems performed contributes to the state of the art of human hepatic cell models and can be applied to improving the efficiency of pre-clinical drug development, to disease modelling and, ultimately, to the development of cell-based therapeutic strategies for liver failure.
Abstract:
This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can then predict market direction in the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
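Implementation details are not given in the abstract; as one illustration of the general approach, a two-state Gaussian HMM can be fitted to a return series and decoded into hidden market regimes with the hmmlearn library (an assumed tool, not necessarily the authors' choice):

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed library choice

rng = np.random.default_rng(0)
# Synthetic daily returns: a calm regime followed by a volatile one.
returns = np.concatenate([rng.normal(0.0005, 0.005, 250),
                          rng.normal(-0.001, 0.02, 250)]).reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200)
model.fit(returns)               # Baum-Welch parameter estimation
states = model.predict(returns)  # Viterbi decoding of the hidden regime sequence
print(model.means_.ravel(), states[:5], states[-5:])
```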
Abstract:
The life of humans and of most living beings depends on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in our brain for immediate use or stored in memory for later recall. Among the reasoning aspects, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and becoming part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But prior to assessing data on its usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. Systems' limitations in handling sensorial data, compared with our own sensorial capabilities, constitute an identified problem. Another problem is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing these problems is the motivation for the present research work. The mission hereby assumed is to include sensorial and physiological data in a Framework that will be able to manage collected data towards human cognitive functions, supported by a new data model. By learning from selected human functional and behavioural models and reasoning over collected data, the Framework aims at providing an evaluation of a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life with physiological indicators of emotional states, to be used by new-generation applications.
Abstract:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed due to the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. That being said, it becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology was proposed for flood-resilient housing models, in which key criteria that housing should meet were identified. This methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives, one of the countries most vulnerable to sea-level rise resulting from climate change, has been analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out by using the proposed methodology with the intent of assessing the resilience of the newly built housing to floods in the aftermath of the 2004 Indian Ocean Tsunami.
Abstract:
The work described in this thesis was performed at the Laboratory for Intense Lasers (L2I) of Instituto Superior Técnico, University of Lisbon (IST-UL). Its main contribution is a feasibility study of the broadband dispersive stages for an optical parametric chirped pulse amplifier based on the nonlinear crystal yttrium calcium oxyborate (YCOB). In particular, the main goal of this work consisted in the characterization and implementation of the several optical devices involved in pulse expansion and in the compression of the amplified pulses to durations of the order of a few optical cycles (20 fs). This type of laser system finds application in fields such as medicine, telecommunications and machining, which require high-energy, ultrashort (sub-100 fs) pulses. The main challenges consisted in the preliminary study of the performance of the broadband amplifier, which is essential for successfully handling pulses with bandwidths exceeding 100 nm when amplified from the μJ level to 20 mJ per pulse. In general, the control, manipulation and characterization of optical phenomena on the scale of a few tens of fs, at powers that can reach the PW level, are extremely difficult and challenging due to the complexity of the phenomena of radiation-matter interaction and their nonlinearities, observed at this time scale and power level. For this purpose, the main dispersive components were characterized in detail, specifically addressing the demonstration of pulse expansion and compression. The tested bandwidths are narrower than the final ones, in order to confirm the parameters of these elements and to predict the performance for the broadband pulses. The work performed led to additional tasks, such as a detailed characterization of the laser oscillator seeding the laser chain and the detection and cancellation of additional sources of dispersion.
Abstract:
Simulated moving bed (SMB) chromatography is attracting more and more attention since it is a powerful technique for complex separation tasks. Nowadays, more than 60% of preparative SMB units are installed in the pharmaceutical and in the food industry [SDI, Preparative and Process Liquid Chromatography: The Future of Process Separations, International Strategic Directions, Los Angeles, USA, 2002. http://www.strategicdirections.com]. Chromatography is the method of choice in these fields, because pharmaceuticals and fine chemicals often have physico-chemical properties which differ little from those of the by-products, and they may be thermally unstable. In these cases, standard separation techniques such as distillation and extraction are not applicable. The noteworthiness of preparative chromatography, particularly the SMB process, as a separation and purification process in the above-mentioned industries has been increasing, due to its flexibility, energy efficiency and higher product-purity performance. Consequently, a new SMB paradigm is called for by the large number of potential small-scale applications of the SMB technology, which exploits the flexibility and versatility of the technology. In this new SMB paradigm, a number of possibilities for improving SMB performance through the variation of parameters during a switching interval are pushing the trend toward units with a smaller number of columns, because less stationary phase is used and the setup is more economical. This is especially important for the pharmaceutical industry, where SMBs are seen as multipurpose units that can be applied to different separations in all stages of the drug-development cycle. In order to reduce the experimental effort and, accordingly, the cost associated with the development of separation processes, simulation models are used intensively. One important aspect in this context is the determination of the adsorption isotherms in SMB chromatography, where separations are usually carried out under strongly nonlinear conditions in order to achieve higher productivities. The accurate determination of the competitive adsorption equilibrium of the enantiomeric species is thus of fundamental importance to allow computer-assisted optimization or process scale-up. Two major SMB operating problems are apparent at production scale: the assessment of product quality and the maintenance of long-term stable and controlled operation. Constraints regarding product purity, dictated by pharmaceutical and food regulatory organizations, have drastically increased the demand for product quality control. The strict regulations imposed are increasing the need for developing optically pure drugs.(...)
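The abstract refers to strongly nonlinear competitive adsorption equilibria without naming the isotherm model; a form commonly used in the SMB literature (an assumption here, not necessarily the one adopted in the thesis) is the competitive Langmuir isotherm:

```latex
% Competitive Langmuir isotherm for species i in an N-component mixture:
%   q_i      - adsorbed-phase concentration of species i
%   c_j      - fluid-phase concentration of species j
%   q_{s,i}  - saturation capacity of species i
%   b_j      - equilibrium (affinity) constant of species j
q_i = \frac{q_{s,i}\, b_i\, c_i}{1 + \sum_{j=1}^{N} b_j\, c_j}
```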