67 results for Career guidance -- Mathematical models
Abstract:
In this thesis a semi-automated cell analysis system based on image processing is described. An image processing algorithm was studied in order to segment cells in a semi-automatic way. The main goal of this analysis is to increase the performance of the cell image segmentation process without significantly affecting the results. Although a fully manual system can produce the best results, it has the disadvantage of being slow and repetitive when a large number of images need to be processed. An active contour algorithm, more commonly known as snakes, was tested on a sequence of images taken by a microscope. The algorithm lets the user define an initial region containing the cell; it then runs iteratively, making the contour of the initial region converge to the cell boundaries. From the final contour, it was possible to extract region properties and produce statistical data. These data show that the algorithm produces results similar to those of a purely manual system, but at a faster rate. On the other hand, it is slower than a fully automatic approach, but it allows the user to adjust the contour, making it more versatile and tolerant to image variations.
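The abstract does not give the exact snake formulation used in the thesis; as an illustration of the idea, here is a minimal greedy active contour in Python (a common discrete variant in the spirit of Williams and Shah), with a distance-to-nearest-edge external energy standing in for the image force. All names, weights and parameter values are illustrative, not the thesis's own.

```python
import numpy as np

def greedy_snake(edge_pts, snake, alpha=0.1, beta=0.1, iters=30):
    """Greedy active contour ("snake") with a distance-based external energy.

    edge_pts : (M, 2) array of edge-pixel coordinates (the image force).
    snake    : (N, 2) array of ordered contour points (the initial region).
    Each point greedily moves to the 8-neighbour (or stays put) that
    minimises continuity + curvature + distance-to-nearest-edge energy.
    """
    snake = snake.astype(float).copy()
    moves = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
    for _ in range(iters):
        n = len(snake)
        # average spacing between consecutive points (continuity target)
        diffs = np.diff(np.vstack([snake, snake[:1]]), axis=0)
        mean_dist = np.linalg.norm(diffs, axis=1).mean()
        for i in range(n):
            prev_pt, next_pt = snake[i - 1], snake[(i + 1) % n]
            best_pt, best_e = snake[i], np.inf
            for dr, dc in moves:
                cand = snake[i] + (dr, dc)
                # continuity: keep points evenly spaced
                e_cont = (np.linalg.norm(cand - prev_pt) - mean_dist) ** 2
                # curvature: keep the contour smooth
                e_curv = np.sum((prev_pt - 2 * cand + next_pt) ** 2)
                # external: pull towards the nearest detected edge pixel
                e_ext = np.min(np.linalg.norm(edge_pts - cand, axis=1))
                e = alpha * e_cont + beta * e_curv + e_ext
                if e < best_e:
                    best_e, best_pt = e, cand
            snake[i] = best_pt
    return snake
```

On a synthetic bright disk, a circle initialised outside the object shrinks onto the disk boundary in a few dozen iterations, mirroring the user-initialised convergence the abstract describes.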
Abstract:
Theoretical epidemiology aims to understand the dynamics of diseases in populations and communities. Biological and behavioral processes are abstracted into mathematical formulations which aim to reproduce epidemiological observations. In this thesis a new system for the self-reporting of syndromic data — Influenzanet — is introduced and assessed. The system is currently being extended to address greater challenges of monitoring the health and well-being of tropical communities.(...)
Abstract:
The increasing use of information and communication technologies (ICT) in diverse professional and personal contexts calls for new knowledge and a set of abilities, competences and attitudes for an active and participative citizenship. In this context, it is acknowledged that universities have an important role in innovating the educational use of digital media to promote an inclusive digital literacy. The educational potential of digital technologies and resources has been recognized by both researchers and practitioners, and multiple pedagogical models and research approaches have already highlighted the importance of adapting instructional and learning practices and processes to concrete contexts and educational goals. Still, the academic and scientific communities believe that further investment in ICT research in higher education is needed. This study focuses on educational models that may support uses of digital technology that have cognitive and educational relevance when compared to analog technologies. A teaching and learning model, centered on the active role of students in the exploration, production, presentation and discussion of interactive multimedia materials, was developed and applied using the internet and emergent semantic hypermedia formats. The research approach focused on defining design principles for class activities, which were applied in three iterations in undergraduate courses at two institutions, namely the University of Texas at Austin, USA, and the University of Lisbon, Portugal. The analysis made it possible to evaluate the potential and efficacy of the proposed model, and of the chosen authoring tool, in supporting metacognitive skills and attitudes related to information structuring and management, storytelling and communication using computers and the internet.
Abstract:
Amyotrophic Lateral Sclerosis (ALS) is the most severe and common adult-onset disorder affecting motor neurons in the spinal cord, brainstem and cortex, resulting in progressive weakness and death from respiratory failure within two to five years of symptom onset(...)
Abstract:
ABSTRACT - Background: Integration of health care services is emerging as a central challenge of health care delivery, particularly for elderly patients with complex chronic conditions. In 2003, the World Health Organization (WHO) had already identified it as one of the key pathways to improving primary care. In 2005, the European Commission declared integrated care vital for the sustainability of social protection systems in Europe. Nowadays, it is recognized as a core component of health and social care reforms across European countries. Implementing integrated care requires coordination between settings, organizations, providers and professionals. In order to address the challenge of integration in such a complex scenario, an effective workforce capable of working across interdependent settings is required. The World Health Report 2006 noted that governments should prepare their workforce and explore what tasks the different levels of health workers are trained to do and are capable of performing (skills mix). Compared with other European countries, Portugal is at an early stage where integrated care is concerned, facing a growing elderly population and the consequent increase in the pressure on institutions and professionals to provide social and medical care in the most cost-effective way. In 2006 the Portuguese government created the Portuguese Network for Integrated Care Development (PNICD) to close the existing long-term gap in social support and healthcare. Regarding the health workforce, the Portuguese government has already recognized the importance of redefining careers while preserving professional motivation and satisfaction. Aim of the study: This study aims to contribute new evidence to the debate surrounding integrated care and skills-mix policies in Europe. It also seeks to provide the first evidence that incorporates both the current dynamics of implementing integrated care in Portugal and the developments reported in the international literature.
The first ambition of our study is to contribute to the growing interest in integrated care, and to the ongoing research in this area, by identifying its different approaches and reviewing a number of experiences in several European countries. The second goal of this research is to provide an update on the knowledge developed on skills mix to the international healthcare management community and to policy makers involved in reforming healthcare systems and organizations. Thirdly, to better inform Portuguese health policy makers, we explore the current dynamics of implementing integrated care in Portugal and contextualize them with the developments reported in the international literature. Methodology: This is essentially an exploratory and descriptive study using qualitative methodology. In order to identify integrated care approaches in Europe, a systematic literature review was undertaken, resulting in a paper published in the Journal of Management and Marketing in Healthcare titled "Approaches to developing integrated care in Europe: a systematic literature review". This article was recommended and included in a list of references identified by The King's Fund Library. A second systematic literature review resulted in a paper published in the International Journal of Healthcare Management titled "Skills mix in healthcare: An international update for the management debate". Semi-structured interviews were conducted with experts representing the regional coordination teams of the Portuguese Network for Integrated Care Development. In a last stage, a questionnaire survey was developed based on the findings of both systematic literature reviews and the semi-structured interviews. Conclusions: Even though integrated care is a worldwide trend in health care reforms, there is no unique definition.
Definitions can be grouped according to their sectorial focus: community-based care, combined health and social care, combined acute and primary care, the integration of providers, and, in a more comprehensive approach, the whole health system. Indeed, models that seek to apply the principles of integrated care have a similar background, are continually evolving, and depend on the different initiatives taken at national level. Although we cannot argue that there is a single, set typology of models for integrated care, it is possible to identify and categorize some of the basic approaches that have been taken in attempts to implement it: changes in organizational structure, workforce reconfiguration, and changes in the financing system. The systematic literature review on skills mix showed that, despite the widely acknowledged interest in skills-mix initiatives, there is a lack of evidence on their implications, constraints, outcomes and quality impact that would allow policy makers to take sustained, evidence-based decisions. Within the Portuguese health system, the integrated care approach is mainly organizational and financial, whereas little attention is given to workforce integration. Regarding workforce planning, Portugal is still at the stage of analyzing the acceptability of health workforce skills mix. In line with international approaches, the integration of health and social services and the bridging of primary and acute care are the main goals of the national government strategy. The findings from our interviews reveal perceptions that show no discrepancy with the related literature but are rather scarce compared to international experience. Informants hold a realistic but narrow view of integrated care issues; they seem limited to the regional context and require a more comprehensive perspective.
The questionnaire developed in this thesis is an instrument which, when applied, will allow policy makers to understand the basic set of concepts and managerial motivations behind national and regional integrated care programs. The instrument developed can foster evidence on the three essential components of integrated care policies: organizational, financial, and human resources development, and can give additional input on the context in which integrated care is being developed, the type of providers and organizations involved, barriers and constraints, and the workforce skills mix planning related strategies. The thesis was successful in recognizing differences between countries and interventions and the instrument developed will allow a better comprehension of the international options available and how to address the vital components of integrated care programs.
Abstract:
Nowadays, a significant increase in the demand for interoperable systems for exchanging data in collaborative business environments has been observed, and cooperation agreements between the enterprises involved have consequently been brought to light. However, even within the same community or domain, there is a wide variety of knowledge representations that are not semantically coincident, which gives rise to interoperability problems in enterprise information systems that need to be addressed. Moreover, most organizations face other problems with their information systems, such as: 1) domain knowledge not being easily accessible by all the stakeholders (even intra-enterprise); 2) domain knowledge not being represented in a standard format; and 3) even when it is available in a standard format, not being supported by semantic annotations or described using a common and understandable lexicon. This dissertation proposes an approach for the establishment of an enterprise reference lexicon from business models. It addresses the automation of information model mapping for the construction of the reference lexicon, and it aggregates a formal and conceptual representation of the business domain with a clear definition of the lexicon used, to facilitate an overall understanding by all the stakeholders involved, including non-IT personnel.
Abstract:
Computational power is increasing day by day. Despite that, some tasks are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is still an area under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation at a large scale. Crowdsourcing is a novel approach to collecting labels in a fast and cheap manner, by sourcing the labels from the crowd. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area in which annotation models must be created or adapted to handle such weakly labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness; other specific aspects are considered by noisy-label analysis techniques. The main contribution of this thesis is a process for collecting reliable crowdsourced labels for a facial expressions dataset. This process consists of two steps: first, we design crowdsourcing tasks to collect annotator labels; next, we infer the true label from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset containing 40,000 images and the respective labels is created and, at the end, published.
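The abstract does not name the specific inference algorithms beyond "state-of-the-art crowdsourcing algorithms"; as a sketch of the underlying idea, here is a minimal label-aggregation routine in Python: a plain majority vote refined by one round of annotator-accuracy weighting, a simplified Dawid-Skene-style heuristic. All identifiers are illustrative.

```python
from collections import Counter, defaultdict

def aggregate_labels(votes):
    """votes: list of (item_id, annotator_id, label) triples.
    Returns {item_id: inferred_label} via majority vote refined by
    one round of annotator-accuracy weighting."""
    by_item = defaultdict(list)
    for item, ann, lab in votes:
        by_item[item].append((ann, lab))
    # round 1: plain majority vote per item
    truth = {it: Counter(l for _, l in v).most_common(1)[0][0]
             for it, v in by_item.items()}
    # estimate each annotator's accuracy against the majority answer
    hits, total = defaultdict(int), defaultdict(int)
    for it, v in by_item.items():
        for ann, lab in v:
            total[ann] += 1
            hits[ann] += (lab == truth[it])
    acc = {a: (hits[a] + 1) / (total[a] + 2) for a in total}  # Laplace-smoothed
    # round 2: accuracy-weighted vote, down-weighting unreliable annotators
    for it, v in by_item.items():
        weight = defaultdict(float)
        for ann, lab in v:
            weight[lab] += acc[ann]
        truth[it] = max(weight, key=weight.get)
    return truth
```

Full Dawid-Skene iterates this estimate-and-revote loop (per-class confusion matrices, EM) until convergence; the single pass above already down-weights an annotator who, say, labels every image with the same expression.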
Abstract:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized; examples include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the code being developed, and communication and coordination are simplified. This proposal explores collaboration on a very structured and rigid model, where collaboration currently follows the copy-modify-merge paradigm, in which a developer gets a private copy of the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform that enables real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across the several developers working on the same project. We believe that it is possible to achieve much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
Abstract:
The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they propitiate particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. 
In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants, and therefore tend to assume different shapes and offer different possibilities. In the first approach, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers; the technologies that are now so deeply woven into human experience are thus shaped primarily by a relatively small number of producers. In recent years, however, three interconnected communities—the makers, hackerspaces, and open source hardware communities—have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.
Abstract:
The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires the development of advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and limited compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented, and a liver tissue perfusion method was optimized to improve hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultured with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stable drug metabolism enzymes and higher viability in co-culture. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models that are reproducible, scalable and compatible with screening platforms.
The phenotypic and functional characterization of these in vitro systems contributes to the state of the art of human hepatic cell models and can be applied to improving the efficiency of pre-clinical drug development, to disease modelling and, ultimately, to the development of cell-based therapeutic strategies for liver failure.
Abstract:
This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can predict market direction in the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction, but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
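The abstract does not disclose the model's parametrisation; the following sketch shows only the core mechanics of the approach it describes: a two-state Hidden Markov Model over discretised return directions, filtered with the forward algorithm and propagated one step to predict the next direction. The transition and emission values below are invented for illustration, not estimated from data.

```python
import numpy as np

# Hypothetical two-state HMM: state 0 = "bull", state 1 = "bear".
A = np.array([[0.9, 0.1],    # transition probabilities between regimes
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # emission: P(obs | state); obs 0 = "up", 1 = "down"
              [0.35, 0.65]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward_filter(obs):
    """Forward algorithm: filtered P(state_t | obs_1..t), normalised each step."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
    return alpha

def predict_direction(obs):
    """Propagate the filtered state one step and marginalise over emissions."""
    alpha = forward_filter(obs)
    p_obs = (alpha @ A) @ B      # P(next obs = up / down)
    return "up" if p_obs[0] > p_obs[1] else "down"
```

In the speech-recognition analogy the paper draws, a sequence of daily return signs plays the role of a phoneme's acoustic frames: a run of "up" observations pushes the filtered probability into the bull state, so the one-step-ahead emission distribution favours another "up".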
Abstract:
The lives of humans and most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in our brain for immediate use or stored in memory for later recall. Among other aspects of reasoning, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and forming part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But before data can be assessed for usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our bodies, using cloud-based services and Internet connections. The limitations of such systems in handling sensory data, compared with our own sensory capabilities, constitute one identified problem; another is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing these problems is the motivation for the present research work. The mission hereby assumed is to include sensory and physiological data in a Framework that manages the collected data in support of human cognitive functions, underpinned by a new data model.
By learning from selected human functional and behavioural models and reasoning over the collected data, the Framework aims to evaluate a person's emotional state in order to empower human-centric applications, along with the capability of storing episodic information about a person's life, with physiological indicators of emotional states, for use by a new generation of applications.
Abstract:
Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, and they generate a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations towards regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed, owing to the urgency of sheltering affected populations; however, by neglecting exposure risks in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which will allow new houses to better withstand the passage of time and natural disasters in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, identifying key criteria that housing should meet.
This methodology is based on the requirements and recommendations of the US Federal Emergency Management Agency for specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to the sea level rise resulting from climate change – was analyzed from the perspective of post-disaster housing recovery. The analysis applied the proposed methodology with the intent of assessing the flood resilience of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
Abstract:
Clinical analysis laboratories have undergone an impressive evolution and must now provide an excellent service at increasingly competitive costs. Quality management systems play a significant role in this evolution, mainly through the pursuit of continuous improvement, which occurs not only at the level of processes and techniques but also in the qualification of the various stakeholders. One of the fundamental problems in managing a laboratory is the elimination of waste and errors while creating benefits, the core concept of the Lean Thinking philosophy, so it is essential to be able to monitor critical tasks systematically. In a laboratory increasingly focused on the user, this monitoring can be accomplished through information systems and technologies, which make it possible to record the number of users, peak hours, average time spent in the waiting room, average turnaround time for test results, results delivered after the expected date, and other decision-support data. Complaints and user satisfaction should also be analysed, both through the feedback given to employees and through satisfaction questionnaires. Two models were mainly used: the European Customer Satisfaction Index (ECSI), directed at the client, and the Common Assessment Framework (CAF), used in both the client and the employee surveys. Two questionnaires were introduced in digital format, one at a collection centre via an electronic kiosk and another on the laboratory's web page, both as alternatives to the existing paper questionnaire; the resulting data were analysed and the appropriate conclusions drawn. A questionnaire for employees was also proposed and developed, intended to provide useful decision-support data, given the importance of employees in customer interaction and in quality assurance throughout the whole process. The results were evaluated globally, although internal company policy prevented their presentation, and some benefits of this questionnaire were discussed empirically. The main goals of this work were to implement electronic satisfaction questionnaires and to analyse the results obtained, comparing them with the ECSI study, in order to emphasise the importance of analysing professional motivation and customer satisfaction simultaneously, with the aim of improving decision-support systems.
Abstract:
This research is titled "The Future of Airline Business Models: Which Will Win?" and is part of the requirements for the award of a Master's in Management from Nova SBE and another from Luiss Guido Carli University. The purpose is to elaborate a complete market analysis of the European air transportation industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was carried out. Then, a detailed overview of the main European airlines and the strategies they have been implementing so far was developed. Finally, the research is illustrated with three case studies.