959 results for Polynomial distributed lag models


Relevance:

20.00%

Publisher:

Abstract:

In this thesis, a semi-automated cell analysis system based on image processing is described. To this end, an image processing algorithm was studied in order to segment cells in a semi-automatic way. The main goal is to improve the performance of the cell image segmentation process without significantly affecting the quality of the results. Although a fully manual system can produce the best results, it has the disadvantage of being slow and repetitive when a large number of images must be processed. An active contour algorithm was tested on a sequence of images taken with a microscope. This algorithm, commonly known as snakes, lets the user define an initial region that encloses the cell. The algorithm then runs iteratively, making the contour of the initial region converge to the cell boundary. From the final contour, it was possible to extract region properties and produce statistical data. These data show that the algorithm produces results similar to those of a purely manual system, but at a faster rate. On the other hand, it is slower than a fully automatic approach, but it allows the user to adjust the contour, making it more versatile and tolerant to image variations.

Relevance:

20.00%

Publisher:

Abstract:

Theoretical epidemiology aims to understand the dynamics of diseases in populations and communities. Biological and behavioral processes are abstracted into mathematical formulations which aim to reproduce epidemiological observations. In this thesis a new system for the self-reporting of syndromic data — Influenzanet — is introduced and assessed. The system is currently being extended to address greater challenges of monitoring the health and well-being of tropical communities.(...)

Relevance:

20.00%

Publisher:

Abstract:

Amyotrophic Lateral Sclerosis (ALS) is the most severe and common adult-onset disorder affecting motor neurons in the spinal cord, brainstem and cortex, resulting in progressive weakness and death from respiratory failure within two to five years of symptom onset(...)

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, there is a significant increase in the demand for interoperable systems for exchanging data in collaborative business environments. Consequently, cooperation agreements between the enterprises involved have been brought to light. However, even within the same community or domain, there is a wide variety of knowledge representations that are not semantically coincident, which gives rise to interoperability problems in enterprise information systems that need to be addressed. Moreover, most organizations face other problems with their information systems: 1) domain knowledge is not easily accessible to all stakeholders (even intra-enterprise); 2) domain knowledge is not represented in a standard format; 3) even when it is available in a standard format, it is not supported by semantic annotations or described using a common, understandable lexicon. This dissertation proposes an approach for establishing an enterprise reference lexicon from business models. It addresses the automation of information-model mapping for the construction of the reference lexicon. It aggregates a formal and conceptual representation of the business domain with a clear definition of the lexicon used, to facilitate an overall understanding by all the stakeholders involved, including non-IT personnel.
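
The mapping step can be illustrated with a deliberately tiny sketch. Everything here is hypothetical (the two model fragments, the synonym table and the `build_lexicon` helper are invented, not from the dissertation): terms harvested from two enterprise information models are normalised and merged into a single reference lexicon that keeps provenance for every entry.

```python
# Hypothetical mini-example of merging terms from two enterprise models
# into one reference lexicon with provenance; all contents are invented.

model_a = {"Client": "party that buys goods",
           "PO": "purchase order issued to a supplier"}
model_b = {"customer": "buying party",
           "Purchase Order": "formal buying document"}

# Synonym table, assumed to come from domain experts.
synonyms = {"client": "customer", "po": "purchase order"}

def normalize(term):
    # Map each raw term to its reference form.
    return synonyms.get(term.strip().lower(), term.strip().lower())

def build_lexicon(*models):
    lexicon = {}
    for idx, model in enumerate(models):
        for term, definition in model.items():
            ref = normalize(term)
            entry = lexicon.setdefault(ref, {"definitions": [], "sources": []})
            entry["definitions"].append(definition)
            entry["sources"].append(f"model_{idx}")   # keep provenance
    return lexicon

lexicon = build_lexicon(model_a, model_b)
print(sorted(lexicon))   # ['customer', 'purchase order']
```

Keeping every source definition under one reference term is what lets non-IT stakeholders see, in one place, how each unit of the enterprise names the same concept.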

Relevance:

20.00%

Publisher:

Abstract:

Computational power is increasing day by day. Despite that, some tasks are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is still an area under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation on a large scale. Crowdsourcing is a novel approach to collecting labels quickly and cheaply by sourcing them from the crowd. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area in which annotation models must be created or adapted to handle such weakly labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness; other specific aspects are considered by noisy-label analysis techniques. The main contribution of this thesis is a process to collect reliable crowdsourced labels for a facial expression dataset. The process consists of two steps: first, crowdsourcing tasks are designed to collect annotators' labels; next, the true label is inferred from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset is created, containing 40,000 images and their respective labels. Finally, the resulting dataset is published.

Relevance:

20.00%

Publisher:

Abstract:

The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they propitiate particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. 
In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants, and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience are primarily shaped by a relatively small number of producers. In recent years, however, three interconnected communities (the makers, hackerspaces, and open source hardware) have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.

Relevance:

20.00%

Publisher:

Abstract:

ABSTRACT - Background: Population ageing in the more developed countries and the rising incidence of chronic diseases associated with dependency and disability have contributed to the design and implementation of new health and social policies. We are therefore currently witnessing a shift in the paradigm of health-care demand, with growing demand for long-term or continued care. The development and implementation of new models of health-care delivery aim to respond to this growing demand for long-term care, as well as to promote the efficiency of services and the availability of hospital beds, removing from acute services those people who do not need hospital care but rather long-term care. In this context, the Rede Nacional de Cuidados Continuados Integrados (RNCCI) was created in Portugal as a response to the increasing number of people in situations of dependency, who need both health and social care, and to the need to reorganise and promote the efficiency of hospital inpatient services. Objective: To determine the impact of the RNCCI on the average hospital length of stay in the period from 1 January 2009 to 31 June 2011. Methods: Based on a literature review, the study describes the main aspects of population ageing and long-term care. Different organisational models and programmes for the delivery of long-term care, and their impact on the average hospital length of stay, were described. The study population was determined for the period from 1 January 2009 to 31 June 2011. The population was characterised by year and distributed over ten quarters for better statistical treatment and reading of the data.
The sex and age group of the referred individuals were considered, according to the hospital-admission GDH (diagnosis-related group) and the corresponding health sub-region. The average length of stay of hospital admissions was compared, by quarter, with the average length of stay of the episodes referred to the RNCCI, at national level and at the level of the health sub-regions. The GDHs accounting for 50% of referrals were characterised. The three GDHs with the largest number of referrals to the RNCCI were analysed by semester for the different health regions, comparing the corresponding national and regional average lengths of stay. Results: In the period under analysis, the population making the greatest use of RNCCI services was aged 65 or over, accounting for 79.4% of all referrals. Fifty percent of referrals concerned GDH 14, GDH 211, GDH 533, GDH 818, GDH 810 and GDH 209. The national average length of stay was between 7.3 and 7.7 days, compared with an average length of stay of between 21.9 and 33 days for episodes referred to the RNCCI over the same period. At regional level, the LVT region had the highest average lengths of stay, ranging from 28.8 to 50.3 days. For GDH 14, the average length of stay of referred episodes was between 14.4 and 26.7 days, while over the same period the national average for the same GDH was between 9.8 and 10.2 days. For GDH 211, the average length of stay of referred episodes was between 17.2 and 28.9 days, compared with a national average of between 12.5 and 13.5 days for the same GDH. For GDH 533, the average length of stay of referred episodes was between 23.3 and 52.7 days.
By comparison, over the same period, the national average length of stay for the same GDH was between 18.7 and 19.7 days. Conclusions: Regarding the impact of the RNCCI on the average hospital length of stay, it was possible to conclude that the average length of stay of episodes referred to the Rede was higher than the national average throughout the period under analysis. As for the GDHs with the largest number of referrals (GDH 14, 211 and 533), all of them showed an average length of stay at referral higher than the national average, and higher than the regional average for the same GDH, throughout the study period. In other words, the average length of stay of individuals with the same GDH was higher among those referred to the RNCCI.

Relevance:

20.00%

Publisher:

Abstract:

ABSTRACT - Background: Individuals, like institutions, are not immune to incentives. However, while institutional incentive models have undergone various evolutions, the same has not happened at the level of professionals. This situation is not compatible with the complexity of human resource management and should be remedied in order to foster alignment between institutional interests and those of the professionals themselves. Objectives: To study the allocation of incentives to health professionals in the context of organisations with vertical integration of care. Methods: The methodology adopted comprised three phases. The first was a systematic literature review on: (1) the construction of incentive models for professionals in different health systems and types of provider; and (2) the identification of measures of proven cost-effectiveness. Based on this evidence, together with official documentation on the ULS financing model, a baseline incentive model was built in a second phase using Microsoft Excel. Finally, in a third phase, the baseline model built in the previous phase was adapted using information obtained from a retrospective in loco study at the ULS of Baixo Alentejo (ULSBA). In addition, the impact from the perspective of the ULS and of the professionals was estimated for the baseline scenario and for several sensitivity analyses. Results: In terms of structure, the baseline incentive model for professionals comprises 44 indicators distributed across five dimensions of analysis; 28 indicators (63.6%) are process indicators and 14 (31.8%) are outcome indicators. Across the dimensions, indicators of efficiency and quality of care predominate, totalling 35 (i.e. 79.5% of the 44 indicators).
As regards the recipient, 14 indicators (31.8%) take a holistic view of the ULS, 17 (38.6%) are assigned solely to primary care and the remaining 13 (29.5%) to hospital care. About 85% of ULSBA's current incentives derive from the salary payment unit, followed by the payment of supplements (12%). Nevertheless, the retrospective study of ULSBA confirmed the expected scenario of the absence of a homogeneous incentive model applied across the whole ULS, revealing important asymmetries between different provider units and/or health professionals. Notably, there is a shortage of capitation-based incentives (unlike the financing model of ULSBA itself) or of incentives tied to performance indices. Considering the incentive model designed and adapted to the reality of ULSBA, together with the implementation plan, the incentive model is estimated to generate: (1) savings from the perspective of the ULS (between 2.5% and 3.5% of ULSBA's overall budget); and (2) an increase in remuneration for professionals (between 5% and 15% of base salary). This apparently contradictory result stems from the focus on measures of proven cost-effectiveness and from the alignment between the proposed model and the one in force for the unit's own financing, in a clear mutual-gains strategy. The sensitivity analyses carried out confirm the solidity and robustness of the model under significant variations in key parameters.

Relevance:

20.00%

Publisher:

Abstract:

The “CMS Safety Closing Sensors System” (SCSS, or CSS for brevity) is a remote monitoring system designed to control safety clearance and tight mechanical movements of parts of the CMS detector, especially during CMS assembly phases. We present the different systems that make up the SCSS: its sensor technologies, the readout system, and the data acquisition and control software. We also report calibration and installation details, which determine the resolution and limits of the system. We present as well our experience from operating the system and from the analysis of the data collected since 2008. Special emphasis is given to studying positioning reproducibility during detector assembly and to understanding how magnetic fields influence the detector structure.

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze malaria counts in the Brazilian Amazon forest for the period from 1999 to 2008. In this study, we included covariates that could be important in the yearly prediction of malaria, such as the deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that the deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.
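
A minimal sketch of the modelling idea, not the paper's actual spatiotemporal model: a Poisson regression of area-level counts on a single deforestation covariate, fitted with a random-walk Metropolis sampler under flat priors. All data are simulated here, and the variable names and tuning settings are invented.

```python
# Toy Bayesian Poisson regression via random-walk Metropolis; data simulated.
import math
import random

random.seed(0)

def sample_poisson(lam):
    # Knuth's algorithm; adequate for the modest rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulated "yearly counts": log-rate = b0 + b1 * deforestation, truth (0.5, 1.0).
defor = [i / 10 for i in range(30)]
counts = [sample_poisson(math.exp(0.5 + 1.0 * d)) for d in defor]

def log_post(b0, b1):
    # Poisson log-likelihood; flat priors contribute nothing.
    return sum(y * (b0 + b1 * d) - math.exp(b0 + b1 * d)
               for d, y in zip(defor, counts))

b0 = b1 = 0.0
lp = log_post(b0, b1)
draws = []
for i in range(8000):
    c0 = b0 + random.gauss(0, 0.08)      # random-walk proposal
    c1 = b1 + random.gauss(0, 0.08)
    lpc = log_post(c0, c1)
    if random.random() < math.exp(min(0.0, lpc - lp)):
        b0, b1, lp = c0, c1, lpc         # accept
    if i >= 3000:                        # discard burn-in
        draws.append(b1)

b1_mean = sum(draws) / len(draws)
print(round(b1_mean, 1))  # posterior mean of the slope; should be near the true 1.0
```

The paper's actual model adds spatial and temporal random effects on top of such covariate terms; the Metropolis mechanic of proposing, comparing posteriors, and accepting or rejecting is the same.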

Relevance:

20.00%

Publisher:

Abstract:

This work models the competitive behaviour of individuals who maximize their own utility by managing their network of connections with other individuals. Utility is taken as a synonym of reputation in this model. Each agent decides on two variables: the quality of connections and the number of connections. Hence, the reputation of an individual is a function of the number and quality of connections within the network. On the other hand, individuals incur a cost when they improve their network of contacts. The initial quality and number of connections of each individual are distributed according to a given initial distribution. The competition occurs over continuous time and among a continuum of agents. A mean field game approach is adopted to solve the model, leading to an optimal trajectory for the number and quality of connections of each individual.
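
A toy discretisation of the idea only, not the thesis's mean-field-game solution: suppose each agent holds n connections of quality q, with reputation r = n·q, and at each step chooses the increments (dn, dq) that maximise marginal reputation q·dn + n·dq minus a quadratic effort cost (c/2)(dn² + dq²), whose first-order conditions give dn* = q/c and dq* = n/c. All parameter values and the initial distribution below are invented.

```python
# Toy myopic best-response dynamics for the (n, q) network model; all invented.
import random

random.seed(1)
c, dt, steps = 5.0, 0.1, 50   # effort cost, time step, horizon

# Initial (given) distribution over a sample of the agent population.
agents = [{"n": random.uniform(1.0, 3.0), "q": random.uniform(0.5, 1.5)}
          for _ in range(200)]
init_rep = sum(a["n"] * a["q"] for a in agents) / len(agents)

for _ in range(steps):
    for a in agents:
        dn = a["q"] / c   # first-order condition for the number increment
        dq = a["n"] / c   # first-order condition for the quality increment
        a["n"] += dn * dt
        a["q"] += dq * dt

mean_rep = sum(a["n"] * a["q"] for a in agents) / len(agents)
print(mean_rep > init_rep)   # True: reputation grows along each trajectory
```

In the actual mean field game, agents are forward-looking and react to the evolving population distribution rather than behaving myopically; this sketch only shows how the two control variables co-evolve under a quadratic effort cost.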

Relevance:

20.00%

Publisher:

Abstract:

This paper analyses the boundaries of simplified wind turbine models used to represent the behavior of wind turbines in order to conduct power system stability studies. Based on experimental measurements, the response of recent simplified (also known as generic) wind turbine models that are currently being developed by the International Standard IEC 61400-27 is compared to complex detailed models elaborated by wind turbine manufacturers. This International Standard, whose Technical Committee was convened in October 2009, is focused on defining generic simulation models for both wind turbines (Part 1) and wind farms (Part 2). The results of this work provide an improved understanding of the usability of generic models for conducting power system simulations.

Relevance:

20.00%

Publisher:

Abstract:

The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high-throughput of cell models can contribute to increase the efficiency of drug development in the pharmaceutical industry. Recapitulation of liver functionality in vitro requires the development of advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness, limited scalability and compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized towards the improvement of hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and hepatocytes retain differentiated hepatic functions, stability of drug metabolism enzymes and higher viability in co-cultures. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for 3D culture of human hepatic cell models, which are reproducible, scalable and compatible with screening platforms. 
The phenotypic and functional characterization of the in vitro systems performed contributes to the state of the art of human hepatic cell models and can be applied to improve the efficiency of pre-clinical drug development, to model disease and, ultimately, to develop cell-based therapeutic strategies for liver failure.

Relevance:

20.00%

Publisher:

Abstract:

This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, this paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can predict market direction the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, this paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
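
The prediction mechanic can be sketched with a hand-specified two-state HMM ("bull"/"bear"); the paper instead trains its models on grouped return sequences, and all the numbers below are invented for illustration. A forward pass filters the hidden state from the observed sequence of daily return signs, and a one-step-ahead propagation then gives the probability of the next move being up.

```python
# Toy two-state HMM: forward filtering plus one-step-ahead direction forecast.
# Transition, emission and prior probabilities are all invented.
A = [[0.8, 0.2],    # transitions: bull->(bull, bear), bear->(bull, bear)
     [0.3, 0.7]]
B = [[0.7, 0.3],    # emissions: P(up | state), P(down | state)
     [0.4, 0.6]]
pi = [0.5, 0.5]     # prior over the initial state
UP, DOWN = 0, 1
obs = [UP, UP, DOWN, UP, UP]    # toy sequence of daily return signs

# Forward algorithm with per-step normalisation (state filtering).
alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
for o in obs[1:]:
    alpha = [B[s][o] * sum(alpha[t] * A[t][s] for t in range(2))
             for s in range(2)]
    z = sum(alpha)
    alpha = [a / z for a in alpha]

# One-step-ahead prediction: propagate the filtered state, then emit.
state_next = [sum(alpha[t] * A[t][s] for t in range(2)) for s in range(2)]
p_up = sum(state_next[s] * B[s][UP] for s in range(2))
direction = "up" if p_up > 0.5 else "down"
print(direction)   # up
```

Identifying phonemes in speech recognition uses exactly this forward machinery; here the "phonemes" are market regimes and the emissions are return signs.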

Relevance:

20.00%

Publisher:

Abstract:

The lives of humans and of most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in our brain, for immediate use or for storage in memory and later recall. Among the reasoning aspects, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and forming part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But before data can be assessed for usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. The limitations of systems in handling sensory data, compared with our own sensory capabilities, constitute one identified problem. Another is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing these problems is the motivation for the present research work. The mission hereby assumed is to incorporate sensory and physiological data into a Framework able to manage collected data in support of human cognitive functions, supported by a new data model.
By learning from selected human functional and behavioural models and by reasoning over collected data, the Framework aims to assess a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information about a person's life, with physiological indicators of emotional states, to be used by new-generation applications.