978 results for Advanced Transaction Models


Relevance:

20.00%

Publisher:

Abstract:

Despite the significant advances made in recent years in pharmacology and diagnostic testing, acute myocardial infarction and sudden cardiac death remain the first manifestation of coronary atherosclerosis in a significant proportion of patients, many of whom were previously asymptomatic. Traditionally, the diagnostic exams used to evaluate possible coronary artery disease are based on the documentation of myocardial ischemia, and their positivity therefore depends on the presence of obstructive coronary stenosis. Nonobstructive coronary lesions, however, are also frequently involved in the development of coronary events. Although the absolute risk of instability per plaque is higher for more obstructive, higher-burden plaques, these are much less frequent than nonobstructive lesions; in probabilistic terms, coronary events are therefore often the result of rupture or erosion of the latter. Recent advanced intracoronary imaging studies provided evidence that, although it is possible to identify some features of vulnerability in plaques associated with the subsequent development of coronary events, their sensitivity and specificity are too limited for clinical application. More important than the risk associated with any individual plaque may be the global risk of the whole coronary tree, reflecting the summed probabilities of all its lesions: the higher the coronary atherosclerotic burden, the higher the risk for the patient.
Cardiac CT (coronary CT angiography) is still a young modality. It is the most recent noninvasive imaging technique for the study of coronary artery disease, and its development was made possible by important advances in multidetector CT technology. These allowed significant improvements in temporal and spatial resolution, leading to better image quality and impressive reductions in radiation dose. At the same time, increasing experience with the technique led to a growing body of scientific evidence, making cardiac CT a robust imaging tool for the evaluation of coronary artery disease and broadening its clinical indications. More recently, several publications documented its prognostic value, marking the transition of cardiac CT to adulthood. Besides excluding the presence of coronary artery disease and of obstructive lesions, cardiac CT also allows the identification of nonobstructive lesions, a capability unique among noninvasive imaging modalities.
By evaluating both obstructive and nonobstructive lesions, cardiac CT can provide a quantification of the total coronary atherosclerotic burden, and this can be useful to stratify the risk of future coronary events. In the present work it was possible to identify significant demographic and clinical predictors of a high coronary atherosclerotic burden as assessed by cardiac CT, but with modest odds ratios, even when the individual variables were gathered into clinical scores. Among these different clinical scores, performance was best for the HeartScore cardiovascular risk score. This modest performance underlines the limitations of predicting the presence and severity of coronary disease from clinical variables alone, even when combined into risk scores. One of the classical risk factors, obesity, in fact showed a paradoxical relation with coronary atherosclerotic burden, which might explain some of the limitations of the clinical models. Diabetes mellitus, by contrast, was one of the strongest clinical predictors and was taken as a model of more advanced coronary disease, useful for evaluating the performance of different plaque burden scores.
Given the high prevalence of plaques identifiable in the coronary tree of patients undergoing cardiac CT, it is of the utmost importance to develop tools that quantify the total coronary atherosclerotic burden and so identify patients who could benefit from more intensive preventive measures. This was the rationale for developing a coronary atherosclerotic burden score reflecting the comprehensive information on localization, degree of stenosis and plaque composition provided by cardiac CT: the CT-LeSc. This score may become a useful tool to quantify total coronary atherosclerotic burden and is expected to convey the strong prognostic information of cardiac CT. Lastly, the concept of the vulnerable coronary tree may become more important than that of the vulnerable plaque, and its assessment by cardiac CT may become important in a more advanced primary prevention strategy. This could lead to more personalised primary prevention, tailoring the intensity of preventive measures to the atherosclerotic burden, and this may become one of the most important indications for cardiac CT in the near future.
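The scoring idea behind a score like the CT-LeSc, a per-lesion value combining location, stenosis degree and plaque composition, summed over the whole coronary tree, can be sketched as follows. The categories and weights below are invented for illustration; the abstract does not state the published CT-LeSc weighting.

```python
# Illustrative per-lesion atherosclerotic burden score. The weights
# are hypothetical placeholders, NOT the published CT-LeSc values.
STENOSIS_WEIGHT = {"nonobstructive": 1, "obstructive": 2}        # hypothetical
PLAQUE_WEIGHT = {"calcified": 1, "mixed": 2, "noncalcified": 3}  # hypothetical
LOCATION_WEIGHT = {"proximal": 2, "mid": 1.5, "distal": 1}       # hypothetical

def lesion_score(location, stenosis, plaque_type):
    """Score a single lesion from its three CT features."""
    return (LOCATION_WEIGHT[location]
            * STENOSIS_WEIGHT[stenosis]
            * PLAQUE_WEIGHT[plaque_type])

def total_burden(lesions):
    """Total burden: sum over all lesions, obstructive and nonobstructive alike."""
    return sum(lesion_score(*l) for l in lesions)

lesions = [
    ("proximal", "obstructive", "mixed"),       # 2 * 2 * 2 = 8
    ("distal", "nonobstructive", "calcified"),  # 1 * 1 * 1 = 1
]
print(total_burden(lesions))  # 9
```

The key point the abstract makes is captured by `total_burden`: nonobstructive lesions contribute to the score too, so a tree with many mild plaques can outscore one with a single severe stenosis.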

Relevance:

20.00%

Publisher:

Abstract:

Theoretical epidemiology aims to understand the dynamics of diseases in populations and communities. Biological and behavioral processes are abstracted into mathematical formulations which aim to reproduce epidemiological observations. In this thesis a new system for the self-reporting of syndromic data — Influenzanet — is introduced and assessed. The system is currently being extended to address greater challenges of monitoring the health and well-being of tropical communities.(...)

Relevance:

20.00%

Publisher:

Abstract:

Amyotrophic Lateral Sclerosis (ALS) is the most severe and most common adult-onset disorder affecting motor neurons in the spinal cord, brainstem and cortex, resulting in progressive weakness and death from respiratory failure within two to five years of symptom onset (...)

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, there is a significant increase in the demand for interoperable systems for exchanging data in collaborative business environments, and cooperation agreements between the enterprises involved have consequently come to the fore. However, even within a single community or domain there is a wide variety of knowledge representations that are not semantically coincident, which creates interoperability problems in enterprise information systems that need to be addressed. Moreover, most organizations face further problems with their information systems: 1) domain knowledge is not easily accessible to all the stakeholders (even intra-enterprise); 2) domain knowledge is not represented in a standard format; 3) even when it is available in a standard format, it is not supported by semantic annotations or described using a common and understandable lexicon. This dissertation proposes an approach for the establishment of an enterprise reference lexicon from business models. It addresses the automation of the information-model mapping used to build the reference lexicon, and it aggregates a formal, conceptual representation of the business domain with a clear definition of the lexicon used, facilitating an overall understanding by all the stakeholders involved, including non-IT personnel.
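The mapping step, aligning terms from different information models into one reference lexicon, can be sketched with simple normalised string similarity. The threshold and matching rule here are assumptions for illustration, not the dissertation's actual mapping algorithm.

```python
# Sketch: merge terms from two enterprise models into one reference
# lexicon by normalised string similarity. Threshold and rule are
# invented for illustration.
from difflib import SequenceMatcher

def normalise(term):
    return term.lower().replace("_", " ").replace("-", " ").strip()

def similar(a, b, threshold=0.85):
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

def build_lexicon(model_a, model_b):
    """Map each term of model_b onto a term of model_a when their
    normalised forms are close enough; unmatched terms are kept."""
    lexicon = {t: {t} for t in model_a}
    for term in model_b:
        match = next((t for t in lexicon if similar(t, term)), None)
        if match:
            lexicon[match].add(term)
        else:
            lexicon[term] = {term}
    return lexicon

lex = build_lexicon(["Customer", "Purchase_Order"],
                    ["customer", "purchase order", "Invoice"])
print(sorted(lex))  # ['Customer', 'Invoice', 'Purchase_Order']
```

Each lexicon entry groups the variant spellings found across models under one reference term, which is the kind of shared vocabulary non-IT stakeholders can then work with.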

Relevance:

20.00%

Publisher:

Abstract:

Computational power is increasing day by day. Despite that, some tasks are still difficult or even impossible for a computer to perform. For example, while identifying a facial expression is easy for a human, for a computer it is still an area under development. To tackle this and similar issues, crowdsourcing has grown as a way to use human computation at large scale. Crowdsourcing is a novel approach to collecting labels quickly and cheaply by sourcing them from the crowd. However, these labels lack reliability, since annotators are not guaranteed to have any expertise in the field. This fact has led to a new research area where we must create or adapt annotation models to handle such weakly labeled data. Current techniques explore the annotators' expertise and the task difficulty as variables that influence label correctness; other specific aspects are also considered by noisy-label analysis techniques. The main contribution of this thesis is a process to collect reliable crowdsourced labels for a facial expressions dataset. This process consists of two steps: first, we design our crowdsourcing tasks to collect annotators' labels; next, we infer the true label from the collected labels by applying state-of-the-art crowdsourcing algorithms. At the same time, a facial expression dataset is created, containing 40,000 images and their respective labels. At the end, we publish the resulting dataset.
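The second step, inferring a true label from several noisy crowd labels, can be sketched as a majority vote followed by one round of reweighting annotators by their agreement with the consensus. This is the general idea behind Dawid-Skene-style aggregation, not the thesis's exact algorithm; the labels below are invented.

```python
# Sketch: aggregate noisy crowd labels into one label per item.
from collections import Counter, defaultdict

def majority(labels_by_item):
    """Plain majority vote per item."""
    return {item: Counter(ls.values()).most_common(1)[0][0]
            for item, ls in labels_by_item.items()}

def annotator_weights(labels_by_item, consensus):
    """Weight each annotator by agreement with the consensus."""
    agree, total = defaultdict(int), defaultdict(int)
    for item, ls in labels_by_item.items():
        for annotator, label in ls.items():
            total[annotator] += 1
            agree[annotator] += (label == consensus[item])
    return {a: agree[a] / total[a] for a in total}

def weighted_vote(labels_by_item, weights):
    """Re-vote, counting reliable annotators more heavily."""
    result = {}
    for item, ls in labels_by_item.items():
        scores = defaultdict(float)
        for annotator, label in ls.items():
            scores[label] += weights[annotator]
        result[item] = max(scores, key=scores.get)
    return result

labels = {
    "img1": {"ann1": "happy", "ann2": "happy", "ann3": "sad"},
    "img2": {"ann1": "sad", "ann2": "happy", "ann3": "sad"},
}
consensus = majority(labels)
weights = annotator_weights(labels, consensus)
print(weighted_vote(labels, weights))  # {'img1': 'happy', 'img2': 'sad'}
```

Full algorithms iterate this reweighting to convergence and also model per-item difficulty, which the abstract mentions as a variable influencing label correctness.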

Relevance:

20.00%

Publisher:

Abstract:

Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the code being developed; furthermore, communication and coordination are simplified. This proposal explores collaboration on a very structured and rigid model, where collaboration currently follows the copy-modify-merge paradigm, in which a developer gets a private copy of the shared repository, modifies it in isolation, and later uploads the changes to be merged with modifications produced concurrently by other developers. To this end, we designed and implemented an extension to the OutSystems Platform to enable real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across the developers working on the same project. We believe it is possible to achieve much more intense collaboration over the same models with little negative impact on each developer's individual productivity.
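The copy-modify-merge cycle the abstract describes can be sketched as a three-way merge: each developer's copy is compared against the common base, non-conflicting edits are combined, and edits to the same artefact are flagged. The dict-of-artefacts representation and names are invented for illustration; this is not the OutSystems Platform's merge engine.

```python
# Sketch: three-way merge of a model represented as a flat dict
# mapping artefact name -> version/content.
def three_way_merge(base, mine, theirs):
    merged, conflicts = dict(base), []
    for k in set(base) | set(mine) | set(theirs):
        b, m, t = base.get(k), mine.get(k), theirs.get(k)
        if m == t:            # same change (or no change) on both sides
            merged[k] = m
        elif m == b:          # only they changed it
            merged[k] = t
        elif t == b:          # only I changed it
            merged[k] = m
        else:                 # both changed it differently: conflict
            conflicts.append(k)
    # Drop artefacts deleted (None) on the winning side
    merged = {k: v for k, v in merged.items() if v is not None}
    return merged, conflicts

base   = {"ScreenA": "v1", "ActionB": "v1"}
mine   = {"ScreenA": "v2", "ActionB": "v1"}
theirs = {"ScreenA": "v1", "ActionB": "v1", "ScreenC": "v1"}
merged, conflicts = three_way_merge(base, mine, theirs)
print(merged, conflicts)  # both edits kept, no conflicts
```

Real-time collaboration removes the long isolation phase of this cycle, so conflicts are caught at the granularity of individual edits instead of whole upload sessions.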

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly malaria registers in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze the Brazilian Amazon forest malaria counts for the period from 1999 to 2008. In this study, we included covariates that could be important in the yearly prediction of malaria, such as the deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that the deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.

Relevance:

20.00%

Publisher:

Abstract:

The authors report the case of a female infant with Group III (or Grade III) megaesophagus secondary to vector-borne Chagas disease, resulting in severe malnutrition that reversed after surgery (Heller technique). The infant was then treated with the antiparasitic drug benznidazole, and the infection was cured, as demonstrated serologically and parasitologically. After follow-up of several years without evidence of disease, with satisfactory weight and height development, the patient had her first child at age 23, in whom serological tests for Chagas disease yielded negative results. Thirty years after the initial examination, the patient's electrocardiogram, echocardiogram, and chest radiography remained normal.

Relevance:

20.00%

Publisher:

Abstract:

This paper analyses the limits of the simplified wind turbine models used to represent wind turbine behavior in power system stability studies. Based on experimental measurements, the response of the recent simplified (also known as generic) wind turbine models currently being developed under International Standard IEC 61400-27 is compared with the complex detailed models developed by wind turbine manufacturers. This International Standard, whose Technical Committee was convened in October 2009, is focused on defining generic simulation models for both wind turbines (Part 1) and wind farms (Part 2). The results of this work provide an improved understanding of the usability of generic models for conducting power system simulations.
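The kind of simplification behind generic models can be illustrated by collapsing the fast converter and control dynamics into a single first-order lag on the active-power reference. This is a sketch of the idea only, not the actual IEC 61400-27 block structure, and the time constant is an invented value.

```python
# Sketch: first-order lag on the active-power reference, a stand-in
# for the detailed converter dynamics a generic model abstracts away.
def first_order_lag(p_ref, p0=0.0, tau=0.3, dt=0.01):
    """Simulate P(t) (per unit) tracking a reference series p_ref."""
    p, out = p0, []
    for ref in p_ref:
        p += dt / tau * (ref - p)  # forward-Euler step of tau*dP/dt = Pref - P
        out.append(p)
    return out

# Step from 0 to 1 p.u.: after ~3 time constants the output is ~95 %.
response = first_order_lag([1.0] * 100)  # 1 s of simulation
print(round(response[-1], 3))
```

Comparing such a low-order response against measured turbine behavior is essentially what the paper does at much larger scale: checking where the reduced model stays faithful and where the missing detail starts to matter.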

Relevance:

20.00%

Publisher:

Abstract:

This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original Hidden Markov Model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can then predict market direction the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
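The "score a sequence, then read off the most likely regime" step can be sketched with the standard HMM forward algorithm over discretised returns. All probabilities below are invented for illustration; the paper's actual model, states and estimation are not reproduced here.

```python
# Sketch: two-regime HMM scored with the forward algorithm, the same
# computation speech recognisers use to score phoneme sequences.
import numpy as np

states = ["up", "down"]
pi = np.array([0.5, 0.5])         # initial state distribution (invented)
A = np.array([[0.8, 0.2],         # state transition matrix (invented)
              [0.3, 0.7]])
# Emission probs for discretised returns: column 0 = negative, 1 = positive
B = np.array([[0.2, 0.8],         # "up" regime emits mostly positive returns
              [0.7, 0.3]])        # "down" regime emits mostly negative ones

def forward(obs):
    """Return P(obs) and the filtered state distribution after obs."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum(), alpha / alpha.sum()

obs = [1, 1, 1, 0, 1]             # a mostly positive return sequence
likelihood, filtered = forward(obs)
print(states[int(np.argmax(filtered))])  # prints "up" for this sequence
```

Grouping returns into such sequences is exactly the analogy the paper draws: a sequence of returns plays the role of an acoustic frame sequence, and the hidden regime plays the role of the phoneme.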

Relevance:

20.00%

Publisher:

Abstract:

The lives of humans and of most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in the brain for immediate use or stored in memory for later recall. Among the aspects of reasoning, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and forming part of our identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new ways of acquiring and integrating information. But before data can be assessed for usefulness, systems must capture it and ensure it is properly managed for the diverse goals it may serve. Portable and wearable devices can now gather and store information from the environment and from our bodies, using cloud-based services and Internet connections. The limitations of systems in handling sensory data, compared with our own sensory capabilities, constitute one identified problem; another is the lack of interoperability between humans and devices, as devices do not properly understand human emotional states and needs. Addressing these problems is the motivation for the present research work. The mission assumed here is to include sensory and physiological data in a Framework that manages the collected data in line with human cognitive functions, supported by a new data model.
By learning from selected models of human function and behaviour, and by reasoning over the collected data, the Framework aims to evaluate a person's emotional state in order to empower human-centric applications, along with the capability of storing episodic information about a person's life, with physiological indicators of emotional states, for use by a new generation of applications.
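One possible shape for the episodic records such a Framework would store, pairing context with physiological indicators and an inferred emotional state, can be sketched as a small data class. The field names and values are assumptions for illustration, not the thesis's actual data model.

```python
# Sketch: a hypothetical episodic record pairing context with
# physiological indicators of emotional state.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EpisodicRecord:
    timestamp: datetime
    context: str                       # what was happening
    heart_rate: float                  # physiological indicators
    skin_conductance: float
    inferred_emotion: str = "unknown"  # filled in by the Framework

records: list[EpisodicRecord] = []
records.append(EpisodicRecord(datetime(2015, 6, 1, 9, 30),
                              "morning meeting", 88.0, 4.2, "stress"))

# Later recall: all episodes tagged with a given emotional state
stressful = [r.context for r in records if r.inferred_emotion == "stress"]
print(stressful)  # ['morning meeting']
```

The recall query at the end mirrors the memory function the abstract describes: episodic information indexed by the emotional relevance the Framework assigned to it.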

Relevance:

20.00%

Publisher:

Abstract:

Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, allowing new houses to better withstand the passage of time and natural disasters as safely as possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause in housing, a methodology was proposed for flood-resilient housing models, in which key criteria that housing should meet were identified.
The methodology is based on the US Federal Emergency Management Agency requirements and recommendations in accordance with specific flood zones. Finally, a case study in the Maldives, one of the countries most vulnerable to sea level rise resulting from climate change, has been analyzed in the light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the resilience of the newly built housing to floods in the aftermath of the 2004 Indian Ocean Tsunami.
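A criteria-per-flood-zone assessment of the kind the methodology describes can be sketched as a simple checklist scorer. The zones, criteria and thresholds below are invented placeholders, not the FEMA requirements or the thesis's actual checklist.

```python
# Sketch: score a housing unit against flood-resilience criteria by
# flood zone. All criteria and zone mappings are hypothetical.
ZONE_REQUIREMENTS = {
    "high_risk": {"elevated_floor", "flood_resistant_materials", "anchored_foundation"},
    "moderate_risk": {"elevated_floor", "flood_resistant_materials"},
    "low_risk": {"flood_resistant_materials"},
}

def assess(house_features, zone):
    """Return the missing criteria and a simple compliance ratio."""
    required = ZONE_REQUIREMENTS[zone]
    missing = required - house_features
    return missing, 1 - len(missing) / len(required)

house = {"elevated_floor", "anchored_foundation"}
missing, ratio = assess(house, "high_risk")
print(sorted(missing), round(ratio, 2))  # ['flood_resistant_materials'] 0.67
```

Applied across a reconstruction programme, such a scorer makes explicit which houses fall short of the criteria for their zone, which is the gap the thesis's Maldives case study probes.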

Relevance:

20.00%

Publisher:

Abstract:

This research is titled “The Future of Airline Business Models: Which Will Win?” and is part of the requirements for the award of a Master's in Management from NOVA SBE and another from Luiss Guido Carli University. The purpose is to elaborate a complete market analysis of the European Air Transportation Industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was conducted. Then, a detailed overview of the main European airlines and the strategies they have been implementing so far was developed. Finally, the research is illustrated with three case studies.

Relevance:

20.00%

Publisher:

Abstract:

According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. Digital media proliferation offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of society-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists of the use of Information and Communication Technologies to mediate and transform the relations between citizens and governments, towards increasing citizens' participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies of the achieved outcomes reveal that it has not yet been successfully incorporated in institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable, added-value activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that BPM (Business Process Management), as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value.
Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation, and of their articulation, that bridges the gap between technical and non-technical users; (2) providing an organisational model which allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) providing a representation usable in software development for business process automation, which allows advanced querying with a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place. An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone to harnessing e-Participation value.
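The four-layer structure described for the ePOSM (Strategy, Organisational Units, Functions, Roles) and its querying capability can be sketched as a toy in-memory triple store. The instance names below are invented; the real artefact is a semi-formal ontology queried with a reasoner, not Python code.

```python
# Sketch: the ePOSM's four sub-ontology layers as subject-predicate-object
# triples, with a minimal query helper standing in for an inference engine.
triples = set()

def add(s, p, o):
    triples.add((s, p, o))

# e-Participation Strategy -> Organisational Units -> Functions -> Roles
add("OpenBudgetInitiative", "partOf", "eParticipationStrategy")
add("ParticipationOffice", "isA", "OrganisationalUnit")
add("ParticipationOffice", "executes", "OpenBudgetInitiative")
add("ModerateProposals", "isA", "Function")
add("ModerateProposals", "assignedTo", "ParticipationOffice")
add("Moderator", "isA", "Role")
add("Moderator", "performs", "ModerateProposals")

def query(p=None, o=None):
    """All subjects whose predicate/object match (None = wildcard)."""
    return sorted(s for s, pp, oo in triples
                  if (p is None or pp == p) and (o is None or oo == o))

print(query(p="isA", o="Role"))                        # ['Moderator']
print(query(p="assignedTo", o="ParticipationOffice"))  # ['ModerateProposals']
```

The second query illustrates point (4) of the abstract: given an organisational unit, software can retrieve the concrete e-Participation functions it is responsible for.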