951 results for Significance-driven computing
Abstract:
The interaction of ionising radiation with living tissues may directly or indirectly generate several secondary species with relevant genotoxic potential. Radiation-induced damage to biological systems has come under increasing scrutiny following recent findings that electrons with energies below the ionisation threshold can effectively damage DNA. The exact physico-chemical processes that occur in the first stages of electron-induced damage remain to be explained. It is also known, however, that free electrons have a short lifetime in the physiological medium. Hence, studies of electron transfer processes represent an alternative approach through which the role of "bound" electrons as a source of damage to biological tissues can be further explored. This thesis work consists of studying dissociative electron attachment (DEA) and electron transfer to taurine and thiaproline. The DEA measurements were performed at Siedlce University with Prof. Janina Kopyra under COST Action MP1002 (Nanoscale insights in ion beam cancer therapy). The electron transfer experiments were conducted in a crossed atom(potassium)-molecule beam arrangement. In these studies the anionic fragmentation patterns were obtained. The results of the two mechanisms are shown to differ significantly, revealing that the damaging potential of secondary electrons may otherwise be underestimated. In addition, sulphur atoms appear to strongly influence the dissociation process, demonstrating that certain reactions can be controlled by substituting sulphur at specific molecular sites.
Abstract:
The obligate intracellular bacterium Chlamydia trachomatis is a human pathogen of major public health significance. Strains can be classified into 15 main serovars (A to L3) that preferentially cause ocular infections (A-C), genital infections (D-K) or lymphogranuloma venereum (LGV) (L1-L3), but the molecular basis behind their distinct tropism, ecological success and pathogenicity is not well defined. Most chlamydial research demands culture in eukaryotic cell lines, but it is not known whether strains become laboratory adapted. Using essentially genomics and transcriptomics, we aimed to investigate the evolutionary patterns underlying the adaptation of C. trachomatis to the different human tissues, giving emphasis to the identification of molecular patterns of genes encoding hypothetical proteins, and to understand the adaptive process behind the C. trachomatis in vivo to in vitro transition. Our results highlight a positive selection-driven evolution of C. trachomatis towards niche-specific adaptation, essentially targeting host-interacting proteins, namely effectors and inclusion membrane proteins, some of which also displayed niche-specific expression patterns. We also identified potential "ocular-specific" pseudogenes and pointed out the major gene targets of adaptive mutations associated with LGV infections. We further observed that the in vivo-derived genetic makeup of C. trachomatis is not significantly compromised by long-term laboratory propagation. In contrast, its introduction in vitro has the potential to affect the phenotype, likely yielding virulence attenuation. In fact, we observed a "genital-specific" rampant inactivation of the virulence gene CT135, which may affect the interpretation of data derived from studies requiring culture. Globally, the findings presented in this Ph.D. thesis contribute to the understanding of C. trachomatis adaptive evolution and provide new insights into the biological role of C. trachomatis hypothetical proteins. They also raise research questions for future functional studies aiming to clarify the determinants of tissue tropism, virulence or pathogenic dissimilarities among C. trachomatis strains.
Abstract:
According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. The proliferation of digital media offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of society-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists of using Information and Communication Technologies to mediate and transform the relations among citizens and governments towards increasing citizens’ participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies of the achieved outcomes reveal that it has not yet been successfully incorporated into institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable added-value activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that BPM, as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value. Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, consisting of an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation and of their articulation, bridging the gap between technical and non-technical users; (2) providing an organisational model which allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) providing a representation usable in software development for business process automation, which allows advanced querying with a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place.
An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone towards harnessing e-Participation value.
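To illustrate how a lightweight ontology stack of this kind can be queried in practice, the sketch below builds a toy graph covering the four sub-ontologies (Strategy, Organisational Units, Functions and Roles) and retrieves which units execute which strategies with SPARQL. It uses the rdflib library, and every namespace, class and instance name is a made-up placeholder rather than the actual ePOSM vocabulary.

# Minimal sketch (hypothetical names, not the real ePOSM vocabulary) of a
# lightweight ontology stack queried with SPARQL via rdflib.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS

EPOSM = Namespace("http://example.org/eposm#")   # assumed namespace
g = Graph()
g.bind("eposm", EPOSM)

# The four sub-ontologies as top-level classes (assumed structure)
for cls in ("Strategy", "OrganisationalUnit", "Function", "Role"):
    g.add((EPOSM[cls], RDF.type, RDFS.Class))

# A toy e-Participation initiative: a strategy executed by one unit
g.add((EPOSM.OpenBudget2024, RDF.type, EPOSM.Strategy))
g.add((EPOSM.ParticipationOffice, RDF.type, EPOSM.OrganisationalUnit))
g.add((EPOSM.ParticipationOffice, EPOSM.executes, EPOSM.OpenBudget2024))
g.add((EPOSM.FacilitatorRole, RDF.type, EPOSM.Role))
g.add((EPOSM.FacilitatorRole, EPOSM.assignedTo, EPOSM.ParticipationOffice))

# The kind of query a reasoner or SPARQL engine would answer over the model
query = """
SELECT ?unit ?strategy WHERE {
    ?unit a eposm:OrganisationalUnit ;
          eposm:executes ?strategy .
}
"""
for unit, strategy in g.query(query, initNs={"eposm": EPOSM}):
    print(unit, "executes", strategy)

In a real deployment the same pattern would let an inference engine answer questions such as which operational unit is responsible for a given participatory process, as point (4) above describes.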
Abstract:
In the following text I will develop three major aspects. The first is to draw attention to what seem to have been the disciplinary fields in which, despite everything, the Digital Humanities (in the broad perspective adopted here) have asserted themselves in a more comprehensive manner. I think it is here that I run the greatest risks, not only for what I have mentioned above, but certainly because perhaps a significant part of the achievements and of the researchers may have escaped the look I sought to cast upon the past few decades, always influenced by my own experience and by the work carried out in the field of History. But this can be considered a work in progress, and it is open to criticism and suggestions. A second point to note is that emphasis will be given to the main lines of development in the relationship between historical research and digital methodologies, resources and tools. Finally, I will attempt a brief analysis of the appropriation of the Digital Humanities discourse in recent years, with admittedly debatable data and methods, because studies are still scarce and little systematic information is available that would allow one to go beyond an introductory reflection.
Abstract:
Recent studies have described widespread stratigraphic units of Late Pleistocene and Holocene age in the western part of the Amazon Basin. The recognition of deltaic sedimentation in the uppermost of these units near Rio Branco, Brazil, at a modern elevation of approximately 500 feet, leads to the conclusion that this area lay on the edge of a large Amazonian lake that existed in the recent past, when Andean tectonism caused active downwarping of the western edge of the Amazon Basin. The ramifications of this "Lago Amazonas" hypothesis extend into every area of modern Amazonian geology and biology.
Abstract:
The year is 2015 and the startup and tech business ecosphere has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding – the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine out of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail – but the general consensus among top tech executives is that “startups make products that no one wants” (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. It was in this book that he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today’s fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customers’ time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, with the purpose of developing a functional business model and business plan. The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.
Abstract:
As huge amounts of data become available in organizations and society, specific data analytics skills and techniques are needed to explore these data and extract from them useful patterns, tendencies, models or other knowledge that can be used to support decision-making, to define new strategies or to understand what is happening in a specific field. Only with a deep understanding of a phenomenon is it possible to fight it. In this paper, a data-driven analytics approach is used to analyse the increasing incidence of fatalities by pneumonia in the Portuguese population, characterizing the disease and its incidence in terms of fatalities; this knowledge can be used to define appropriate strategies aimed at reducing the phenomenon, which has increased by more than 65% in a decade.
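As a concrete (and purely illustrative) picture of this kind of descriptive analysis, the sketch below aggregates a hypothetical record-level dataset of pneumonia fatalities by year and measures the change over a decade; the file name, column names and years are assumptions, not the study's actual data.

# Illustrative only: yearly pneumonia fatality counts and the change over a
# decade, assuming a hypothetical record-level dataset.
import pandas as pd

df = pd.read_csv("pneumonia_fatalities.csv", parse_dates=["date_of_death"])

# Yearly fatality counts
yearly = (
    df.assign(year=df["date_of_death"].dt.year)
      .groupby("year")
      .size()
      .rename("fatalities")
)

# Relative change over a decade (e.g. 2003 -> 2013)
first, last = yearly.loc[2003], yearly.loc[2013]
print(f"Change 2003-2013: {(last - first) / first:.1%}")

# Simple breakdown by age group and sex, assuming those columns exist
print(df.groupby(["age_group", "sex"]).size().unstack(fill_value=0))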
Abstract:
Customer lifetime value (LTV) enables using client characteristics, such as recency, frequency and monetary (RFM) value, to describe the value of a client through time in terms of profitability. We present the concept of LTV applied to telemarketing for improving the return on investment, using a recent (from 2008 to 2013) and real case study of bank campaigns to sell long-term deposits. The goal was to benefit from past contact history to extract additional knowledge. A total of twelve LTV input variables were tested, under a forward selection method and using a realistic rolling window scheme, highlighting the validity of five new LTV features. The results achieved by our LTV data-driven approach using neural networks allowed an improvement of up to 4 pp in the cumulative Lift curve for targeting the deposit subscribers when compared with a baseline model (with no history data). Explanatory knowledge was also extracted from the proposed model, revealing two highly relevant LTV features: the last result of the previous campaign to sell the same product and the frequency of past client successes. The obtained results are particularly valuable for contact center companies, which can improve predictive performance without even having to ask for more information from the companies they serve.
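For a concrete picture of the modelling setup, the sketch below reproduces the general idea on synthetic data: history-based LTV features (including the outcome of the previous campaign and the frequency of past successes), a neural network classifier, and a cumulative Lift comparison against a baseline without history features. Feature definitions, data and model settings are illustrative assumptions, not the paper's exact configuration.

# Illustrative sketch (synthetic data, assumed feature names): score deposit
# subscribers with and without LTV/history features and compare Lift@20%.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X_hist = np.column_stack([
    rng.integers(0, 2, n),         # last result of the previous campaign (assumed)
    rng.poisson(1.0, n),           # frequency of past client successes (assumed)
    rng.exponential(30.0, n),      # recency in days (assumed)
])
X_base = rng.normal(size=(n, 3))   # non-history client attributes
y = (0.8 * X_hist[:, 0] + 0.3 * X_hist[:, 1] + rng.normal(size=n) > 1.0).astype(int)

def lift_at(model, X_tr, X_te, y_tr, y_te, depth=0.2):
    """Cumulative Lift at the given targeting depth."""
    model.fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]
    top = np.argsort(scores)[::-1][: int(depth * len(y_te))]
    return y_te[top].mean() / y_te.mean()

X_full = np.hstack([X_base, X_hist])          # baseline attributes + LTV features
Xb_tr, Xb_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
    X_base, X_full, y, test_size=0.3, random_state=0)

base = lift_at(MLPClassifier(max_iter=500, random_state=0), Xb_tr, Xb_te, y_tr, y_te)
ltv = lift_at(MLPClassifier(max_iter=500, random_state=0), Xf_tr, Xf_te, y_tr, y_te)
print(f"Lift@20% baseline: {base:.2f}   with LTV features: {ltv:.2f}")

The rolling-window evaluation and forward feature selection used in the paper would wrap this train/score step, retraining on each window and adding one candidate LTV variable at a time.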
Abstract:
Human activity is very dynamic and subtle, and most physical environments are also highly dynamic and support a vast range of social practices that do not map directly into any immediate ubiquitous computing functionality. Identifying what is valuable to people is very hard and obviously leads to great uncertainty regarding the type of support needed and the type of resources needed to create such support. We have addressed the issues of system development through the adoption of a crowdsourced software development model [13]. We have designed and developed Anywhere places, an open and flexible system support infrastructure for Ubiquitous Computing that is based on a balanced combination of global services and applications and situated devices. Evaluation, however, is still an open problem. The characteristics of ubiquitous computing environments make their evaluation very complex: there are no globally accepted metrics and it is very difficult to evaluate large-scale and long-term environments in real contexts. In this paper, we describe a first proposal for a hybrid 3D simulated prototype of Anywhere places that combines simulated and real components to generate a mixed reality which can be used to assess the envisaged ubiquitous computing environments [17].
Abstract:
This paper presents a proposal for a management model based on reliability requirements concerning Cloud Computing (CC). The proposal was based on a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. This review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for the Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, with a view to improving the reliability of IS application operation, for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. Organizing these studies and disseminating them in the market as a conceptual model capable of establishing parameters to regulate a reliable relationship between providers and users of IT services in the CC environment is an interesting instrument to guide providers, developers and users in delivering secure and reliable services and applications.
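As a purely illustrative sketch of the kind of metrics and controls mentioned above, the snippet below checks collected cloud-usage measurements against SLA thresholds agreed between provider and consumer; the metric names and thresholds are assumptions, not part of the proposed Trust Model.

# Illustrative only: check collected cloud-usage metrics against assumed
# SLA terms and report per-term compliance.
from dataclasses import dataclass

@dataclass
class SLATerm:
    metric: str            # e.g. "availability", "p95_latency_ms"
    threshold: float
    higher_is_better: bool

def sla_compliance(measurements: dict[str, float], terms: list[SLATerm]) -> dict[str, bool]:
    """Return, per SLA term, whether the measured value satisfies it."""
    report = {}
    for term in terms:
        value = measurements.get(term.metric)
        if value is None:
            report[term.metric] = False   # missing evidence treated as non-compliant
        elif term.higher_is_better:
            report[term.metric] = value >= term.threshold
        else:
            report[term.metric] = value <= term.threshold
    return report

terms = [SLATerm("availability", 99.9, True), SLATerm("p95_latency_ms", 200, False)]
print(sla_compliance({"availability": 99.95, "p95_latency_ms": 240}, terms))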
Abstract:
The present paper focuses on a damage identification method based on the second-order spectral properties of the nodal response processes. The explicit dependence of the output power spectral densities on the frequency content makes them suitable for damage detection and localization. The well-known case study of the Z24 Bridge in Switzerland is chosen to apply and further investigate this technique with the aim of validating its reliability. Numerical simulations of the dynamic response of the structure subjected to different types of excitation are carried out to assess the variability of the spectrum-driven method with respect to both the type and the position of the excitation sources. The simulated data obtained from random vibrations, impulse, ramp and shaking forces allowed building the power spectrum matrix, from which the main eigenparameters of the reference and damage scenarios are extracted. Afterwards, complex eigenvectors and real eigenvalues are properly weighted and combined, and a damage index based on the difference between spectral modes is computed to pinpoint the damage. Finally, a group of vibration-based damage identification methods is selected from the literature to compare the results obtained and to evaluate the performance of the spectral index.
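The sketch below gives a simplified, generic version of the spectrum-driven workflow described above: it builds the output power spectrum matrix from simulated nodal responses, extracts the dominant spectral mode at each frequency line, and computes a damage index contrasting reference and damaged scenarios. The weighting and index definition are simplified placeholders, not the exact formulation applied to the Z24 Bridge data.

# Simplified illustration of a spectrum-driven damage index from output-only
# response data: PSD matrix -> dominant eigenvectors -> modal misalignment.
import numpy as np
from scipy.signal import csd

def psd_matrix(responses, fs, nperseg=1024):
    """responses: (n_channels, n_samples) array of nodal response signals."""
    n_ch = responses.shape[0]
    f, _ = csd(responses[0], responses[0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, S[:, i, j] = csd(responses[i], responses[j], fs=fs, nperseg=nperseg)
    return f, S

def spectral_modes(S):
    """Dominant eigenvector of the Hermitian PSD matrix at each frequency."""
    _, vecs = np.linalg.eigh(S)
    return vecs[:, :, -1]              # eigenvector of the largest eigenvalue

def damage_index(S_ref, S_dam):
    """Placeholder index: mean misalignment (1 - MAC) between spectral modes."""
    m_ref, m_dam = spectral_modes(S_ref), spectral_modes(S_dam)
    mac = np.abs(np.sum(np.conj(m_ref) * m_dam, axis=1)) ** 2
    mac /= np.sum(np.abs(m_ref) ** 2, axis=1) * np.sum(np.abs(m_dam) ** 2, axis=1)
    return float(np.mean(1.0 - mac))

# Toy usage: random vibration signals stand in for measured responses
fs, rng = 100.0, np.random.default_rng(1)
ref = rng.normal(size=(4, 20000))
dam = ref + 0.2 * rng.normal(size=(4, 20000))
_, S_ref = psd_matrix(ref, fs)
_, S_dam = psd_matrix(dam, fs)
print("damage index:", damage_index(S_ref, S_dam))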
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
Films of BaFe12O19/P(VDF-TrFE) composites with 5, 10 and 20 wt% barium ferrite content have been fabricated. The BaFe12O19 microparticles have the shape of thin hexagonal platelets, with the easy direction of magnetization remaining along the c axis, which is perpendicular to the plates. This allows the ferrite particles to be oriented in-plane and out-of-plane within the composite films, as confirmed by the measured hysteresis loops. While the in-plane induced magnetoelectric (ME) effect is practically zero, these composite films show a good out-of-plane magnetoelectric effect, with maximum ME coupling coefficient changes of 3, 17 and 2 mV/cm·Oe for the 5, 10 and 20 wt% barium ferrite content films, respectively. We infer that this ME behaviour is driven by the magnetization process that arises when the external magnetic field is applied. We have also measured a linear and reversible magnetoelectric effect at low applied bias fields, where the magnetization process is still reversible.
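For clarity on the units quoted above, the out-of-plane ME voltage coefficient is conventionally defined (a standard textbook form, not a detail taken from this paper) as

\alpha_{ME} = \frac{\partial E}{\partial H} \approx \frac{\Delta V}{t\,\Delta H} \qquad \left[\mathrm{mV\,cm^{-1}\,Oe^{-1}}\right]

where \Delta V is the voltage induced across a composite film of thickness t by an applied AC magnetic field of amplitude \Delta H. As a purely hypothetical numerical example, a 50 µm thick film producing 8.5 µV under a 1 Oe AC field would give \alpha_{ME} = 0.0085 mV / (0.005 cm × 1 Oe) = 1.7 mV/cm·Oe.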
Abstract:
Renal failure means that one’s kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient’s history and physical examination may denote good practice, some key information has to be obtained from the valuation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney sickness depicts anomalous kidney function and/or structure, i.e., there is evidence that treatment may avoid or delay its progression, either by reducing or preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing tools for problem solving. Hence, in this work we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that allows one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centred on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, with sensitivity and specificity values ranging between 93.1–94.9% and 91.9–94.2%, respectively.
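To make the reported figures concrete, the sketch below shows how sensitivity and specificity are computed from a classifier's confusion matrix; synthetic data and a plain scikit-learn neural network stand in for the paper's hybrid Logic Programming / ANN model and for the 558-patient dataset.

# Illustrative computation of sensitivity and specificity for a chronic
# kidney disease classifier (synthetic data; not the paper's hybrid model).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(558, 24))      # 24 variables for 558 patients (shape only)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=558) > 0.9).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)        # true positive rate
specificity = tn / (tn + fp)        # true negative rate
print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")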
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
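To make the described approach concrete, the sketch below implements a generic multistart repulsion scheme: an erf-based merit function is minimised with Nelder-Mead, and Gaussian repulsion terms are added around roots already found so that later searches are pushed towards new ones. The toy system, penalty form and parameters are simplified stand-ins for the strategies actually compared in the paper.

# Simplified illustration of a repulsion algorithm for multiple roots of a
# nonlinear system, using Nelder-Mead and an erf-based merit function.
import numpy as np
from scipy.optimize import minimize
from scipy.special import erf

def F(x):
    """Toy system with two roots: x^2 + y^2 = 1 and x = y."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])

def merit(x, roots, beta=10.0, rho=0.5):
    """erf-based merit plus repulsion bumps around already-found roots."""
    base = np.sum(erf(np.abs(F(x))))
    repulsion = sum(np.exp(-beta * np.linalg.norm(x - r) ** 2) / rho for r in roots)
    return base + repulsion

rng = np.random.default_rng(0)
roots = []
for _ in range(30):                                    # multistart with repulsion
    x0 = rng.uniform(-2.0, 2.0, size=2)
    res = minimize(merit, x0, args=(roots,), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 2000})
    is_root = np.linalg.norm(F(res.x)) < 1e-6
    is_new = all(np.linalg.norm(res.x - r) > 1e-3 for r in roots)
    if is_root and is_new:
        roots.append(res.x)

print("roots found:", [np.round(r, 6) for r in roots])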