929 results for Based structure model


Relevance:

80.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the degree of Master in Environmental Engineering, Ecological Engineering profile

Relevance:

80.00%

Publisher:

Abstract:

Presented at the seminar "ACTION TEMPS RÉEL: INFRASTRUCTURES ET SERVICES SYSTÈMES", 10 April 2015, Brussels, Belgium.

Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering

Relevance:

80.00%

Publisher:

Abstract:

Youth unemployment is one of the most pressing social issues in Portugal and is often associated with a lack of skills. Faz-Te Forward (FFWD), a Portuguese employability programme, has demonstrated great potential for impact in addressing this issue, especially amongst a neglected segment of the population: those belonging to "sandwich families". The present thesis, integrated in the SIB Research Programme of the Social Investment Lab, evaluates the feasibility of financing this programme through a Social Impact Bond (SIB), an innovative outcomes-based financing model. Based on an analysis of FFWD's historical data, a business case for a SIB was developed.

Relevance:

80.00%

Publisher:

Abstract:

This thesis proposes a methodology for modelling business interoperability in the context of cooperative industrial networks. The purpose is to develop a methodology that enables both the design of cooperative industrial network platforms able to deliver business interoperability and the analysis of its impact on the performance of these platforms. To achieve this objective, two modelling tools are employed: the Axiomatic Design Theory for the design of interoperable platforms, and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence in which the two modelling tools are applied depends on the scenario under analysis, i.e. whether the cooperative industrial network platform already exists. If it does not, the methodology suggests first applying the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then using Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and, based on the results, decide whether it needs to be redesigned. If a redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how these two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and to validate the proposed theoretical models, two case studies are presented: a Portuguese reverse logistics cooperative network (the Valorpneu network) and a Portuguese construction project (the Baixo Sabor Dam network). The findings from applying the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies were carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach to designing interoperable cooperative industrial network platforms is proposed. Regarding practical implications, the proposed methodology is intended to give industrial managers a tool that guides them easily, and in a practical and systematic way, through the design of configurations of interoperable cooperative industrial network platforms and/or the analysis of the impact of business interoperability on the performance of their companies and of the networks in which they operate.
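
The abstract gives no implementation details, so purely as an illustration of what an Agent-Based Simulation of business interoperability might look like, here is a minimal hypothetical sketch: company agents attempt pairwise data exchanges, a success probability stands in for their degree of interoperability, and network performance is measured as the fraction of successful exchanges. All names and parameters are invented.

```python
import random

class CompanyAgent:
    def __init__(self, name: str, interoperability: float):
        self.name = name
        self.interoperability = interoperability  # assumed in 0..1

    def exchange(self, partner: "CompanyAgent") -> bool:
        # An exchange succeeds only if both sides manage to interoperate.
        return random.random() < self.interoperability * partner.interoperability

def network_performance(agents, rounds: int = 1000) -> float:
    successes = 0
    for _ in range(rounds):
        a, b = random.sample(agents, 2)   # pick a random pair of partners
        successes += a.exchange(b)
    return successes / rounds             # fraction of successful exchanges

agents = [CompanyAgent(f"firm{i}", random.uniform(0.6, 1.0)) for i in range(10)]
print(f"simulated network performance: {network_performance(agents):.2f}")
```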

Relevance:

80.00%

Publisher:

Abstract:

In this work, a novel auxetic structure has been developed from braided composites produced from basalt fiber. The paper reports the auxetic and tensile behavior of the structures produced from basalt fiber and compares them with structures developed from braided composites having glass fiber as the core. The basic design is modified with a straight rod as a structural element to improve the strengthening behavior of the structure, and the Poisson's ratio of the modified structure is studied. The Poisson's ratios of the structures made from basalt- and glass-reinforced BCRs are almost identical, but the tensile behavior of the basalt-based structure is better than that of its glass fiber counterpart.
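
For reference, since the abstract hinges on Poisson's ratio, a negative value is what makes a structure auxetic. A small sketch of the computation from transverse and axial strain (the numbers are made up for illustration):

```python
def poissons_ratio(transverse_strain: float, axial_strain: float) -> float:
    # nu = -(transverse strain) / (axial strain); auxetic structures give nu < 0
    return -transverse_strain / axial_strain

# Illustrative values only: an auxetic sample widens (positive transverse
# strain) when stretched (positive axial strain), so nu comes out negative.
print(poissons_ratio(transverse_strain=0.02, axial_strain=0.05))  # -0.4
```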

Relevance:

80.00%

Publisher:

Abstract:

Integrated master's dissertation in Civil Engineering

Relevance:

80.00%

Publisher:

Abstract:

Starting from the observation that ghosts are strikingly recurrent and prominent figures in late-twentieth-century African diasporic literature, this dissertation proposes to account for this presence by exploring its various functions. It argues that, beyond the poetic function the ghost performs as metaphor, it also does cultural, theoretical and political work that is significant to the African diaspora in its dealings with issues of history, memory and identity. Toni Morrison's Beloved (1987) serves as a guide for introducing the many forms, qualities and significations of the ghost, which are then explored and analyzed in four chapters that look at Fred D'Aguiar's Feeding the Ghosts (1998), Gloria Naylor's Mama Day (1988), Paule Marshall's Praisesong for the Widow (1983) and a selection of novels, short stories and poetry by Michelle Cliff. Moving thematically through these texts, the discussion shifts from history through memory to identity as it examines how the ghost trope allows the writers to revisit sites of trauma; revise historical narratives that are constituted and perpetuated by exclusions and invisibilities; creatively and critically repossess a past marked by violence, dislocation and alienation and reclaim the diasporic culture it contributed to shaping; and destabilize and deconstruct the hegemonic, normative categories and boundaries that delimit race or sexuality and envision other, less limited and limiting, definitions of identity. These diverse and interrelated concerns are identified and theorized as participating in a project of "re-vision," a critical project that constitutes an epistemological as much as a political gesture. The author-based structure allows for a detailed analysis of the texts and highlights the distinctive shapes the ghost takes and the particular concerns it serves to address in each writer's literary and political project. However, using the ghost as a guide into these texts, taken collectively, also throws into relief new connections between them and sheds light on the complex ways in which the interplay of history, memory and identity positions them as products of, and contributions to, an African diasporic (literary) culture. While it insists on the cultural specificity of African diasporic ghosts, tracing their origins to African cultures and spiritualities, the argument also follows gothic studies' common view that ghosts in literary and cultural productions, like other related figures of the living dead, respond to particular conditions and anxieties. Considering the historical and political context in which the texts under study were produced, the dissertation connects the ghosts in them to African diasporic people's disillusionment with the broken promises of the civil rights movement in the United States and of postcolonial independence in the Caribbean. It reads the texts' theoretical concerns and narrative qualities alongside the contestation of traditional historiography by black and postcolonial studies, as well as the broader challenge to conventional notions such as truth, reality, meaning, power or identity by poststructuralism, postcolonialism and queer theory. Drawing on these various theoretical approaches and critical tools to elucidate the ghost's deconstructive power for African diasporic writers' concerns, this work ultimately offers a contribution to "spectrality studies," which is currently emerging as a new field of scholarship in cultural theory.

Relevance:

80.00%

Publisher:

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players each choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is normative in character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game, relative to various epistemic assumptions, constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players: before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced, a type-based epistemic model is extended with a notion of agent connectedness, and sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated; in particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
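
As a concrete illustration of backward induction on a finite perfect-information game (the thesis works in Aumann's epistemic framework, not in code), here is a minimal sketch; the tree encoding is our own assumption.

```python
# Minimal backward-induction sketch for a finite perfect-information game.
# A node is either a terminal payoff tuple or (player, [child nodes]).
# Returns the payoff profile reached when every player chooses rationally.

def backward_induction(node):
    if isinstance(node, tuple):          # terminal history: a payoff profile
        return node
    player, children = node
    outcomes = [backward_induction(child) for child in children]
    # the player to move picks the subtree maximizing her own payoff
    return max(outcomes, key=lambda payoff: payoff[player])

# Two-player example (players 0 and 1): player 0 moves first.
game = (0, [
    (1, [(2, 1), (0, 0)]),   # if player 0 goes left, player 1 chooses next
    (1, [(3, 0), (1, 2)]),   # if player 0 goes right, player 1 chooses next
])
print(backward_induction(game))  # (2, 1): on the right, player 1 would pick (1, 2)
```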

Relevance:

80.00%

Publisher:

Abstract:

Static incubation tests, in which microcapsules and beads are contacted with polymer and protein solutions, have been developed for the characterization of permselective materials applied in bioartificial organs and drug delivery. A combination of polymer ingress, detected by size-exclusion chromatography, and protein ingress/egress, assessed by gel electrophoresis, provides information regarding the diffusion kinetics, molar mass cut-off (MMCO) and permeability. This represents an improvement over existing permeability measurements that are based on the diffusion of a single type of solute. Specifically, the permeability of capsules based on alginate, cellulose sulfate and polymethylene-co-guanidine was characterized as a function of membrane thickness; solid alginate beads were also evaluated. The MMCO of these capsules was estimated to be between 80 and 90 kDa using polymers, and between 116 and 150 kDa using proteins. Apparently, the globular shape of the proteins (radius of gyration (Rg) of 4.2-4.6 nm) facilitates their passage through the membrane compared with the polysaccharide coil conformation (Rg of 6.5-8.3 nm). Increasing the capsule membrane thickness reduced these values. The MMCO of the beads, which have no membrane limiting their permselective properties, was higher: between 110 and 200 kDa with dextrans, and between 150 and 220 kDa with proteins. Therefore, although the permeability estimated with biologically relevant molecules is generally higher due to their lower radius of gyration, the MMCOs obtained with synthetic and natural water-soluble polymers correlate well, and both can be used as in vitro metrics for the immune-protection ability of microcapsules and microbeads. This article shows, to the authors' knowledge, the first reported concordance between permeability measures based on model natural and biological macromolecules.

Relevance:

80.00%

Publisher:

Abstract:

This paper reviews three different approaches to modelling the cost-effectiveness of schistosomiasis control. Although these approaches vary in their assessment of costs, the major focus of the paper is on the evaluation of effectiveness. The first model presented is a static economic model which assesses effectiveness in terms of the proportion of cases cured. This model is important in highlighting that the optimal choice of chemotherapy regime depends critically on the level of the budget constraint, the unit costs of screening and treatment, the rates of compliance with screening and chemotherapy, and the prevalence of infection. The limitation of this approach is that it models the cost-effectiveness of only one cycle of treatment, so effectiveness reflects only the immediate impact of treatment. The second model presented is a prevalence-based dynamic model which links prevalence rates from one year to the next and assesses effectiveness as the proportion of cases prevented. This model is important because it introduced the concept of measuring the long-term impact of control by using a transmission model that can assess the reduction in infection through time, but it is limited to assessing the impact on the prevalence of infection only. The third approach presented is a theoretical framework which describes the dynamic relationships between infection and morbidity, and which assesses effectiveness in terms of case-years of infection and morbidity prevented. The use of this model in assessing the cost-effectiveness of age-targeted treatment in controlling Schistosoma mansoni is explored in detail, with respect to varying frequencies of treatment and the interaction between drug price and drug efficacy.
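
To make the first (static) model's logic concrete, a back-of-the-envelope sketch of cases cured per budget under screening and treatment compliance might look as follows; every number and name here is invented for illustration and is not the paper's actual model.

```python
# Hypothetical sketch of the static model's logic: with a fixed budget,
# screening cost, treatment cost, compliance rates and prevalence jointly
# determine how many cases one cycle of screened chemotherapy cures.

def cases_cured(budget, screen_cost, treat_cost, screen_compliance,
                treat_compliance, prevalence, cure_rate):
    # expected cost per person offered screening: only compliers are screened,
    # and only compliant positives receive treatment
    cost_per_person = screen_compliance * (
        screen_cost + prevalence * treat_compliance * treat_cost)
    people_reached = budget / cost_per_person
    return (people_reached * screen_compliance * prevalence
            * treat_compliance * cure_rate)

cured = cases_cured(budget=100_000, screen_cost=0.5, treat_cost=2.0,
                    screen_compliance=0.8, treat_compliance=0.9,
                    prevalence=0.3, cure_rate=0.85)
print(f"cases cured: {cured:.0f}, cost per cure: {100_000 / cured:.2f}")
```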

Relevance:

80.00%

Publisher:

Abstract:

Models predicting species spatial distributions are increasingly applied to wildlife management issues, emphasising the need for reliable methods to evaluate the accuracy of their predictions. As many available datasets (e.g. museums, herbariums, atlases) do not provide reliable information about species absences, several presence-only analyses have been developed. However, methods to evaluate the accuracy of their predictions are few and have never been validated. The aim of this paper is to compare existing and new presence-only evaluators to the usual presence/absence measures. We use a reliable, diverse presence/absence dataset of 114 plant species to test how common presence/absence indices (Kappa, MaxKappa, AUC, adjusted D²) compare to presence-only measures (AVI, CVI, Boyce index) for evaluating generalised linear models (GLMs). Moreover, we propose a new, threshold-independent evaluator, which we call the "continuous Boyce index". All indices were implemented in the BIOMAPPER software. We show that the presence-only evaluators are fairly well correlated (ρ > 0.7) with the presence/absence ones. The Boyce indices are closer to AUC than to MaxKappa and are fairly insensitive to species prevalence. In addition, the Boyce indices provide predicted-to-expected ratio curves that offer further insight into model quality: robustness, habitat-suitability resolution and deviation from randomness. This information helps reclassify predicted maps into meaningful habitat-suitability classes. The continuous Boyce index is thus both a complement to the usual evaluation of presence/absence models and a reliable measure of presence-only based predictions.
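
A minimal sketch of the class-based Boyce index as the abstract describes it: a predicted-to-expected (P/E) ratio per habitat-suitability class, rank-correlated with class order. The binning and variable names are our assumptions, and the paper's continuous version would replace the fixed classes with a moving window.

```python
import numpy as np
from scipy.stats import spearmanr

# Bin habitat-suitability scores, compute P/E ratios, then Spearman-correlate
# the ratios with class rank. An index near +1 means higher suitability
# classes really do hold disproportionately many presences.

def boyce_index(presence_suit, background_suit, n_classes=10):
    edges = np.linspace(0.0, 1.0, n_classes + 1)
    p, _ = np.histogram(presence_suit, bins=edges)    # presences per class
    e, _ = np.histogram(background_suit, bins=edges)  # available area per class
    ratio = (p / p.sum()) / np.maximum(e / e.sum(), 1e-12)  # P/E per class
    rho, _ = spearmanr(ratio, np.arange(n_classes))
    return rho

rng = np.random.default_rng(0)
background = rng.uniform(0, 1, 5000)   # suitability over the whole study area
presences = rng.beta(4, 2, 300)        # presences skewed towards high suitability
print(f"Boyce index: {boyce_index(presences, background):.2f}")
```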

Relevance:

80.00%

Publisher:

Abstract:

In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from this pool as sequences of non-overlapping, frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. For additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that this highest scoring gene can be stored and updated, which requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model; the definition of valid gene structures is given externally in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem: in particular, it allows for multiple-gene, two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
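
As an illustration of the linear-scan assembly just described, here is a minimal sketch (our own simplification: one strand, additive scores, and "compatible" meaning only that the next exon's acceptor lies downstream of the previous exon's donor; frame compatibility and Gene Model features are omitted). Exons are swept by increasing acceptor position while a running maximum folds in the best gene ending at every donor already passed.

```python
from typing import List, Tuple

def best_gene_score(exons: List[Tuple[int, int, float]]) -> float:
    """Each exon is (acceptor_pos, donor_pos, score); returns the score of the
    highest scoring chain of non-overlapping exons (a candidate gene)."""
    order_acc = sorted(range(len(exons)), key=lambda i: exons[i][0])
    order_don = sorted(range(len(exons)), key=lambda i: exons[i][1])
    best = [0.0] * len(exons)  # best gene score ending at exon i
    running_max = 0.0          # best gene ending at any donor already passed
    j = 0
    overall = 0.0
    for i in order_acc:
        acc, don, score = exons[i]
        # fold in every exon whose donor lies strictly upstream of this
        # acceptor; each exon is examined here at most once, so after the
        # two sorts the scan itself is linear
        while j < len(order_don) and exons[order_don[j]][1] < acc:
            running_max = max(running_max, best[order_don[j]])
            j += 1
        best[i] = score + running_max  # append exon i to the best compatible gene
        overall = max(overall, best[i])
    return overall

# three exons; the best gene chains the first and third (1.5 + 2.0)
print(best_gene_score([(10, 50, 1.5), (40, 90, 2.2), (60, 120, 2.0)]))  # 3.5
```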

Relevance:

80.00%

Publisher:

Abstract:

BRAF V600E is an emerging drug target in lung cancer, but the clinical significance of non-V600 BRAF mutations in lung cancer and other malignancies is less clear. Here, we report the case of a patient with metastatic lung adenocarcinoma carrying a BRAF G469L mutation that was refractory to vemurafenib. We calculated a structure model of this very rare type of mutated BRAF kinase to explain the molecular mechanism of the drug resistance. This information may help to develop effective targeted therapies for cancers with non-V600 BRAF mutations.

Relevance:

80.00%

Publisher:

Abstract:

The 20-amino-acid-residue peptides derived from RecA loop L2 have been shown to be the pairing domain of RecA. The peptides bind to ss- and dsDNA, unstack ssDNA, and pair ssDNA with its homologous target in a duplex DNA. As shown by circular dichroism, upon binding to DNA the disordered peptides adopt a beta-structure conformation. Here we show that the conformational change of the peptide from random coil to beta-structure is important for binding ss- and dsDNA. The beta-structure in the DNA-pairing peptides can be induced by many environmental conditions, such as high pH, high concentration, and non-micellar sodium dodecyl sulfate (6 mM). This behavior indicates an intrinsic propensity of these peptides to form a beta-structure. A beta-structure model for loop L2 of the RecA protein when bound to DNA is thus proposed. The fact that aromatic residues at the central position 203 strongly modulate peptide binding to DNA and the subsequent biochemical activities can be accounted for by the direct effect of the aromatic amino acids on the peptide conformational change. The DNA-pairing domain of RecA, visualized by electron microscopy, self-assembles into a filamentous structure like RecA itself. The relevance of such a peptide filamentous structure to the structure of RecA when bound to DNA is discussed.