935 results for restriction of parameter space


Relevance:

100.00%

Publisher:

Abstract:

Understanding the influence of pore space characteristics on the hydraulic conductivity and spectral induced polarization (SIP) response is critical for establishing relationships between the electrical and hydrological properties of surficial sedimentary deposits. Here, we present the results of laboratory SIP measurements on saturated quartz samples with granulometric characteristics ranging from fine sand to fine gravel. We alter the pore characteristics using three principal methods: (i) variation of the grain sizes, (ii) changing the degree of compaction, and (iii) changing the level of sorting. We then examine how these changes affect both the SIP response and the hydraulic conductivity. In general, the results indicate a clear connection between the applied changes in pore characteristics and the SIP response. In particular, we observe a systematic correlation between the hydraulic conductivity and the relaxation time of the Cole-Cole model describing the observed SIP effect for the whole range of considered grain sizes.
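
For reference, a minimal sketch of the Pelton form of the Cole-Cole complex-resistivity model commonly fitted to SIP spectra; the parameter values below are illustrative and not taken from the measurements described above.

```python
import numpy as np

def cole_cole(f, rho0, m, tau, c):
    """Pelton Cole-Cole model of complex resistivity vs. frequency f (Hz).

    rho0: DC resistivity (ohm m); m: chargeability (0..1);
    tau: relaxation time (s), the quantity correlated with hydraulic
    conductivity in the abstract above; c: Cole-Cole exponent (0..1).
    """
    iwt = 1j * 2.0 * np.pi * f * tau
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + iwt ** c)))

f = np.logspace(-2, 3, 50)                     # 0.01 Hz .. 1 kHz
rho = cole_cole(f, rho0=100.0, m=0.1, tau=0.01, c=0.5)
phase_mrad = 1000.0 * np.angle(rho)            # phase spectrum in mrad
```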

Relevance:

100.00%

Publisher:

Abstract:

The final year project came to us as an opportunity to get involved in a topic that we found attractive while majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored every day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand, and, in our understanding, the work they do, although sometimes controversial in ethical terms, is a clear source of added value for both private corporations and the public sector. For these reasons, the essence of this project is the study of a statistical instrument for the analysis of large datasets that is directly related to computer science: Partial Correlation Networks. The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the underlying model, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrative simulation is performed in order to show the power and efficiency of the model. Finally, the model is put into practice by analysing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective and useful instrument for finding valuable results in Big Data. The findings of this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models (SPACE) approach presented by Peng et al. (2009) works well under the assumption of sparsity. Moreover, partial correlation networks are shown to be a valid tool for representing cross-sectional interconnections between elements in large data sets. The scope of this project is, however, limited, as there are sections where deeper analysis would have been appropriate: considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results of the real-data application are examples of aspects in which this project could be extended. To sum up, the analysed statistical tool has proved to be a very useful instrument for finding relationships connecting the elements of a large data set. Partial correlation networks allow the owner of such a set to observe and analyse existing linkages that might otherwise have been overlooked.
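
The SPACE estimator of Peng et al. (2009) is not available in scikit-learn, so the sketch below uses the related GraphicalLassoCV sparse inverse-covariance estimator as a stand-in; partial correlations are then read off the estimated precision matrix in the standard way. The data are a toy chain of dependencies, not the project's real data set.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Toy data: a chain of dependencies x0 -> x1 -> x2 among 5 variables
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
X[:, 1] += 0.8 * X[:, 0]
X[:, 2] += 0.8 * X[:, 1]

model = GraphicalLassoCV().fit(X)   # sparsity (lambda) chosen by CV
theta = model.precision_            # sparse inverse-covariance estimate

# Partial correlations from the precision matrix:
# rho_ij = -theta_ij / sqrt(theta_ii * theta_jj)
d = np.sqrt(np.diag(theta))
pcorr = -theta / np.outer(d, d)
np.fill_diagonal(pcorr, 1.0)

# Network edges: variable pairs with non-negligible partial correlation
edges = [(i, j) for i in range(5) for j in range(i + 1, 5)
         if abs(pcorr[i, j]) > 0.05]
print(edges)   # expected: [(0, 1), (1, 2)] up to estimation noise
```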

Relevance:

100.00%

Publisher:

Abstract:

Internet governance is a recent issue in global politics. However, it has gradually become a major political and economic issue; it recently became even more important and now appears regularly in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. Rather than focusing on one or the other institution involved in Internet governance, this research analyses the emergence and historical evolution of a space of struggle involving a growing number of different actors. This evolution is described through the analysis of the dialectical relation between elites and non-elites and through the struggle around the definition of Internet governance. The thesis explores the question of how the relations among the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution, and structuration of a relatively autonomous field of world politics centred on Internet governance. Against dominant realist and liberal perspectives, this research draws upon a cross-fertilisation of heterodox international political economy and international political sociology. This approach focuses on the concepts of field, elites, and hegemony. The concept of field, as developed by Bourdieu, is increasingly used in International Relations to build a differentiated analysis of globalisation and to describe the emergence of transnational spaces of struggle and domination. Elite sociology allows for a pragmatic, actor-centred analysis of the issue of power in the globalisation process. This research particularly draws on Wright Mills's concept of the power elite in order to explore the unification of different elites around shared projects. Finally, this thesis uses the Neo-Gramscian concept of hegemony in order to study both the consensual dimension of domination and the prospect of change contained in any international order. Through the analysis of the documents produced within the period studied, and through the creation of databases of networks of actors, this research focuses on the debates that followed the commercialisation of the Internet throughout the 1990s and during the WSIS. The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from consensus-building between the dominant discourses of the time. It also resulted from a coalition of interests among an emerging power elite. However, this institutionalisation of Internet governance around ICANN excluded a number of actors and discourses that resisted this mode of governance. The WSIS became the institutional framework within which the governance system was questioned by some excluded states, scholars, NGOs, and intergovernmental organisations; it therefore constitutes the second historical period studied in this thesis. The confrontation between the power elite and counter-elites during the WSIS triggered a reconfiguration of the power elite as well as a redefinition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as the idea of multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of this hegemonic project allowed for a certain stability within the field and an acceptance of the new order by most non-elites. It is only recently that this order began to be questioned by the emerging powers of Internet governance. This research provides three main contributions to the scientific debate. On the theoretical level, it contributes to the emergence of a dialogue between International Political Economy and International Political Sociology perspectives in order to analyse both the structural trends of the globalisation process and the situated practices of actors in a given issue-area. It notably stresses the contribution of concepts such as field and power elite and their compatibility with a Neo-Gramscian framework for analysing hegemony. On the methodological level, this perspective relies on the use of mixed methods, combining qualitative content analysis with social network analysis of actors and statements. Finally, on the empirical level, this research provides an original perspective on Internet governance: it stresses the historical dimension of current Internet governance arrangements, criticises the notion of multistakeholderism, and focuses instead on power dynamics and the relation between Internet governance and globalisation.

Relevance:

100.00%

Publisher:

Abstract:

Background: Optimization methods allow designing changes in a system so that specific goals are attained. These techniques are fundamental for metabolic engineering. However, they are not directly applicable to investigating the evolution of metabolic adaptation to environmental changes. Although biological systems have evolved by natural selection into well-adapted systems, we can hardly expect actual metabolic processes to be at the theoretical optimum that would result from an optimization analysis. More likely, natural systems are to be found in a feasible region compatible with global physiological requirements. Results: We first present a new method for globally optimizing nonlinear models of metabolic pathways based on the Generalized Mass Action (GMA) representation. The optimization task is posed as a nonconvex nonlinear programming (NLP) problem that is solved by an outer-approximation algorithm. This method relies on iteratively solving reduced NLP slave subproblems and mixed-integer linear programming (MILP) master problems, which provide valid upper and lower bounds, respectively, on the global solution of the original NLP. The capabilities of this method are illustrated through its application to the anaerobic fermentation pathway in Saccharomyces cerevisiae. We next introduce a method to identify the feasible parametric regions that allow a system to meet a set of physiological constraints that can be represented mathematically through algebraic equations. This technique applies the outer-approximation algorithm iteratively over a reduced search space in order to identify regions that contain feasible solutions to the problem and to discard those in which no feasible solution exists. As an example, we characterize the feasible enzyme activity changes that are compatible with an appropriate adaptive response of the yeast Saccharomyces cerevisiae to heat shock. Conclusion: Our results show the utility of the suggested approach for investigating the evolution of adaptive responses to environmental changes. The proposed method can be used in other important applications such as the evaluation of parameter changes that are compatible with health and disease states.
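
In the GMA representation every flux is a product of power functions of the metabolite concentrations. The sketch below is a minimal, hypothetical two-metabolite pathway with made-up rate constants and kinetic orders, included only to fix notation; it is not the fermentation model analysed in the paper.

```python
from scipy.integrate import solve_ivp

# GMA representation: every flux is a power law,
#   v_k = gamma_k * prod_i X_i ** f_ki
# Hypothetical pathway: input --v0--> X1 --v1--> X2 --v2--> sink
def gma_rhs(t, X):
    x1, x2 = X
    v0 = 0.5                    # constant input flux (made up)
    v1 = 2.0 * x1 ** 0.5        # power-law flux, kinetic order 0.5
    v2 = 1.0 * x2 ** 0.8        # power-law flux, kinetic order 0.8
    return [v0 - v1, v1 - v2]

sol = solve_ivp(gma_rhs, (0.0, 20.0), [1.0, 0.1])
print(sol.y[:, -1])             # concentrations near steady state
```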

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Surgical site infection (SSI) is a common cause of major morbidity after liver resection. This study aimed to identify the risk factors for incisional and organ/space SSIs after liver resection. METHODS: Our liver surgery database was retrospectively analyzed for patients treated between January 2009 and November 2012 in a tertiary care Swiss hospital. Univariate and multivariate analyses were conducted on preoperative, intraoperative, and postoperative variables to identify risk factors for incisional and organ/space SSIs. RESULTS: In a total of 226 patients, SSI incidences were 12.8% (incisional), 4.0% (organ/space), and 1.8% (both). Univariate analysis showed that incisional SSIs were associated with high American Society of Anesthesiologists (ASA) scores, preoperative anemia, hypoalbuminemia, low prothrombin time, viral or alcoholic chronic hepatitis, liver cirrhosis, and prolonged operation times. Organ/space SSIs were associated with high rates of red blood cell transfusions, concomitant bowel surgery, and prolonged operation times. Multivariate analysis revealed that risk factors for incisional SSIs were anemia [odds ratio (OR) 2.82], high ASA scores (OR 2.88), presence of hepatitis or cirrhosis (OR 5.07), and prolonged operation times (OR 9.61). The only risk factor for organ/space SSIs was concomitant bowel surgery (OR 5.53). Hospital stays were similar in the organ/space and incisional SSI groups, but significantly longer for patients with both organ/space and incisional SSIs. CONCLUSIONS: High ASA scores, anemia, chronic hepatitis or liver cirrhosis, and prolonged operations increased the risk of incisional SSIs; concomitant bowel surgery increased the risk of organ/space SSIs. Specific precautions to prevent organ/space and incisional SSIs may shorten hospital stays.

Relevance:

100.00%

Publisher:

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine the dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor, and a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in a single modifying factor are less severe there. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain about two or more decisions among the entry parameters, Stoffenmanager may be more robust than ART.
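
A generic one-at-a-time local sensitivity sketch of the kind used in such studies; exposure_model below is a hypothetical multiplicative stand-in, and the actual TRA, Stoffenmanager, and ART algorithms are not reproduced here.

```python
def exposure_model(p):
    """Hypothetical multiplicative exposure model (illustrative stand-in)."""
    return p["emission"] * p["duration"] / p["ventilation"]

base = {"emission": 1.0, "duration": 4.0, "ventilation": 2.0}

# One-at-a-time local sensitivity: relative output change per
# relative input change (elasticity), evaluated at the base case
for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.01                       # +1% perturbation
    s = (exposure_model(perturbed) / exposure_model(base) - 1) / 0.01
    print(f"{name}: elasticity ~ {s:+.2f}")
```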

Relevance:

100.00%

Publisher:

Abstract:

Electrical resistivity tomography (ERT) is a well-established method for geophysical characterization and has shown potential for monitoring geologic CO2 sequestration, owing to its sensitivity to the electrical resistivity contrasts generated by liquid/gas saturation variability. In contrast to deterministic inversion approaches, probabilistic inversion provides the full posterior probability density function of the saturation field and accounts for the uncertainties inherent in the petrophysical parameters relating resistivity to saturation. In this study, the data are from benchtop ERT experiments conducted during gas injection into a quasi-2D brine-saturated sand chamber with a packing that mimics a simple anticlinal geological reservoir. The saturation fields are estimated by Markov chain Monte Carlo inversion of the measured data and compared to independent saturation measurements from light transmission through the chamber. Different model parameterizations are evaluated in terms of the recovered saturation and petrophysical parameter values. The saturation field is parameterized (1) in Cartesian coordinates, (2) by means of its discrete cosine transform coefficients, and (3) by fixed saturation values in structural elements whose shape and location are assumed known or are represented by an arbitrary Gaussian bell structure. Results show that the estimated saturation fields are in overall agreement with the saturations measured by light transmission, but differ strongly in terms of parameter estimates, parameter uncertainties, and computational intensity. Discretization in the frequency domain (as in the discrete cosine transform parameterization) provides more accurate models at a lower computational cost than spatially discretized (Cartesian) models. A priori knowledge about the expected geologic structures allows for non-discretized model descriptions with markedly reduced degrees of freedom. Constraining the solutions to the known injected gas volume improved the estimates of saturation and of the parameter values of the petrophysical relationship.
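
A minimal sketch of the discrete-cosine-transform parameterization: a 2-D field is represented by a small set of low-order DCT coefficients, which is what makes the frequency-domain inversion cheaper than a cell-by-cell (Cartesian) parameterization. The grid size, field shape, and number of retained coefficients are illustrative, not taken from the study.

```python
import numpy as np
from scipy.fft import dctn, idctn

# A smooth 2-D "saturation" field on a 32 x 32 grid (illustrative)
y, x = np.mgrid[0:32, 0:32]
field = np.exp(-((x - 16.0) ** 2 + (y - 20.0) ** 2) / 60.0)

coeffs = dctn(field, norm="ortho")

# Retain only the 8 x 8 lowest-frequency coefficients:
# 64 unknowns instead of 1024 in a cell-by-cell parameterization
truncated = np.zeros_like(coeffs)
truncated[:8, :8] = coeffs[:8, :8]
approx = idctn(truncated, norm="ortho")

print(np.sqrt(np.mean((field - approx) ** 2)))   # small RMS error
```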

Relevance:

100.00%

Publisher:

Abstract:

Alteration and contamination processes modify the chemical composition of ceramic artefacts. This modification is not restricted solely to the affected elements, but also changes the overall concentrations, owing to the compositional nature of chemical data, which are subject to the unit-sum constraint. Since it is impossible to know prior to data treatment whether the original compositions have been changed by such processes, the methodological approach used in provenance studies must be robust enough to handle materials that might have been altered or contaminated. The ability of the logratio transformation proposed by Aitchison to handle compositional data is studied and compared with that of current data treatments. The logratio transformation appears to offer the most robust approach.
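
Aitchison's centred logratio (clr) transformation maps a composition to log-contrasts against its geometric mean, removing the unit-sum constraint mentioned above; a minimal numpy sketch with an illustrative (made-up) composition follows.

```python
import numpy as np

def clr(x):
    """Centred logratio transform: clr(x)_i = ln(x_i / g(x)),
    where g(x) is the geometric mean of the composition x."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

# Illustrative ceramic composition (five parts summing to 100%)
composition = np.array([55.0, 20.0, 15.0, 7.0, 3.0])
z = clr(composition)
print(z, z.sum())   # clr coordinates sum to zero by construction
```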

Relevance:

100.00%

Publisher:

Abstract:

This article examines the mainstream categorical definition of coreference as "identity of reference." It argues that coreference is best handled when identity is treated as a continuum, ranging from full identity to non-identity, with room for near-identity relations to explain currently problematic cases. This middle ground is needed to account for those linguistic expressions in real text that stand in relations that are neither full coreference nor non-coreference, a situation that has led to contradictory treatment of cases in previous coreference annotation efforts. We discuss key issues for coreference such as conceptual categorization, individuation, criteria of identity, and the discourse model construct. We redefine coreference as a scalar relation between two (or more) linguistic expressions that refer to discourse entities considered to be at the same granularity level relevant to the linguistic and pragmatic context. We view coreference relations in terms of mental space theory and discuss a large number of real-life examples that show near-identity at different degrees.

Relevance:

100.00%

Publisher:

Abstract:

The topological solitons of two classical field theories, the Faddeev-Skyrme model and the Ginzburg-Landau model, are studied numerically and analytically in this work. The aim is to gain information on the existence and properties of these topological solitons, their structure, and their behaviour under relaxation. First, the conditions and mechanisms leading to the possibility of topological solitons are explored from the field-theoretical point of view. This leads one to consider continuous deformations of the solutions of the equations of motion. The results of algebraic topology necessary for the systematic treatment of such deformations are reviewed, and methods of determining the homotopy classes of topological solitons are presented. The Faddeev-Skyrme and Ginzburg-Landau models are presented, some earlier results are reviewed, and the numerical methods used in this work are described. The topological solitons of the Faddeev-Skyrme model, Hopfions, are found to follow the same mechanisms of relaxation in three different domains with three different topological classifications. For two of the domains, the necessary but unusual topological classification is presented. Finite-size topological solitons are not found in the Ginzburg-Landau model, and a scaling argument is used to suggest that there are indeed none unless a certain modification to the model, due to R. S. Ward, is made. In that case, the Hopfions of the Faddeev-Skyrme model are seen to be present for some parameter values. A boundary in the parameter space separating the region where the Hopfions exist from the region where they do not is found, and the behaviour of the Hopfion energy on this boundary is studied.
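
For reference, the standard topological classification behind Hopfions: finite-energy Faddeev-Skyrme configurations extend to maps from the compactified space to the two-sphere and fall into homotopy classes labelled by the Hopf invariant. This is the textbook statement; the unusual classifications for the thesis's modified domains are not reproduced here.

```latex
n:\ \mathbb{R}^3 \cup \{\infty\} \cong S^3 \longrightarrow S^2,
\qquad
\pi_3(S^2) \cong \mathbb{Z},
```

so each soliton sector carries an integer Hopf invariant $Q_H \in \mathbb{Z}$.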

Relevance:

100.00%

Publisher:

Abstract:

We introduce a new notion for the deformation of Gabor systems. Such deformations are in general nonlinear and, in particular, include the standard jitter error and linear deformations of phase space. With this new notion we prove a strong deformation result for Gabor frames and Gabor Riesz sequences that covers the known perturbation and deformation results. Our proof of the deformation theorem requires a new characterization of Gabor frames and Gabor Riesz sequences. It is in the style of Beurling's characterization of sets of sampling for bandlimited functions and extends significantly the known characterization of Gabor frames 'without inequalities' from lattices to non-uniform sets.
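
For context, the textbook definitions behind the terms above; here $\Lambda \subset \mathbb{R}^2$ is a (possibly non-uniform) point set in phase space and $\pi(\lambda)$ a time-frequency shift. This is the standard frame inequality, not the paper's new characterization:

```latex
\mathcal{G}(g,\Lambda) = \{\pi(\lambda) g : \lambda \in \Lambda\},
\qquad
\pi(\lambda) g(t) = e^{2\pi i \xi t}\, g(t-u), \quad \lambda = (u,\xi),

A\,\|f\|_2^2 \;\le\; \sum_{\lambda \in \Lambda} |\langle f, \pi(\lambda) g \rangle|^2 \;\le\; B\,\|f\|_2^2
\quad \text{for all } f \in L^2(\mathbb{R}).
```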

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the present study was to examine the clinical validation of a Virtual Reality Environment (VRE) designed to normalize eating patterns in Eating Disorders (ED). The efficacy of VR in eliciting emotions, and the sense of presence and reality produced by the VRE, were explored in 22 ED patients and 37 healthy-eating individuals. The (non-immersive) VRE consisted of a kitchen room where participants had to eat a virtual pizza. In order to assess the sense of presence and reality produced by the VRE, participants answered seven questions on a Likert scale (0-10) during the experience, and then filled out the Reality Judgment and Presence Questionnaire (RJPQ) and the ITC-Sense of Presence Inventory (ITC-SOPI). The results showed that the VRE induced a sense of presence and was felt to be real by both groups, with no differences in the experience of 'ease' with the VRE, the sense of physical space, or the ecological validity assigned to the virtual kitchen and to eating virtually. However, the ED patients reported paying more attention and experiencing greater emotional involvement and dysphoria after virtual eating. The results suggest that the VRE was clinically meaningful to the ED patients and might be a relevant therapy tool for normalizing their eating patterns.

Relevance:

100.00%

Publisher:

Abstract:

The efficiency of the chemiluminescent luminol method and of the colorimetric DPPH and ABTS methods in evaluating the antiradical capacity of pure compounds and of plant extracts with antioxidant potential is compared. In the case of pure compounds, the values of the parameter 'n' (the number of radicals quenched per molecule of antiradical) for ascorbic acid, p-hydroquinone, catechol, quercetin, and rutin are similar when measured by the colorimetric assays; however, considerably lower values of n are obtained with the luminol assay. The antiradical activity of extracts from male and female individuals of Baccharis burchelli and Baccharis crispa was determined by the luminol assay and expressed using the new Trolox® percentage (%Trolox®) parameter.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the thesis is to generate scenarios for the future purposes and uses of ships suitable for STX Finland Cruise Oy to design and build, over a 50-year time span, by applying the Delphi method and an open innovation approach in a future workshop. The scenarios were mapped out with the help of two Delphi survey rounds and one future workshop; each survey and the workshop involved some twenty experts representing various fields. On the basis of the first survey round, four subject areas were selected for analysis: purposes for the use of ships; energy efficiency of cruises and ships; cost efficiency of sea transportation and vacation; and the views and expectations of customers in the future. As a result of the future workshop, four themes were established and studied further during the second Delphi round: future service and operation concepts; versatile uses of the space in ships; communication of the environmental benefits of ships and future energy solutions; and social interaction between passengers on board. In addition to generating the scenarios, a further aim of the thesis is to establish the Delphi method and workshop activity as foresight tools for STX Europe and to chart a future shipbuilding foresight community that can serve the open innovation processes of the maritime cluster as a whole.

Relevance:

100.00%

Publisher:

Abstract:

In this Licentiate thesis we investigate the absolute-ratio metric δ, the j and ˜j metrics, and the hyperbolic metric ρ, together with their relations to each other. Various growth estimates are given for quasiconformal maps, both in the plane and in space. Some Hölder constants are refined with respect to the δ, j, and ˜j metrics. We also give some new results on the Hölder continuity of quasiconformal and quasiregular mappings of the unit ball with respect to the Euclidean and hyperbolic metrics, refining results obtained by many authors in the 1980s. Applications are given to the study of metric spaces and of quasiconformal and quasiregular maps in the plane as well as in space.