948 results for distributed model
Abstract:
We propose three research problems to explore the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted as a two-layer structure: the top layer runs a trust-based consensus algorithm, and the bottom layer executes a global trust update scheme as a subroutine. We utilize a set of pre-trusted nodes, called headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it can easily be extended to incorporate more complicated decision rules and global trust schemes. The first problem assumes that normal nodes are homogeneous, i.e., a normal node is guaranteed to behave as programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers. In the second part of the thesis, we consider a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge learning task, however, scalar-valued trust is not sufficient to reflect a worker's trustworthiness in each of the domains.
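The trust-gated averaging idea described above can be sketched in a few lines. This is an illustrative toy, not the thesis's two-layer algorithm (no headers, no global trust update scheme): nodes simply ignore neighbours whose trust score falls below a threshold and average over the rest. All node names, trust values, and parameters are made up.

```python
def trust_weighted_consensus(values, trust, rounds=50, threshold=0.5):
    """values: dict node -> scalar state; trust: dict node -> trust in [0, 1]."""
    values = dict(values)
    for _ in range(rounds):
        # Only nodes above the trust threshold contribute to the average.
        trusted = {n: v for n, v in values.items() if trust[n] >= threshold}
        avg = sum(trusted.values()) / len(trusted)
        # Normal (trusted) nodes move toward the trusted average; the
        # adversary keeps broadcasting its arbitrary value but is ignored.
        for n in values:
            if trust[n] >= threshold:
                values[n] = 0.5 * values[n] + 0.5 * avg
    return values

# Node 'c' is adversarial: low trust, extreme value.
vals = {'a': 1.0, 'b': 3.0, 'c': 100.0}
trust = {'a': 0.9, 'b': 0.8, 'c': 0.1}
final = trust_weighted_consensus(vals, trust)
```

The trusted nodes converge to the average of their own initial states (2.0), unaffected by the adversary's input.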
To address this issue, we propose a probabilistic model to jointly infer multi-dimensional trust of workers, multi-domain properties of questions, and true labels of questions. Our model is flexible and extensible enough to incorporate metadata associated with questions. To show this, we further propose two extended models: one handles tasks with real-valued features, and the other handles tasks with text features by incorporating topic models. Our models can effectively recover workers' trust vectors, which can be very useful for future task assignment that adapts to workers' trust. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine learning results, or a hybrid of them. In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that questions are often not independent in real-life applications; instead, there are logical relations between them. Similarly, the workers that provide answers are not independent of each other either: answers given by workers with similar attributes tend to be correlated. We therefore propose a novel unified graphical model consisting of two layers. The top layer encodes domain knowledge, allowing users to express logical relations using first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers. The third part of the thesis considers the problem of optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost.
The cost is associated with both the trustworthiness of the worker and the difficulty of the task. Typically, access to expert-level (more trustworthy) workers is more expensive than access to the average crowd, and completing a challenging task costs more than a click-away question. Here we address the problem of optimally assigning heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as input the estimated trust of workers and a pre-set budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that naturally relates budget, crowd trustworthiness, and the costs of obtaining labels: a higher budget, more trustworthy crowds, and less costly jobs yield a lower theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component, so it can be combined with generic trust evaluation algorithms.
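As an illustration of trust-aware allocation under a budget, consider a hypothetical greedy sketch (not the thesis's algorithm, and carrying none of its optimality guarantees): assign the hardest tasks first to the cheapest worker whose trust meets the task's requirement, stopping when the budget runs out. All names, trust levels, and costs are invented.

```python
def allocate(tasks, workers, budget):
    """tasks: list of (task_id, required_trust); workers: list of
    (worker_id, trust, cost_per_task). Returns (assignment, spent)."""
    assignment, spent = {}, 0.0
    # Hardest tasks first, so scarce expert capacity is not wasted.
    for task_id, required in sorted(tasks, key=lambda t: -t[1]):
        candidates = [w for w in workers if w[1] >= required]
        if not candidates:
            continue  # no worker is trustworthy enough for this task
        worker_id, _, cost = min(candidates, key=lambda w: w[2])
        if spent + cost <= budget:
            assignment[task_id] = worker_id
            spent += cost
    return assignment, spent

workers = [('expert', 0.95, 3.0), ('crowd', 0.70, 1.0)]
tasks = [('easy', 0.6), ('hard', 0.9)]
assignment, spent = allocate(tasks, workers, budget=4.0)
```

Here the hard task goes to the expensive expert and the easy one to the cheap crowd worker, exhausting the budget of 4.0.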
Abstract:
Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests for global differences in trait values across the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations where only a few trait values are available in a rare genotype category (imbalance), or where the values associated with the three genotype categories exhibit unequal variance (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows model-specific testing of a dominant, additive, or recessive mode of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise-comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that, unlike the standard parametric approaches, the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity.
We provide the publicly available R package nparcomp, which can be used to estimate simultaneous confidence intervals or compatible multiplicity-adjusted p-values associated with the proposed maximum test.
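The idea of a maximum test over inheritance modes can be illustrated outside R with a small permutation-based sketch. This is a generic MAX-type construction in Python, not the Marcus-type contrast test implemented in nparcomp: it scores genotypes under dominant, additive, and recessive codings, takes the largest absolute correlation with the trait, and gets a p-value from permutations so that taking the maximum is accounted for. Genotype and trait data are synthetic.

```python
import random

def corr(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Genotype scorings for the three inheritance modes (genotype coded 0/1/2).
SCORES = {
    'dominant':  {0: 0, 1: 1, 2: 1},
    'additive':  {0: 0, 1: 1, 2: 2},
    'recessive': {0: 0, 1: 0, 2: 1},
}

def max_test(genotypes, trait, n_perm=2000, seed=1):
    """MAX statistic: largest |correlation| over the three scorings,
    with a permutation p-value."""
    def stat(g):
        return max(abs(corr([s[v] for v in g], trait)) for s in SCORES.values())
    observed = stat(genotypes)
    rng = random.Random(seed)
    g, hits = list(genotypes), 0
    for _ in range(n_perm):
        rng.shuffle(g)  # break the genotype-trait link
        if stat(g) >= observed - 1e-12:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

# Synthetic recessive effect: only genotype-2 carriers have elevated trait values.
genotypes = [0, 0, 0, 1, 1, 1, 2, 2, 2]
trait = [0.9, 1.0, 1.1, 0.95, 1.0, 1.05, 1.9, 2.0, 2.1]
observed, p = max_test(genotypes, trait)
```

The recessive scoring drives the maximum here, and the permutation p-value stays well below 0.05 despite the multiplicity of trying all three modes.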
Abstract:
A new type of space debris was recently discovered by Schildknecht in near-geosynchronous orbit (GEO). These objects were later identified as exhibiting the properties of High Area-to-Mass Ratio (HAMR) objects. Based on their brightness magnitudes (light curves), high rotation rates, and composition properties (albedo, amount of specular and diffuse reflection, colour, etc.), it is thought that these objects are multilayer insulation (MLI). Observations have shown that this debris type is very sensitive to environmental disturbances, particularly solar radiation pressure, because their shapes are easily deformed, leading to changes in the area-to-mass ratio (AMR) over time. This thesis proposes a simple, effective flexible model of the thin, deformable membrane, using two different methods. Firstly, the debris is modelled with Finite Element Analysis (FEA) using Bernoulli-Euler beam theory, called the "Bernoulli model". The Bernoulli model is constructed from beam elements consisting of two nodes, each with six degrees of freedom (DoF); the mass of the membrane is distributed across the beam elements. Secondly, the debris is modelled using multibody dynamics theory, called the "Multibody model", as a series of lumped masses connected through flexible joints, representing the flexibility of the membrane itself. The mass of the membrane, albeit low, is accounted for by the lumped masses at the joints. The dynamic equations for the masses, including the constraints imposed by the connecting rigid rods, are derived using fundamental Newtonian mechanics. The physical properties required by both flexible models (membrane density, reflectivity, composition, etc.) are assumed to be those of multilayer insulation.
Both flexible membrane models are then propagated, together with classical orbital and attitude equations of motion, in the near-GEO region to predict the orbital evolution under the perturbations of solar radiation pressure, the Earth's gravity field, luni-solar gravitational fields, and the self-shadowing effect. The results are then compared to two rigid-body models (a cannonball and a flat rigid plate). Compared with the rigid models, the evolution of the orbital elements of the flexible models shows differences in the inclination and secular eccentricity evolution, rapid irregular attitude motion, and an unstable cross-sectional area due to deformation over time. Monte Carlo simulations varying the initial attitude dynamics and deformation angle are then investigated and compared with the rigid models over 100 days. The simulations show that different initial conditions produce distinct orbital motions, significantly different from those of both rigid models. Furthermore, this thesis presents a methodology to determine the dynamic material properties of thin membranes and validates the deformation of the multibody model with real MLI materials. Experiments are performed in a high-vacuum chamber (10^-4 mbar) replicating the space environment, with a thin membrane hinged at one end but free at the other. The first experiment, the free motion experiment, is a free vibration test to determine the damping coefficient and natural frequency of the thin membrane. In this test, the membrane is allowed to fall freely in the chamber, with the motion tracked and captured through high-speed video frames. A Kalman filter is implemented in the tracking algorithm to reduce noise and increase the tracking accuracy of the oscillating motion. The last test, the forced motion experiment, is performed to determine the deformation characteristics of the object.
A high-power spotlight (500-2000 W) is used to illuminate the MLI, and the displacements are measured by means of a high-resolution laser sensor. Finite Element Analysis (FEA) and multibody dynamics models of the experimental setups are used to validate the flexible models against the measured displacements and natural frequencies.
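A free vibration test of the kind described above is typically post-processed with the logarithmic decrement. The sketch below is illustrative only (the peak data are synthetic, not from the thesis's experiments): it recovers the damping ratio and natural frequency from successive displacement peaks.

```python
import math

def damping_from_peaks(peak_times, peak_amps):
    """peak_times/peak_amps: successive same-sign peaks of a free vibration."""
    n = len(peak_amps) - 1
    delta = math.log(peak_amps[0] / peak_amps[-1]) / n   # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)  # damping ratio
    period = (peak_times[-1] - peak_times[0]) / n            # damped period
    omega_d = 2 * math.pi / period                           # damped frequency
    omega_n = omega_d / math.sqrt(1 - zeta ** 2)             # natural frequency
    return zeta, omega_n

# Synthetic peak envelope of x(t) = exp(-zeta*wn*t)*cos(wd*t), zeta=0.05, wn=10.
zeta_true, wn_true = 0.05, 10.0
wd = wn_true * math.sqrt(1 - zeta_true ** 2)
times = [2 * math.pi * k / wd for k in range(5)]
amps = [math.exp(-zeta_true * wn_true * t) for t in times]
zeta, wn = damping_from_peaks(times, amps)
```

On noise-free synthetic peaks the method recovers the damping ratio and natural frequency exactly; on tracked video data it would be applied to the filtered peak estimates.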
Abstract:
The Ocean Model Intercomparison Project (OMIP) aims to provide a framework for evaluating, understanding, and improving the ocean and sea-ice components of global climate and earth system models contributing to the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses these aims in two complementary manners: (A) by providing an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing, (B) by providing a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) offering details for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows that of the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, as well as the ocean/sea-ice OMIP simulations. The bulk of this paper offers scientific rationale for saving these diagnostics.
Abstract:
The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
Abstract:
Part 11: Reference and Conceptual Models
Abstract:
In this article, we develop a disaggregated version of the post-Keynesian approach to economic growth, showing that this model can in fact be treated as a particular case of the Pasinettian model of structural change and economic growth. Using the concept of vertical integration, it becomes possible to carry the analysis initiated by Kaldor (1956) and Robinson (1956, 1962), and followed by Dutt (1984), Rowthorn (1982) and, later, Bhaduri and Marglin (1990), into a multi-sectoral model in which demand and productivity grow at different rates in each sector. By adopting this approach, it is possible to show that the dynamics of structural change are conditioned not only by the evolution of demand patterns, preferences, and the diffusion of technological progress, but also by the distributive characteristics of the economy, which can give rise to different sectoral regimes of economic growth. Moreover, it is possible to determine the natural rate of profit that keeps the mark-up rate constant over time.
Abstract:
The rise of neoliberalism and the experience of several economic crises throughout the 1960s and 70s opened the way to questioning the ability of the welfare state to satisfy the basic needs of societies. The term "welfare state" therefore gave way to "welfare regime", in which responsibility for the well-being of society is distributed among state, market, and families. Following the introduction of this new term, several typologies of welfare regimes began to be discussed. Esping-Andersen's (1990) regime typology is considered one of the most significant, covering most European countries. On the other hand, it has also drawn criticism for lacking several aspects. One such criticism, raised by Ferrera (1996), Moreno (2001), Bonoli (1997), and Leibfried (1992), questions the grouping of the Mediterranean countries of Europe (Greece, Italy, Spain, and Portugal) within the conservative regime type. These authors affirm that Southern European countries have peculiar features in the structure of their welfare provision and form a fourth type, which may be called the "Mediterranean/Southern European regime". This doctoral thesis carries the discussion one step further with in-depth research to answer some fundamental questions. Our first objective is to clarify whether it is possible to speak of a coherent grouping of the Mediterranean countries of Southern Europe in terms of their welfare regimes. Then, assuming an affirmative answer, we aim to characterize the features of this grouping. These group features are not static in time, however, and are sensitive to various economic changes...
Abstract:
Sepsis is commonly associated with brain dysfunction, but the underlying mechanisms remain unclear, although mitochondrial dysfunction and microvascular abnormalities have been implicated. We therefore assessed whether cerebral mitochondrial dysfunction during systemic endotoxemia in mice increased mitochondrial sensitivity to a further bioenergetic insult (hypoxemia), and whether hypothermia could improve outcome. Mice (C57BL/6) were injected intraperitoneally with lipopolysaccharide (LPS) (5 mg/kg; n = 85) or saline (0.01 ml/g; n = 47). Six, 24 and 48 h later, we used confocal imaging in vivo to assess cerebral mitochondrial redox potential and cortical oxygenation in response to changes in inspired oxygen. The fraction of inspired oxygen (FiO2) at which the cortical redox potential changed was compared between groups. In a subset of animals, spontaneous hypothermia was maintained or controlled hypothermia induced during imaging. Decreasing FiO2 resulted in a more reduced cerebral redox state around veins, but preserved oxidation around arteries. This pattern appeared at a higher FiO2 in LPS-injected animals, suggesting an increased sensitivity of cortical mitochondria to hypoxemia. This increased sensitivity was accompanied by a decrease in cortical oxygenation, but was attenuated by hypothermia. These results suggest that systemic endotoxemia influences cortical oxygenation and mitochondrial function, and that therapeutic hypothermia can be protective.
Abstract:
Data sources are often geographically dispersed in real-life applications. Finding a knowledge model may require joining all the data sources and running a machine learning algorithm on the joint set. We present an alternative based on a Multi-Agent System (MAS): an agent mines one data source to extract a local theory (knowledge model) and then merges it with the previous MAS theory using a knowledge fusion technique. In this way, we obtain a global theory that summarizes the distributed knowledge without spending resources and time joining the data sources. New experiments, including statistical significance analysis, have been executed. The results show that, as a result of knowledge fusion, the accuracy of the initial theories is significantly improved, as well as the accuracy of the monolithic solution.
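A minimal sketch of the agent-and-merge idea (not the paper's actual fusion technique; the rule class and data are invented for illustration): each agent learns a local threshold rule from its own data source, and the merged "global theory" is a majority vote over the local rules, so no data source ever needs to be joined with another.

```python
def learn_threshold(data):
    """data: list of (x, label). Pick the threshold minimising training error
    for the rule 'predict True iff x >= threshold'."""
    best = None
    for x, _ in data:
        errors = sum((xi >= x) != yi for xi, yi in data)
        if best is None or errors < best[1]:
            best = (x, errors)
    return best[0]

def fuse(thresholds):
    """Global theory: majority vote of the local threshold rules."""
    def predict(x):
        votes = sum(x >= t for t in thresholds)
        return votes * 2 > len(thresholds)
    return predict

# Three dispersed data sources drawn from the same concept (x >= 5 -> True).
sources = [
    [(1, False), (4, False), (6, True), (9, True)],
    [(2, False), (5, True), (7, True), (3, False)],
    [(0, False), (4, False), (5, True), (8, True)],
]
global_theory = fuse([learn_threshold(s) for s in sources])
```

Each local theory is learned without moving any data, and the fused vote smooths out the individual agents' slightly different thresholds.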
Abstract:
Nonhuman primates (NHPs) are important animal models for the study of human health and disease. In particular, the use of NHPs to study the vaginal microbiome and susceptibility to infections (such as HIV and herpesvirus) is exceptionally valuable due to the similarity in anatomy and physiology. An important aspect of this is maintaining a healthy vaginal microbiome, which minimizes colonization by pathogens and the resulting inflammation along the mucosa. In women, conditions such as bacterial vaginosis (BV) are frequently treated with antibiotics such as metronidazole or clindamycin. Due to the excessive use of antimicrobials in medicine and agriculture, alternative compounds and therapies to treat infections are highly desired. Approaches that have been developed and used for vaginal infections include natural antimicrobials such as essential oils, as well as probiotics and live cultures, which function like antibiotics but do not drive the development of resistance as classic antibiotics do. However, these approaches have been only minimally studied in humans and animals: evidence for the effectiveness of essential oils is anecdotal at best, while microbiome manipulation has been investigated more thoroughly. Novel products being distributed for medical use are monotherapies containing Lactobacillus, which colonize the vaginal mucosa (Ali et al., 2020; Brichacek et al., 2013; Lagenaur, Sanders-Beer, et al., 2011). Unfortunately, these therapies have limitations in durability and in individual response among women. By evaluating the extent to which the NHP vaginal mucosa can be colonized with exogenously delivered bacteria, this work will establish the NHP as an animal model for translational studies that use essential oils and beneficial microbiome bacteria for vaginal delivery.
Abstract:
Distributed argumentation technology is a computational approach incorporating argumentation reasoning mechanisms within multi-agent systems. For the formal foundations of distributed argumentation technology, in this thesis we conduct a principle-based analysis of structured argumentation as well as abstract multi-agent and abstract bipolar argumentation. The results of this principle-based analysis provide an overview of, and a guideline for, further applications of the theories. Moreover, in this thesis we explore distributed argumentation technology using distributed ledgers. We envision an Intelligent Human-input-based Blockchain Oracle (IHiBO), an artificial intelligence tool for storing argumentation reasoning. We propose a decentralized and secure architecture for conducting decision-making, addressing key concerns of trust, transparency, and immutability. We model fund management with agent argumentation in IHiBO and analyze its compliance with European fund management legal frameworks. We illustrate how bipolar argumentation balances pros and cons in legal reasoning in a legal divorce case, and how the strength of arguments in natural language can be represented in structured arguments. Finally, we discuss how distributed argumentation technology can be used to advance risk management, regulatory compliance of distributed ledgers for financial securities, and dialogue techniques.
Abstract:
The main objective of my thesis work is to exploit Kubeflow, Google's native open-source platform, and specifically Kubeflow Pipelines, to execute a scalable Federated Learning (FL) process in a simplified 5G-like test architecture hosting a Kubernetes cluster. I apply the widely adopted FedAvg algorithm and its optimization FedProx, empowered by the ML platform's abilities to ease the development and production cycle of this specific FL process. FL algorithms are increasingly promising and adopted both in cloud application development and in 5G communication enhancement, using data from the monitoring of the underlying telco infrastructure and performing training and data aggregation at edge nodes to optimize the algorithm's global model (which could be used, for example, for resource provisioning to reach an agreed QoS for the underlying network slice). After a study of the available papers and scientific articles on FL, and with the help of the CTTC, which suggested studying and using Kubeflow to support the algorithm, we found that this approach to the full FL deployment cycle was not documented and might be worth investigating in more depth. This study may help prove the suitability of the Kubeflow platform for developing new FL algorithms that will support new applications, and in particular tests the performance of FedAvg in a simulated client-to-cloud communication using the MNIST dataset as an FL benchmark.
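The aggregation step at the heart of FedAvg, independent of any Kubeflow or Kubernetes machinery, is a sample-size-weighted average of the client model weights. A minimal sketch with invented values:

```python
def fedavg(client_weights, client_sizes):
    """FedAvg server step: average client weight vectors, weighting each
    client by its number of local training samples.
    client_weights: list of weight vectors (lists of floats);
    client_sizes: local sample count per client."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients: the second holds three times as much data, so it dominates.
w_global = fedavg([[1.0, 0.0], [2.0, 4.0]], [10, 30])
```

In a full round, the server would broadcast `w_global` back to the clients for the next epoch of local training; FedProx differs mainly in adding a proximal term to each client's local objective.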
Abstract:
In this thesis, we state the collision avoidance problem as a vertex covering problem, and then consider a distributed framework in which a team of cooperating Unmanned Vehicles (UVs) aims to solve this optimization problem cooperatively to guarantee collision avoidance between group members. For this purpose, we implement a distributed control scheme based on a robust Set-Theoretic Model Predictive Control (ST-MPC) strategy, where the problem involves vehicles with independent dynamics but coupled constraints, to capture the required cooperative behavior.
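The vertex-cover view of collision avoidance can be illustrated with the classic greedy 2-approximation (a generic sketch, unrelated to the ST-MPC controller itself): every pairwise collision risk is an edge of a conflict graph, and a cover selects the vehicles that must take evasive action so that every risky pair contains at least one evading vehicle. Vehicle names and conflicts are hypothetical.

```python
def greedy_vertex_cover(edges):
    """Classic 2-approximation: for each uncovered edge, take both endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Conflict graph: UV pairs whose predicted trajectories come too close.
conflicts = [('uv1', 'uv2'), ('uv2', 'uv3'), ('uv4', 'uv5')]
cover = greedy_vertex_cover(conflicts)
```

Every conflict edge ends up with at least one endpoint in the cover, and the greedy result is at most twice the size of an optimal cover.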