969 results for Multi-instance Fusion
Abstract:
Field lab: Business project
Abstract:
Information systems are widespread, used by individuals with computing devices as well as by corporations and governments. Security leaks are often introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
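The key idea, a security level that depends on a runtime value stored in another field, can be sketched as a dynamic check. This is only an illustrative sketch with hypothetical record fields and labels, not the thesis's typed language or its static analysis:

```python
# Illustrative sketch (hypothetical fields/labels): the security label of
# the "salary" field depends on the runtime value of the "role" field.

def label_of(record):
    # Salaries of managers are secret; other salaries are public.
    return "secret" if record["role"] == "manager" else "public"

def can_flow(source_label, sink_label):
    # Information may only flow to an equal or higher security level.
    order = {"public": 0, "secret": 1}
    return order[source_label] <= order[sink_label]

manager = {"role": "manager", "salary": 90000}
intern = {"role": "intern", "salary": 20000}

# Writing the manager's salary to a public sink would be a leak,
# while the intern's salary may flow to a public sink.
leak = not can_flow(label_of(manager), "public")
ok = can_flow(label_of(intern), "public")
```

A dependent information flow type performs this kind of reasoning statically, so the leak is rejected at development time rather than detected at runtime.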
Abstract:
This study aimed to investigate the fatigue life of unused (new) endodontic instruments made of NiTi with controlled memory by Coltene™ and subjected to the multiple curvatures of a mandibular first molar root canal. Additionally, the instruments' structural behaviour was analysed through non-linear finite element analysis (FEA). The fatigue life of twelve Hyflex™ CM files was assessed while they were forced to adopt a stance with multiple radii of curvature, similar to those usually found in a mandibular first molar root canal; nine of them were subjected to pecking motion, a relative movement of the axial type. To achieve this, an experimental setup was designed to time the instruments until fracture while they worked inside a stainless steel mandibular first molar model with relative axial motion simulating the pecking motion. The model's root canal multi-curvature was confirmed by radiography. The non-linear finite element analysis was conducted using the computer-aided design software package SolidWorks™ Simulation. In order to define the imposed displacement required by the FEA, an endodontic instrument with simplified geometry was modelled using SolidWorks™ and the geometry of the root canal CAD model was subsequently analysed. The experimental results showed that the instruments subjected to pecking motion displayed higher fatigue life values and longer fractured tips than those with only rotational relative movement. For identical conditions, the non-linear finite element analyses showed maximum values of the first principal stress lower than the yield strength of the material, located in positions similar to the instruments' fracture locations determined by the experimental testing.
Abstract:
It is impossible to ignore the changes that the internet and e-commerce have caused in the retail sector, raising customers' expectations and forcing retailers to adapt their business to the new digital era. The internet is characterized by increased accessibility for everyone, which is not always an advantage: luxury products, for instance, rely on a sense of exclusivity rather than on being accessible to everyone. Hence, the internet represents a challenge for luxury brands since, although they are able to provide a complete service to their customers, they need to maintain the exclusiveness on which luxury is sustained. Consequently, the emergence of omni-channel retail was a particular challenge for the luxury sector, given the need to provide a fully integrated experience across different channels. The aim of this dissertation is to find out how important omni-channel is, even in the luxury industry, and how it is actually implemented, based on the case of one of the most successful companies in the luxury fashion e-commerce industry, Farfetch. Although the company started in London, its founder is a Portuguese entrepreneur, and it is in Portugal that most of its employees work, divided between two offices, Guimarães and Porto. A literature review was written on relevant concepts and ideas about luxury, e-commerce and the different channel approaches. Five propositions were formulated and then discussed according to the information gathered about the company and its strategies. In the end, it was possible to identify which propositions are in accordance with theory and which are not, as well as to understand the most important strategies and trends regarding omni-channel in the luxury fashion e-commerce sector.
Abstract:
This paper presents a methodology based on Bayesian data fusion techniques applied to non-destructive and destructive tests for the structural assessment of historical constructions. The aim of the methodology is to reduce the uncertainty of the parameter estimation; the Young's modulus of granite stones was chosen as the example for the present paper. The methodology considers several levels of uncertainty, since the parameters of interest are treated as random variables with random moments. A new concept of Trust Factor was introduced to weight the uncertainty associated with each test result, expressed by its standard deviation, according to the higher or lower reliability of each test in predicting a given parameter.
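A minimal sketch of the underlying idea, precision-weighted fusion of test results, is given below. The way the trust factor scales each test's effective uncertainty, and the numbers used, are assumptions for illustration only; the paper's exact formulation may differ:

```python
# Hedged sketch: precision-weighted fusion of several test results.
# Each test gives (mean, std, trust); trust in (0, 1] is a hypothetical
# Trust Factor that deflates the weight of less reliable tests.

def fuse(estimates):
    # Weight of each test: trust divided by its variance.
    weights = [trust / std ** 2 for (_, std, trust) in estimates]
    total = sum(weights)
    mean = sum(w * m for w, (m, _, _) in zip(weights, estimates)) / total
    std = (1.0 / total) ** 0.5
    return mean, std

# Fusing a less reliable sonic test with a trusted compression test
# (illustrative Young's modulus values in GPa):
mean, std = fuse([(20.0, 4.0, 0.5), (24.0, 2.0, 1.0)])
```

The fused estimate is pulled toward the more trusted, lower-variance test, and its standard deviation is smaller than that of either test alone, which is the uncertainty reduction the methodology targets.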
Abstract:
Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Often, earthworks are the most costly and time-consuming component of infrastructure construction (e.g., roads, railways and airports), and current pressure for higher productivity and safety highlights the need to optimize earthworks, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of a multi-objective and global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (costs and duration) and focus the evolutionary search (non-dominated sorting genetic algorithm-II) on compaction allocation, using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were held using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
Abstract:
Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, setting appropriate weights on the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, one single-objective and two multi-objective, in two tasks related to weight setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices and the second considers the possibility of link failures. The methods thus need to simultaneously optimize for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs, such as SPEA2 and NSGA-II, came naturally, and these were compared to a single-objective EA. The results show a remarkable behavior of NSGA-II in all proposed tasks, scaling well for harder instances, and thus presenting itself as the most promising option for TE in these scenarios.
Abstract:
Earthworks tasks aim at levelling the ground surface at a target construction area and precede any kind of structural construction (e.g., road and railway construction). They comprise sequential tasks, such as excavation, transportation, spreading and compaction, and are strongly based on heavy mechanical equipment and repetitive processes. Under this context, it is essential to optimize the usage of all available resources under two key criteria: the costs and duration of earthwork projects. In this paper, we present an integrated system that uses two artificial intelligence based techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation considering the two main earthwork objectives (duration and cost). Experiments using real-world data from a construction site have shown that the proposed system is competitive when compared with current manual earthwork design.
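The core of such multi-objective optimization is Pareto dominance over the two criteria: an allocation is kept only if no other allocation is at least as good in both cost and duration and strictly better in one. A minimal sketch with entirely hypothetical (cost, duration) data:

```python
# Minimal sketch of non-dominated (Pareto) filtering over two
# minimization objectives; the allocation data below is hypothetical.

def dominates(a, b):
    # a dominates b if it is no worse in every objective and differs
    # from b (hence strictly better in at least one objective).
    return all(x <= y for x, y in zip(a, b)) and a != b

def pareto_front(points):
    # Keep the points not dominated by any other point.
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Candidate equipment allocations as (cost in k-euro, duration in days):
allocations = [(120, 30), (100, 40), (150, 25), (125, 33), (140, 32)]
front = pareto_front(allocations)
# (125, 33) and (140, 32) are dominated by (120, 30) and are dropped.
```

A multi-objective EA such as NSGA-II builds on this dominance relation, evolving the whole front at once so a decision maker can trade cost against duration afterwards.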
Abstract:
Long-term applications of leguminous green mulch could increase mineralizable nitrogen (N) beneath cupuaçu trees produced on the infertile acidic Ultisols and Oxisols of the Amazon Basin. However, low quality standing cupuaçu litter could interfere with green mulch N release and soil N mineralization. This study compared mineral N, total N, and microbial biomass N beneath cupuaçu trees grown in two different agroforestry systems, north of Manaus, Brazil, following seven years of different green mulch application rates. To test for net interactions between green mulch and cupuaçu litter, dried gliricidia and inga leaves were mixed with senescent cupuaçu leaves, surface applied to an Oxisol soil, and incubated in a greenhouse for 162 days. Leaf decomposition, N release and soil N mineralization were periodically measured in the mixed species litter treatments and compared to single species applications. The effect of legume biomass and cupuaçu litter on soil mineral N was additive, implying that recommendations for green mulch applications to cupuaçu trees can be based on N dynamics of individual green mulch species. Results demonstrated that residue quality, not quantity, was the dominant factor affecting the rate of N release from leaves and soil N mineralization in a controlled environment. In the field, complex N cycling and other factors, including soil fauna, roots, and microclimatic effects, had a stronger influence on available soil N than residue quality.
Abstract:
Scientific and technological advancements in the area of fibrous and textile materials have greatly enhanced their application potential in several high-end technical and industrial sectors including construction, transportation, medical, sports, aerospace engineering, electronics and so on. Excellent performance accompanied by light weight, mechanical flexibility, tailorability, design flexibility, easy fabrication and relatively lower cost are the driving forces towards wide applications of these materials. Cost-effective fabrication of various advanced and functional materials for structural parts, medical devices, sensors, energy harvesting devices, capacitors, batteries, and many others has been possible using fibrous and textile materials. Structural membranes are one of the innovative applications of textile structures, and these novel building skins are becoming very popular due to flexible design aesthetics, durability, light weight and cost benefits. Current demand for high performance and multi-functional materials in structural applications has motivated efforts to go beyond the basic textile structures used for structural membranes and to use innovative textile materials. Structural membranes with self-cleaning, thermoregulation and energy harvesting capability (using solar cells) are examples of such recently developed multi-functional membranes. Besides these, there exist enormous opportunities to develop wide varieties of multi-functional membranes using functional textile materials. Additionally, it is also possible to further enhance the performance and functionalities of structural membranes using advanced fibrous architectures such as 2D, 3D, hybrid, multi-layer and so on. In this context, the present paper gives an overview of various advanced and functional fibrous and textile materials which have enormous application potential in structural membranes.
Abstract:
We search for evidence of physics beyond the Standard Model in the production of final states with multiple high transverse momentum jets, using 20.3 fb−1 of proton-proton collision data recorded by the ATLAS detector at √s = 8 TeV. No excess of events beyond Standard Model expectations is observed, and upper limits on the visible cross-section for non-Standard Model production of multi-jet final states are set. Using a wide variety of models for black hole and string ball production and decay, the limit on the cross-section times acceptance is as low as 0.16 fb at the 95% CL for a minimum scalar sum of jet transverse momentum in the event of about 4.3 TeV. Using models for black hole and string ball production and decay, exclusion contours are determined as a function of the production mass threshold and the gravity scale. These limits can be interpreted in terms of lower-mass limits on black hole and string ball production that range from 4.6 to 6.2 TeV.
Abstract:
A search for a charged Higgs boson, H±, decaying to a W± boson and a Z boson is presented. The search is based on 20.3 fb−1 of proton-proton collision data at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the LHC. The H± boson is assumed to be produced via vector-boson fusion, and the decays W± → qq̄′ and Z → e+e−/μ+μ− are considered. The search is performed in a range of charged Higgs boson masses from 200 to 1000 GeV. No evidence for the production of an H± boson is observed. Upper limits of 31 to 1020 fb at 95% CL are placed on the cross section for vector-boson fusion production of an H± boson times its branching fraction to W±Z. The limits are compared with predictions from the Georgi-Machacek Higgs Triplet Model.
Abstract:
A search for heavy long-lived multi-charged particles is performed using the ATLAS detector at the LHC. Data collected in 2012 at √s = 8 TeV from pp collisions corresponding to an integrated luminosity of 20.3 fb−1 are examined. Particles producing anomalously high ionisation, consistent with long-lived massive particles with electric charges from |q| = 2e to |q| = 6e, are searched for. No signal candidate events are observed, and 95% confidence level cross-section upper limits are interpreted as lower mass limits for a Drell–Yan production model. The mass limits range between 660 and 785 GeV.
Abstract:
In this paper, we propose an extension of the firefly algorithm (FA) to multi-objective optimization. FA is a swarm intelligence optimization algorithm, inspired by the flashing behavior of fireflies at night, that is capable of computing global solutions to continuous optimization problems. Our proposal relies on a fitness assignment scheme that gives lower fitness values to the positions of fireflies that correspond to non-dominated points with a smaller aggregation of objective function distances to the minimum values. Furthermore, FA randomness is based on the spread metric to reduce the gaps between consecutive non-dominated solutions. The results from the preliminary computational experiments show that our proposal gives a dense and well distributed approximate Pareto front with a large number of points.
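A fitness assignment in the spirit described above can be sketched as follows. The penalty constant and the use of a simple sum of distances to the per-objective minima are assumptions for illustration; the paper's exact scheme may differ:

```python
# Hedged sketch of a fitness assignment for multi-objective search:
# non-dominated points receive the sum of their distances to the
# per-objective minima (smaller is better), while dominated points are
# pushed back by a large penalty (an assumed constant).

def dominates(a, b):
    # Weak dominance in every objective, strict in at least one.
    return all(x <= y for x, y in zip(a, b)) and a != b

def assign_fitness(objs):
    ideal = [min(o[k] for o in objs) for k in range(len(objs[0]))]
    fitness = []
    for p in objs:
        dominated = any(dominates(q, p) for q in objs)
        dist = sum(x - i for x, i in zip(p, ideal))
        fitness.append(dist + (1e6 if dominated else 0.0))
    return fitness

# Four objective vectors; (3.0, 3.0) is dominated by (2.0, 2.0):
objs = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
fit = assign_fitness(objs)
```

With this assignment, the non-dominated point closest to the ideal point gets the lowest fitness, which is what steers the firefly swarm toward a dense, well-distributed front.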