884 results for Problem analysis
Resumo:
Malware detection is a growing problem, particularly on the Android mobile platform, due to its increasing popularity and the accessibility of numerous third-party app markets. The problem has been made worse by the increasingly sophisticated detection-avoidance techniques employed by emerging malware families, which calls for more effective techniques for the detection and classification of Android malware. Hence, in this paper we present an approach based on n-opcode analysis that uses machine learning to classify and categorize Android malware. The approach enables automated feature discovery, eliminating the need for expert or domain knowledge to define the required features. Our experiments on 2,520 samples, performed with up to 10-gram opcode features, showed that an F-measure of 98% is achievable with this approach.
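The n-gram opcode features the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's pipeline; the opcode sequence is hypothetical and stands in for a disassembled Dalvik method:

```python
from collections import Counter

def opcode_ngrams(opcodes, n):
    """Count contiguous n-grams in a decoded opcode sequence."""
    return Counter(tuple(opcodes[i:i + n]) for i in range(len(opcodes) - n + 1))

# Hypothetical opcode sequence from a disassembled method.
seq = ["const", "invoke", "move", "invoke", "move", "return"]
bigrams = opcode_ngrams(seq, 2)
print(bigrams[("invoke", "move")])  # this bigram occurs twice
```

Counts like these, computed per app for each n up to 10, would then form the feature vectors fed to a classifier.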
Resumo:
We investigate the performance of dual-hop two-way amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase imbalance (IQI) at the relay node. In particular, the effective signal-to-interference-plus-noise ratio (SINR) at both sources is derived. These SINRs are used to design an instantaneous power allocation scheme that maximizes the minimum SINR of the two sources under a total transmit power constraint. The solution to this optimization problem is determined analytically and used to evaluate the outage probability (OP) of the considered two-way AF relaying system. Both analytical and numerical results show that IQI can impose fundamental performance limits on two-way relaying that cannot be avoided by simply improving the channel conditions.
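The max-min structure of the power allocation described above can be illustrated with a deliberately simplified model in which each source's SINR grows linearly with its allocated power; the coefficients c1 and c2 are hypothetical stand-ins for the paper's derived IQI-impaired SINR expressions, not the actual formulas:

```python
def max_min_allocation(c1, c2, total_power):
    """Split total_power so that min(c1*p1, c2*p2) is maximized.
    At the optimum both SINRs are equal: c1*p1 = c2*p2."""
    p1 = total_power * c2 / (c1 + c2)
    p2 = total_power - p1
    return p1, p2, min(c1 * p1, c2 * p2)

p1, p2, worst_sinr = max_min_allocation(2.0, 1.0, 3.0)
print(p1, p2, worst_sinr)  # 1.0 2.0 2.0
```

The equalizing property (giving more power to the weaker link until both SINRs match) carries over to the true, nonlinear SINR expressions, where the balance point must be found analytically or numerically.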
Resumo:
There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question becomes how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can consider neither multiple performance measures jointly nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of the models, since the number of cases studied in such comparisons is usually small. Data from a comparison among general-purpose classifiers are used to show a practical application of our tests.
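The multinomial-Dirichlet idea can be sketched in a stripped-down, two-algorithm, single-measure form (the paper's tests handle multiple measures and competitors; the win/tie/loss counts below are hypothetical). With a Dirichlet prior, the posterior over outcome probabilities is again Dirichlet, and Dirichlet draws can be formed from gamma draws:

```python
import random

def posterior_prob_a_better(wins_a, ties, wins_b, prior=1.0, draws=20000):
    """Monte Carlo estimate of P(theta_A > theta_B) under a
    multinomial-Dirichlet model of win/tie/loss counts."""
    random.seed(0)  # fixed seed for reproducibility
    alphas = (wins_a + prior, ties + prior, wins_b + prior)
    hits = 0
    for _ in range(draws):
        # A normalized vector of gamma draws is a Dirichlet sample.
        g = [random.gammavariate(a, 1.0) for a in alphas]
        if g[0] > g[2]:  # normalization cancels in the comparison
            hits += 1
    return hits / draws

# Hypothetical outcome over 20 data sets: A wins 14, 2 ties, B wins 4.
print(posterior_prob_a_better(14, 2, 4))  # close to 1
```

A posterior probability near 1 would indicate strong evidence that algorithm A outperforms B on this measure.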
Resumo:
This paper describes an analysis performed for facial description in static images and video streams. The still-image context is analyzed first in order to decide the optimal classifier configuration for each problem: gender recognition, race classification, and the presence of glasses and a moustache. These results are then applied to representative samples extracted automatically in real time from video streams, achieving promising results in the facial description of 70 individuals in terms of gender, race, and the presence of glasses and a moustache.
Resumo:
Polymer optical fibers have historically occupied a niche as large-core flexible fibers operating over short distances. Beyond their practical passive application in short-haul communication, they constitute a promising research field as active devices with organic dopants. Organic dyes are preferred as dopants over organic semiconductors due to their higher optical cross section. Thus organic dyes are used as gain media in polymer fibers to develop efficient, narrow-linewidth laser sources tunable throughout the visible region, or optical amplifiers with high gain. Dyes incorporated in fiber form have an added advantage over other solid-state forms such as films, since less pump power is required to excite the molecules in the core of the fiber, thereby utilising the pump power effectively. In 1987, Muto et al. investigated a dye-doped step-index polymer fiber laser. Since then, numerous studies have been carried out in this area, demonstrating laser emission from step-index, graded-index and hollow optical fibers incorporating various dyes. Among these, Rhodamine 6G has been the most widely used laser dye for the last four decades. Rhodamine 6G has many desirable optical properties that make it preferable to other organic dyes such as Coumarin, Nile Blue and Curcumin. The research focuses on the implementation of efficient fiber lasers and amplifiers over short fiber distances. Developing efficient plastic lasers with electrical pumping could be a new direction in this field, for which the lowest possible threshold pump energy of the gain medium in the cavity is an important parameter. One way of improving the efficiency of the lasers, through low threshold pump energy, is to modify the gain of the amplifiers in the resonator/cavity. Progress in the field of Radiative Decay Engineering can pave the way to solving this problem.
Laser gain media consisting of dye-nanoparticle composites can improve efficiency by lowering the lasing threshold and enhancing photostability. The electric field confined near the surface of metal nanoparticles due to Localized Surface Plasmon Resonance can be very effective for exciting active centers and imparting high optical gain for lasing. Since the Surface Plasmon Resonance of gold and silver nanoparticles lies in the visible range, it can affect the spectral emission characteristics of organic dyes such as Rhodamine 6G through the plasmon field generated by the particles. The change in emission of a dye placed near metal nanoparticles depends on the plasmon field strength, which in turn depends on the type of metal, the size of the nanoparticle, the surface modification of the particle and the wavelength of the incident light. Progress in the fabrication of different types of nanostructures has led to the advent of nanospheres, nanoalloys, core-shell particles and nanowires, to name a few. The thesis deals with the fabrication and characterisation of polymer optical fibers with various metallic and bimetallic nanostructures incorporated in the gain media, for efficient fiber lasers with low threshold and improved photostability.
Resumo:
The use of the Design by Analysis (DBA) route is a modern trend in international pressure vessel and piping codes in mechanical engineering. However, to apply DBA to structures under variable mechanical and thermal loads, it is necessary to ensure that the plastic collapse modes, alternating plasticity and incremental collapse (with instantaneous plastic collapse as a particular case), are precluded. The tool available to achieve this is shakedown theory. Unfortunately, practical numerical applications of shakedown theory result in very large nonlinear optimization problems with nonlinear constraints. Precise, robust and efficient algorithms and finite elements to solve this problem in finite dimensions are more recent achievements. However, to solve real problems at an industrial level, it is also necessary to consider more realistic material properties and to perform 3D analysis. Limited kinematic hardening is a typical property of common steels and should be considered in realistic applications. In this paper, a new finite element with internal thermodynamic variables to model kinematic hardening materials is developed and tested. This element is a mixed ten-node tetrahedron; through an appropriate change of variables it is possible to embed it in the shakedown analysis software developed by Zouain and co-workers for elastic ideally-plastic materials, and then use it to perform 3D shakedown analysis in cases with limited kinematic hardening materials.
Resumo:
The Cyprus dispute accurately portrays the evolution of the conflict from ‘warfare to lawfare’ steeped in politics; this research has shown that the Cyprus problem has been, and will continue to be, one of the most judicialised disputes across the globe. Notwithstanding the ‘normalisation’ of affairs between the two ethno-religious groups on the island since the division in 1974, the Republic of Cyprus’ (RoC) European Union (EU) membership in 2004 failed to catalyse reunification and terminate the legal, political and economic isolation of the Turkish Cypriot community. So the question is: why does the powerful legal order of the EU continuously fail to tame the tiny troublesome island of Cyprus? This is a thesis on the interrelationship of the EU legal order and the Cyprus problem. A literal and depoliticised interpretation of EU law has been maintained throughout the EU’s dealings with Cyprus, both pre-accession and post-accession. The research has brought to light that this literal interpretation of EU law vis-à-vis Cyprus has in fact deepened the division on the island. Pessimists outnumber optimists so far as resolving this problem is concerned, and rightly so looking back over the last forty years of failed attempts to do just that: a diplomatic combat zone scattered with the bones of numerous mediators. This thesis will discuss how the decisions of the EU institutions, its Member States and specifically the European Court of Justice, despite conforming to the EU legal order, have managed to disregard the principle of equality on the divided island and thus prevent the promised upgrade of the status of the Turkish Cypriot community since 2004. Indeed, whether a positive or negative reading of the Union’s position towards the Cyprus problem is adopted, the case remains valid that an organisation based on the rule of law must maintain legitimacy, democracy, clarity and equality in the decisions of its institutions.
Overall, the aim of this research is to establish a link between the Union’s lack of success in building a bridge over troubled waters and the right of self-determination of the Turkish Cypriot community. The only way left for the EU to help resolve the Cyprus problem is to aim to broker a deal between the two Cypriot communities that permits the recognition of the Turkish Republic of Northern Cyprus (TRNC), or at least the ‘Taiwanisation’ of Northern Cyprus. Although there are many studies that address the impact of the EU on the conflict or on the RoC, which represents the government that has monopolised EU accession, the argument advanced in this thesis is that despite the alleged Europeanisation of the Turkish Cypriot community, the community is habitually disregarded because of the EU’s current legal framework and the Union’s lack of a conflict transformation strategy vis-à-vis the island. Since the self-declared TRNC is not recognised and EU law is suspended in northern Cyprus in accordance with Protocol No 10 on Cyprus of the Act of Accession 2003, the Turkish Cypriots represent an idiomatic partner of Brussels, yet relations between the two resemble the experience of EU enlargement: the EU’s relevance to the community has been based on the prospects for EU accession (via reunification) and on assistance towards preparation for potential EU integration through financial and technical aid. Undeniably, the pre-accession and post-accession strategy of Brussels in Cyprus has worsened the Cyprus problem and hindered the peace process. The time has come for the international community to formally acknowledge the existence of the TRNC.
Resumo:
The World Bank proposes good governance as the strategy for correcting the ills of bad governance and for facilitating development in developing countries (Carayannis, Pirzadeh & Popescu, 2012; Hilyard & Wilks, 1998; Leftwich, 1993; World Bank, 1989). In this perspective, institutional reform and a more inclusive public policy arena are two critical strategies aimed at establishing good governance, according to the Bank and other Bretton Woods institutions. The problem is that many of these developing countries lack the institutional architecture that such measures presuppose. This thesis studies and explains how a developing state, the Commonwealth of Dominica, embarked on a bill aimed at integrity in public office. This law, the Integrity in Public Office Act (IPO), was adopted in 2003 and implemented in 2008. The thesis analyses the power relations among the dominant actors surrounding the evolution of the law, and therefore employs a combination of social network analysis techniques and qualitative research to answer the main question: why did the state develop and implement the current design of the IPO (2003)? This question is all the more significant when we consider that, contrary to the existing research on the subject, the Dominican IPO diverges considerably in structure from the ideal-type IPO. We argue that "rational" actors, aware of their structural position in a network of actors, used their power resources to shape the institution so that it serves their interests and those of their allies.
Furthermore, we hypothesize, first, that the choice of a specialized anti-corruption agency and the subsequent design of that institution reflect the preferences of the dominant actors who participated in its creation; and second, as a rival hypothesis, that the features of alternative models of public-integrity institutions reflect those of non-dominant actors. Our results are mixed. The power game was limited to a small group of dominant actors who sought to use the creation of the law to secure their legitimacy and political survival. Unsurprisingly, no actor advanced an alternative model. We therefore conclude that the law is the outcome of a partisan power game. This research responds to the scarcity of research on the design of public-integrity institutions, which largely seems to favour an organizational and structural bias. Moreover, by studying the subject from the standpoint of power relations (power itself viewed from both agency and structural angles), the thesis brings conceptual, methodological and analytical rigour to the discourse on the creation of these institutions by examining their genesis from both agency and structural perspectives. In addition, the results strengthen our ability to predict when, and with what intensity, an actor would deploy its power resources.
Resumo:
Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was innovatively used to tackle the problem of lacking sufficient information through enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk impact on the various project objectives, and the Evidential Reasoning algorithm is suggested as a novel alternative for aggregating individual assessments. A spreadsheet-based decision support system (DSS) was devised to facilitate the proposed approach. Four case studies were conducted to examine the approach's viability. Senior managers in four British construction companies tried the DSS and gave very promising feedback. The paper concludes that the proposed methodology may contribute to bridging the gap between theory and practice of construction risk assessment.
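The evidence-combination step at the heart of the DST approach above can be sketched with Dempster's classical rule of combination (the paper uses the Evidential Reasoning algorithm as an alternative aggregator; the rule below is the textbook baseline, and the two expert assessments are hypothetical):

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over frozenset focal elements
    with Dempster's rule, normalizing out the conflict mass."""
    combined, conflict = {}, 0.0
    for a, p in m1.items():
        for b, q in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + p * q
            else:
                conflict += p * q
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical expert assessments of a risk's impact level.
low, high = frozenset(["low"]), frozenset(["high"])
both = low | high  # mass on {low, high} models an incomplete assessment
m1 = {low: 0.6, both: 0.4}   # expert 1 leaves 0.4 uncommitted
m2 = {high: 0.3, both: 0.7}  # expert 2 leaves 0.7 uncommitted
print(dempster_combine(m1, m2))
```

Allowing mass on the whole frame (`both`) is what lets analysts provide incomplete assessments when information is scarce, as the abstract describes.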
Resumo:
Thesis (Ph.D.)--University of Washington, 2016-08
Resumo:
Thesis (Ph.D.)--University of Washington, 2016-08
Resumo:
In a world where students are increasingly digitally tethered to powerful, ‘always on’ mobile devices, new models of engagement and approaches to teaching and learning are required from educators. Serious Games (SG) have proven instructional potential, but there is still a lack of methodologies and tools not only for their design but also to support game analysis and assessment. This paper explores the use of SG to increase student engagement and retention. The development phase of the Circuit Warz game is presented to demonstrate how electronic engineering education can be radically reimagined to create immersive, highly engaging learning experiences that are problem-centered and pedagogically sound. The Learning Mechanics–Game Mechanics (LM-GM) framework for SG analysis is introduced, and its practical use in an educational game design scenario is shown as a case study.
Resumo:
Ecosystem service assessment and management are shaped by the scale at which they are conducted; however, there has been little systematic investigation of the scales associated with ecosystem service processes, such as production, benefit distribution, and management. We examined how social-ecological spatial scale impacts ecosystem service assessment by comparing how ecosystem service distribution, trade-offs, and bundles shift across spatial scales. We used a case study in Québec, Canada, to analyze the scales of production, consumption, and management of 12 ecosystem services and to analyze how interactions among 7 of these ecosystem services change across 3 scales of observation (1, 9, and 75 km²). We found that ecosystem service patterns and interactions were relatively robust across scales of observation; however, we identified 4 different types of scale mismatches among ecosystem service production, consumption, and management. Based on this analysis, we have proposed 4 aspects of scale that ecosystem service assessments should consider.
Resumo:
Thesis (Ph.D.)--University of Washington, 2016-08