Abstract:
The EU is considered to be one of the main proponents of what has been called the deep trade agenda, that is, the push for further trade liberalization with an emphasis on the removal of domestic non-tariff regulatory measures affecting trade, as opposed to the traditional focus on the removal of trade barriers at borders. As negotiations on the Doha Development Round have stalled, the EU has attempted to achieve these aims by entering into comprehensive free trade agreements (FTAs) that are not limited exclusively to tariffs but also extend to non-tariff barriers, including services, intellectual property rights (IPRs), competition, and investment. These FTAs place great emphasis on regulatory convergence as a means to secure greater market openings. The article examines the EU's current external trade policy in the area of IP, particularly its attempts to promote its own regulatory model for the protection of IP rights through trade agreements. By looking at the IP enforcement provisions of such agreements, the article also examines how the divisive issues that are currently hindering the progress of negotiations at the WTO level, including the demands from developing countries to maintain a degree of autonomy in the area of IP regulation as well as the need to balance IP protection with human rights protection, are being dealt with in recent EU FTAs.
Abstract:
Mobile malware has continued to grow at an alarming rate despite ongoing mitigation efforts. This growth has been much more prevalent on Android, an open platform that is rapidly overtaking competing platforms in the mobile smart-device market. Recently, a new generation of Android malware families has emerged with advanced evasion capabilities that make them much more difficult to detect using conventional methods. This paper proposes and investigates a parallel machine-learning-based classification approach for early detection of Android malware. Using real malware samples and benign applications, a composite classification model is developed from a parallel combination of heterogeneous classifiers. The empirical evaluation of the model under different combination schemes demonstrates its efficacy and potential to improve detection accuracy. More importantly, by utilizing several classifiers with diverse characteristics, their strengths can be harnessed not only for enhanced Android malware detection but also for quicker white-box analysis by means of the more interpretable constituent classifiers.
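To make the parallel combination idea concrete, here is a minimal, hypothetical sketch using scikit-learn: three heterogeneous base classifiers vote in parallel on binary static-analysis features. The synthetic data, feature dimensions and classifier choices are illustrative assumptions, not the paper's actual configuration.

```python
# Hedged sketch of a parallel heterogeneous classifier combination for
# Android malware detection. X/y are placeholders standing in for
# binary static-analysis features and malware/benign labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 50)).astype(float)  # placeholder features
y = rng.integers(0, 2, size=1000)                      # 1 = malware, 0 = benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Each base learner is trained independently; the composite decision is
# taken by majority (hard) voting across their parallel predictions.
composite = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=8)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("nb", BernoulliNB()),
    ],
    voting="hard",
)
composite.fit(X_tr, y_tr)
print("composite accuracy:", composite.score(X_te, y_te))
```

The tree and naive Bayes members also keep the combination partly interpretable, in the spirit of the white-box analysis the abstract mentions.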
Abstract:
Developed countries, led by the EU and the US, have consistently called for ‘deeper integration’ over the course of the past three decades, i.e., the convergence of ‘behind-the-border’ or domestic policies and rules in areas such as services, competition, public procurement, intellectual property (“IP”) and so forth. Following the collapse of the Doha Development Round, the EU and the US have pursued this push for deeper integration by entering into deep and comprehensive free trade agreements (“DCFTAs”) that are comprehensive insofar as they are not limited to tariffs but extend to regulatory trade barriers. More recently, the EU and the US launched negotiations on a Transatlantic Trade and Investment Partnership (“TTIP”) and a Trade in Services Agreement (“TISA”), which put tackling barriers resulting from divergences in domestic regulation in the area of services at the very top of the agenda. Should these agreements come to pass, they may well set the template for the rules of international trade and define the core features of domestic services market regulation. This article examines the regulatory disciplines in the area of services included in existing EU and US DCFTAs from a comparative perspective in order to delineate possible similarities and divergences and to assess the extent to which these DCFTAs can shed some light on the possible outcome and limitations of future trade negotiations in services. It also discusses the potential impact of such negotiations on developing countries and, more generally, on the multilateral process.
Abstract:
The plain fatigue and fretting fatigue tests of Ti-1023 titanium alloy were performed using a high-frequency push-pull fatigue testing machine. σmax versus number-of-cycles-to-failure curves were obtained under both conditions for comparative analysis of the fretting effect on the fatigue performance of the titanium alloy. Meanwhile, by analyzing the fractures from plain fatigue and fretting fatigue, together with the fretting scar and the fretting debris observed by scanning electron microscopy (SEM), the mechanism of fretting fatigue failure of Ti-1023 titanium alloy is discussed. The fretting fatigue strength of Ti-1023 titanium alloy is 175 MPa under 10 MPa contact pressure, which is 21% of the plain fatigue strength (836 MPa). Under fretting conditions, fatigue fracture failure of the Ti-1023 titanium alloy occurs at a shorter fatigue life. In the σmax versus number-of-cycles-to-failure curves, data points in the range of 10⁶–10⁷ cycles under the plain fatigue condition moved to the range of 10⁵–10⁶ cycles under the fretting fatigue condition. The integrity of the fatigue specimen surface was seriously damaged by fretting. With alternating stress loaded on the specimen, stress concentrated on the surface of the fretting area, which brought forward the initiation and propagation of cracks.
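As a quick check of the quoted figures, the 21% knock-down factor follows directly from the two reported strengths (a worked ratio using only numbers from the abstract):

```latex
\frac{\sigma_{\text{fretting}}}{\sigma_{\text{plain}}}
  = \frac{175\ \text{MPa}}{836\ \text{MPa}} \approx 0.21
```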
Abstract:
Background: Clinical Commissioning Groups (CCGs) are mandated to use research evidence effectively to ensure optimum use of resources by the National Health Service (NHS), both in accelerating innovation and in stopping the use of less effective practices and models of service delivery. We intend to evaluate whether access to a demand-led evidence service improves uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives.
Methods/design: This is a controlled before-and-after study involving CCGs in the North of England. Participating CCGs will receive one of three interventions to support the use of research evidence in their decision-making: 1) consulting plus responsive push of tailored evidence; 2) consulting plus an unsolicited push of non-tailored evidence; or 3) a standard service of unsolicited push of non-tailored evidence. Our primary outcome will be change at 12 months from baseline in a CCG's ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes will measure individual clinical leads' and managers' intentions to use research evidence in decision-making. Documentary evidence of the use of the outputs of the service will be sought. A process evaluation will evaluate the nature and success of the interactions both within the sites and between commissioners and researchers delivering the service.
Discussion: The proposed research will generate new knowledge of direct relevance and value to the NHS. The findings will help to clarify which elements of the service are of value in promoting the use of research evidence. Those involved in NHS commissioning will be able to use the results to inform how best to build the infrastructure they need to acquire, assess, adapt and apply research evidence to support decision-making and to fulfil their statutory duties under the Health and Social Care Act.
Abstract:
This article tackles the abundance of inconsistent terminologies that surround the discourse on practice and research. The text builds on recent debates on creative practice and education, sparked through the EU-funded project SHARE. I argue that a shift in contemporary continental philosophy in the 1970s, which nudged the body into a more central position, allowed creative practice, and with it ‘embodied knowing’, to slowly push open the doors of the academies. I will show that practice today is already well embedded in some UK institutions, and I put forward that rather than thinking of an apologetic ‘Practice as...’ or ‘Performance as...’, we should refer more resolutely to what I here term ‘Practice Research’. I demystify notions of validation of creative practice by re-emphasising the artistic qualities of ‘integrity, sincerity and authenticity’, borrowed from the 2013 BBC Reith lecturer and artist/potter Grayson Perry.
Abstract:
Mobile malware has been growing in scale and complexity as smartphone usage continues to rise. Android has surpassed other mobile platforms as the most popular, whilst also witnessing a dramatic increase in malware targeting it. A worrying emerging trend is the increasing sophistication of Android malware in evading detection by traditional signature-based scanners. As such, Android app marketplaces remain at risk of hosting malicious apps that could evade detection before being downloaded by unsuspecting users. Hence, in this paper we present an effective approach to alleviate this problem based on Bayesian classification models obtained from static code analysis. The models are built from a collection of code and app characteristics that provide indicators of potential malicious activities. The models are evaluated with real malware samples in the wild, and results of experiments are presented to demonstrate the effectiveness of the proposed approach.
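A minimal sketch of the kind of Bayesian model the abstract describes, assuming binary indicators (e.g. suspicious API calls or permissions) extracted by static analysis; the data and feature count are placeholders, not the paper's dataset.

```python
# Hedged sketch: naive Bayes over binary static-analysis indicators.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(500, 20))  # one row per app, 0/1 indicators
y = rng.integers(0, 2, size=500)        # 1 = malware, 0 = benign

model = BernoulliNB()  # Bayesian classifier suited to binary features
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```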
Abstract:
This article examines the nature of gender politics in Northern Ireland since the 1998 Good Friday/Belfast Agreement. Taking gender justice as a normative democratic framework, the article argues that despite the promise of women's equal participation in public and political life written into the Agreement, parties have delivered varied responses to integrating women, women's interests and perspectives into politics and policy platforms. This contrasts with general patterns supporting women's increased participation in social and political life. The article discusses women's descriptive and substantive representation through electoral outcomes and party manifestos, using the demands of successive women's manifestos as a benchmark. It concludes that while parties have given less recognition and inclusion to women than one might have expected in a new political context, the push for democratic accountability will ensure that gender politics will continue to have a place on the political agenda for some time to come.
Abstract:
Recent studies predict elevated and accelerating rates of species extinctions over the 21st century, due to climate change and habitat loss. Considering that such primary species loss may initiate cascades of secondary extinctions and push systems towards critical tipping points, we urgently need to increase our understanding of whether certain sequences of species extinctions can be expected to be more devastating than others. Most theoretical studies addressing this question have used a topological (non-dynamical) approach to analyse the probability that food webs will collapse below a fixed threshold value in species richness when subjected to different sequences of species loss. Typically, these studies have considered neither the possibility of dynamical responses of species, nor that conclusions may depend on the value of the collapse threshold. Here we analyse how sensitive conclusions on the importance of different species are to the threshold value of food web collapse. Using dynamical simulations, where we expose model food webs to a range of extinction sequences, we evaluate the reliability of the most frequently used index, R₅₀, as a measure of food web robustness. In general, we find that R₅₀ is a reliable measure and that identification of destructive deletion sequences is fairly robust within a moderate range of collapse thresholds. At the same time, however, focusing on R₅₀ only hides a lot of interesting information on the disassembly process and can, in some cases, lead to incorrect conclusions on the relative importance of species in food webs.
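The R₅₀ index can be stated compactly in code. Below is a hedged, topological sketch (the simpler of the two approaches the abstract contrasts): R₅₀ is the fraction of primary removals needed before total richness, including secondary extinctions, falls to half the original. The random web, the basal-species rule and the removal order are illustrative assumptions.

```python
# Hedged sketch of the R50 food-web robustness index (topological version).
import random
import networkx as nx

def r50(web: nx.DiGraph, removal_order, basal):
    """Fraction of primary removals that halves total species richness.

    Edges point from consumer to resource, so a non-basal species with
    out-degree 0 has lost all its prey and goes secondarily extinct.
    """
    g = web.copy()
    n0 = g.number_of_nodes()
    removed = 0
    for sp in removal_order:
        if sp not in g:          # already lost to a secondary extinction
            continue
        g.remove_node(sp)
        removed += 1
        while True:              # propagate secondary extinctions
            doomed = [s for s in g if s not in basal and g.out_degree(s) == 0]
            if not doomed:
                break
            g.remove_nodes_from(doomed)
        if g.number_of_nodes() <= n0 / 2:
            return removed / n0
    return removed / n0

random.seed(0)
web = nx.gnp_random_graph(30, 0.15, seed=0, directed=True)  # placeholder web
basal = {n for n in web if web.out_degree(n) == 0}          # no prey at start
order = random.sample(list(web.nodes), web.number_of_nodes())
print("R50 =", r50(web, order, basal))
```

The paper's point is precisely that the fixed 50% threshold, and the purely topological treatment, can hide dynamical effects; the sketch only pins down the index's definition.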
Abstract:
With over 50 billion downloads and more than 1.3 million apps in Google’s official market, Android has continued to gain popularity amongst smartphone users worldwide. At the same time there has been a rise in malware targeting the platform, with more recent strains employing highly sophisticated detection-avoidance techniques. As traditional signature-based methods become less potent in detecting unknown malware, alternatives are needed for timely zero-day discovery. Thus, this paper proposes an approach that utilizes ensemble learning for Android malware detection. It combines the advantages of static analysis with the efficiency and performance of ensemble machine learning to improve Android malware detection accuracy. The machine learning models are built using a large repository of malware samples and benign apps from a leading antivirus vendor. Experimental results and analysis presented show that the proposed method, which uses a large feature space to leverage the power of ensemble learning, is capable of 97.3% to 99% detection accuracy with very low false positive rates.
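As an illustration of the ensemble-learning setup, here is a hedged sketch with a random forest over a wide binary feature space; the synthetic data stands in for permissions, API calls and similar static features, and the model choice is an assumption rather than the paper's exact ensemble.

```python
# Hedged sketch: ensemble learning over a large static feature space.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(2000, 175))  # wide binary feature space
y = rng.integers(0, 2, size=2000)         # 1 = malware, 0 = benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=3)
forest = RandomForestClassifier(n_estimators=200, random_state=3)
forest.fit(X_tr, y_tr)
print(classification_report(y_te, forest.predict(X_te)))
```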
Abstract:
The battle to mitigate Android malware has become more critical with the emergence of new strains incorporating increasingly sophisticated evasion techniques, in turn necessitating more advanced detection capabilities. Hence, in this paper we propose and evaluate a machine learning approach based on eigenspace analysis for Android malware detection, using features derived from static analysis characterization of Android applications. Empirical evaluation with a dataset of real malware and benign samples shows that a detection rate of over 96% with a very low false positive rate is achievable using the proposed method.
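One common reading of eigenspace analysis is subspace modelling: fit a low-dimensional PCA subspace to each class and label a sample by whichever subspace reconstructs it better. The sketch below follows that reading under stated assumptions (synthetic features, arbitrary component count); it is not necessarily the authors' exact formulation.

```python
# Hedged sketch: eigenspace (PCA subspace) detection by reconstruction error.
import numpy as np
from sklearn.decomposition import PCA

def recon_error(pca: PCA, x: np.ndarray) -> float:
    """Distance between x and its reconstruction from the PCA subspace."""
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float(np.linalg.norm(x - x_hat))

rng = np.random.default_rng(4)
X_mal = rng.normal(1.0, 1.0, size=(300, 40))   # placeholder malware features
X_ben = rng.normal(-1.0, 1.0, size=(300, 40))  # placeholder benign features

pca_mal = PCA(n_components=10).fit(X_mal)      # malware eigenspace
pca_ben = PCA(n_components=10).fit(X_ben)      # benign eigenspace

x_new = rng.normal(1.0, 1.0, size=40)
label = ("malware" if recon_error(pca_mal, x_new) < recon_error(pca_ben, x_new)
         else "benign")
print(label)
```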
Abstract:
Conventional practice in regional geochemistry includes, as a final step of any geochemical campaign, the generation of a series of maps to show the spatial distribution of each of the components considered. Such maps, though necessary, do not comply with the compositional, relative nature of the data, which unfortunately makes any conclusion based on them sensitive to spurious correlation problems. This is one of the reasons why these maps are never interpreted in isolation. This contribution aims at gathering a series of statistical methods to produce individual maps of multiplicative combinations of components (logcontrasts), much in the flavor of equilibrium constants, which are designed on purpose to capture certain aspects of the data.
We distinguish between supervised and unsupervised methods, where the former require an external, non-compositional variable (besides the compositional geochemical information) available in an analogous training set. This external variable can be a quantity (soil density, collocated magnetics, collocated ratio of Th/U spectral gamma counts, proportion of clay particle fraction, etc.) or a category (rock type, land use type, etc.). In the supervised methods, a regression-like model between the external variable and the geochemical composition is derived in the training set, and then this model is mapped onto the whole region. This case is illustrated with the Tellus dataset, covering Northern Ireland at a density of one soil sample per 2 square km, where we map the presence of blanket peat and the underlying geology. The unsupervised methods considered include principal components and principal balances (Pawlowsky-Glahn et al., CoDaWork2013), i.e. logcontrasts of the data that are devised to capture very large variability or else be quasi-constant. Using the Tellus dataset again, it is found that geological features are highlighted by the quasi-constant ratios Hf/Nb and their ratio against SiO2; Rb/K2O and Zr/Na2O and the balance between these two groups of two variables; the balance of Al2O3 and TiO2 vs. MgO; or the balance of Cr, Ni and Co vs. V and Fe2O3. The largest variability appears to be related to the presence/absence of peat.
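A logcontrast balance of the kind mapped in the abstract can be computed directly. Below is a hedged sketch of the standard isometric log-ratio balance between two groups of parts; the composition matrix and column grouping (e.g. Cr, Ni, Co vs. V) are placeholders, not the Tellus variables themselves.

```python
# Hedged sketch: isometric log-ratio balance between two groups of parts.
import numpy as np

def balance(comp, group_a, group_b):
    """b = sqrt(r*s/(r+s)) * ln(g(A)/g(B)), with g the geometric mean.

    comp: (n_samples, n_parts) strictly positive composition matrix
    group_a, group_b: disjoint column-index lists for the two part groups
    """
    r, s = len(group_a), len(group_b)
    coef = np.sqrt(r * s / (r + s))
    log_comp = np.log(comp)
    return coef * (log_comp[:, group_a].mean(axis=1)
                   - log_comp[:, group_b].mean(axis=1))

rng = np.random.default_rng(5)
comp = rng.uniform(0.1, 10.0, size=(100, 4))  # e.g. [Cr, Ni, Co, V] placeholders
print(balance(comp, group_a=[0, 1, 2], group_b=[3])[:5])
```

Because a balance depends only on ratios of parts, mapping it respects the relative nature of the data and avoids the spurious-correlation problem that affects maps of raw concentrations.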