894 results for Numerical approximation and analysis


Relevance:

100.00%

Publisher:

Abstract:

In recent years, the European Union has come to view cyber security, and, in particular, cyber crime, as one of the most relevant challenges to the completion of its Area of Freedom, Security and Justice. Given European societies' increased reliance on borderless and decentralized information technologies, this sector of activity has been identified as an easy target for actors such as organised criminals, hacktivists or terrorist networks. Such analysis has been accompanied by EU calls to step up the fight against unlawful online activities, namely through increased cooperation among law enforcement authorities (both national and extra-communitarian), the approximation of legislations, and public-private partnerships. Although EU initiatives in this field have, so far, been characterized by a lack of interconnection and an integrated strategy, there has been, since the mid-2000s, an attempt to develop a more cohesive and coordinated policy. An important part of this policy is connected to the activities of Europol, which have come to assume a central role in the coordination of intelligence gathering and analysis of cyber crime. The European Cybercrime Center (EC3), which will become operational within Europol in January 2013, is regarded, in particular, as a focal point of the EU's fight against this phenomenon. Bearing this background in mind, the present article seeks to understand the role of Europol in the development of a European policy to counter the illegal use of the internet. The article proposes to reach this objective by analyzing, through the theoretical lenses of experimental governance, the evolution of this agency's activities in the area of cyber crime and cyber security, its positioning as an expert in the field, and the consequences for the way this policy is currently developing and is expected to develop in the near future.

Relevance:

100.00%

Publisher:

Abstract:

The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time to become a more outward-looking and forward-looking document designed to promote the company and its performance to a wide range of stakeholders, rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver for this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition between the discourses is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension only being achievable through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis is based upon an investigation of the inherent inconsistencies in such corporate reports, and the analysis makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. Thus the development of a semiology of corporate reporting is one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and the implications for corporate reporting.

Relevance:

100.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. These cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model is tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
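
A minimal sketch of the risk-simulation idea described above, not the thesis's actual model: added-value productivity (added value divided by employment costs) is sampled by Monte Carlo, treating each component variable as an independent, normally distributed quantity. All figures and variable names are illustrative assumptions.

```python
# Monte Carlo risk simulation of added-value productivity (illustrative).
# Each component variable is treated as an independent normal random
# variable, as in the stochastic model's assumptions; figures are invented.
import numpy as np

rng = np.random.default_rng(0)

def simulate_productivity(n_samples=100_000):
    sales            = rng.normal(520.0, 30.0, n_samples)  # sales revenue
    bought_in        = rng.normal(310.0, 25.0, n_samples)  # materials, parts, services
    employment_costs = rng.normal(130.0, 8.0, n_samples)   # wages and salaries

    added_value = sales - bought_in
    return added_value / employment_costs                  # added-value productivity

p = simulate_productivity()
print(f"mean productivity:   {p.mean():.3f}")
print(f"5th-95th percentile: {np.percentile(p, [5, 95]).round(3)}")
```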

Relevance:

100.00%

Publisher:

Abstract:

This study considers the application of image analysis in petrography and investigates the possibilities for advancing existing techniques by introducing feature extraction and analysis capabilities of a higher level than those currently employed. The aim is to construct relevant, useful descriptions of crystal form and inter-crystal relations in polycrystalline igneous rock sections. Such descriptions cannot be derived until the 'ownership' of boundaries between adjacent crystals has been established: this is the fundamental problem of crystal boundary assignment. An analysis of this problem establishes key image features which reveal boundary ownership; a set of explicit analysis rules is presented. A petrographic image analysis scheme based on these principles is outlined and the implementation of key components of the scheme considered. An algorithm for the extraction and symbolic representation of image structural information is developed. A new multiscale analysis algorithm which produces a hierarchical description of the linear and near-linear structure on a contour is presented in detail. Novel techniques for symmetry analysis are developed. The analyses considered contribute both to the solution of the boundary assignment problem and to the construction of geologically useful descriptions of crystal form. The analysis scheme which is developed employs grouping principles such as collinearity, parallelism, symmetry and continuity, so providing a link between this study and more general work in perceptual grouping and intermediate level computer vision. Consequently, the techniques developed in this study may be expected to find wider application beyond the petrographic domain.
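
As an illustration of the kind of perceptual grouping principle mentioned above, the sketch below tests two contour segments for approximate collinearity. The segment representation and tolerances are assumptions for illustration, not the thesis's algorithm.

```python
# Illustrative collinearity test between two contour segments, one of the
# grouping principles (collinearity, parallelism, symmetry, continuity)
# listed above; tolerances are assumed values.
import math

def collinear(seg_a, seg_b, angle_tol=math.radians(5.0), offset_tol=2.0):
    """seg = ((x1, y1), (x2, y2)). True if the segments are nearly parallel
    and seg_b lies close to the infinite line through seg_a."""
    def direction(seg):
        (x1, y1), (x2, y2) = seg
        return math.atan2(y2 - y1, x2 - x1)

    # Orientation difference folded into [0, pi/2]
    da = abs(direction(seg_a) - direction(seg_b)) % math.pi
    if min(da, math.pi - da) > angle_tol:
        return False

    # Perpendicular distance from seg_b's midpoint to the line through seg_a
    (x1, y1), (x2, y2) = seg_a
    (x3, y3), (x4, y4) = seg_b
    mx, my = (x3 + x4) / 2.0, (y3 + y4) / 2.0
    dist = abs((y2 - y1) * mx - (x2 - x1) * my + x2 * y1 - y2 * x1) / math.hypot(x2 - x1, y2 - y1)
    return dist <= offset_tol

print(collinear(((0, 0), (10, 0)), ((12, 0.5), (20, 0.3))))  # True: same line, small gap
print(collinear(((0, 0), (10, 0)), ((12, 6.0), (20, 6.2))))  # False: parallel but offset
```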

Relevance:

100.00%

Publisher:

Abstract:

This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed using a statistical analysis of a simple branch and bound algorithm. The algorithm can be formulated in the large-system limit as a branching process, for which critical properties can be analysed. Far from the critical point a set of differential equations may be used to model the process, and these are solved by numerical integration and exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code division multiple access method is considered, the optimal detection properties are examined in the typical case by use of the replica method, and these are compared to the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models. The model includes couplings due to both dense and sparse topologies simultaneously. The new type of code is shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
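
For illustration, here is a minimal backtracking branch-and-bound decision procedure for Exact Cover, of the general kind whose branching behaviour can be analysed statistically; the instance is a standard textbook example and the pruning rule is deliberately simple, so this is not the thesis's algorithm.

```python
# Backtracking decision procedure for Exact Cover (illustrative): branch on an
# uncovered element, prune subsets that would re-cover an already covered one.
def exact_cover(universe, subsets, chosen=None):
    """Return indices of subsets forming an exact cover of `universe`,
    or None if the instance is unsatisfiable."""
    if chosen is None:
        chosen = []
    if not universe:
        return chosen                               # every element covered exactly once
    element = next(iter(universe))                  # branch on some uncovered element
    for i, s in enumerate(subsets):
        if element in s and s <= universe:          # s must not overlap covered elements
            result = exact_cover(universe - s, subsets, chosen + [i])
            if result is not None:
                return result
    return None                                     # dead branch: backtrack

U = {1, 2, 3, 4, 5, 6, 7}
S = [{1, 4, 7}, {1, 4}, {4, 5, 7}, {3, 5, 6}, {2, 3, 6, 7}, {2, 7}]
print(exact_cover(U, S))   # a valid cover, e.g. subsets {1,4}, {2,7}, {3,5,6}
```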

Relevance:

100.00%

Publisher:

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature.
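
As a point of reference, the sketch below solves the crisp additive DEA model that the proposed fuzzy model generalises; in the α-level approach the crisp input and output values would be replaced by interval bounds at each α-cut. The data are illustrative, not from the paper.

```python
# Crisp additive DEA model (maximise the total input and output slacks of the
# DMU under evaluation); data are illustrative, and in the fuzzy alpha-level
# version the crisp X and Y entries become interval bounds at each alpha-cut.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 7.0, 8.0, 4.0],      # inputs,  shape (m inputs, n DMUs)
              [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])     # outputs, shape (r outputs, n DMUs)
m, n = X.shape
r = Y.shape[0]

def additive_slack(o):
    """Total slack of DMU o; zero means the DMU is (additively) efficient."""
    # Variables: lambda_1..lambda_n, s_minus_1..s_minus_m, s_plus_1..s_plus_r
    c = np.concatenate([np.zeros(n), -np.ones(m + r)])       # minimise -(slack sum)
    A_eq = np.zeros((m + r + 1, n + m + r))
    A_eq[:m, :n] = X
    A_eq[:m, n:n + m] = np.eye(m)                            # X lambda + s_minus = x_o
    A_eq[m:m + r, :n] = Y
    A_eq[m:m + r, n + m:] = -np.eye(r)                       # Y lambda - s_plus = y_o
    A_eq[-1, :n] = 1.0                                       # sum(lambda) = 1
    b_eq = np.concatenate([X[:, o], Y[:, o], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun

for o in range(n):
    print(f"DMU {o}: total slack = {additive_slack(o):.3f}")
```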

Relevance:

100.00%

Publisher:

Abstract:

A recent novel approach to the visualisation and analysis of datasets, and one which is particularly applicable to those of a high dimension, is discussed in the context of real applications. A feed-forward neural network is utilised to effect a topographic, structure-preserving, dimension-reducing transformation of the data, with an additional facility to incorporate different degrees of associated subjective information. The properties of this transformation are illustrated on synthetic and real datasets, including the 1992 UK Research Assessment Exercise for funding in higher education. The method is compared and contrasted to established techniques for feature extraction, and related to topographic mappings, the Sammon projection and the statistical field of multidimensional scaling.
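
For context, the sketch below computes Sammon's stress, one of the yardsticks against which such structure-preserving projections are compared; the data and the naive projection are illustrative, not the mapping described above.

```python
# Sammon's stress: how well pairwise distances in the original space are
# preserved by a low-dimensional projection (0 = perfectly preserved).
import numpy as np
from scipy.spatial.distance import pdist

def sammon_stress(X_high, X_low, eps=1e-12):
    d_star = pdist(X_high)                 # pairwise distances, original space
    d = pdist(X_low)                       # pairwise distances, projected space
    d_star = np.maximum(d_star, eps)       # guard against coincident points
    return np.sum((d_star - d) ** 2 / d_star) / np.sum(d_star)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10))              # hypothetical 10-dimensional data
P = X[:, :2]                               # naive projection onto two coordinates
print(f"Sammon stress of the naive projection: {sammon_stress(X, P):.4f}")
```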

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Bipolar disorder requires long-term treatment, but non-adherence is a common problem. Antipsychotic long-acting injections (LAIs) have been suggested to improve adherence, but none is licensed in the UK for bipolar disorder. However, the use of second-generation antipsychotic (SGA) LAIs in bipolar disorder is not uncommon, although there is a lack of systematic reviews in this area. This study aims to systematically review the safety and efficacy of SGA LAIs in the maintenance treatment of bipolar disorder. METHODS AND ANALYSIS: The protocol is based on the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) and will include only randomised controlled trials comparing SGA LAIs in bipolar disorder. PubMed, EMBASE, CINAHL, Cochrane Library (CENTRAL), PsycINFO, LiLACS and http://www.clinicaltrials.gov will be searched, with no language restriction, from 2000 to January 2016, as the first SGA LAIs came to the market after 2000. Manufacturers of SGA LAIs will also be contacted. The primary efficacy outcome is relapse rate, delayed time to relapse or reduction in hospitalisation, and the primary safety outcomes are drop-out rates, all-cause discontinuation and discontinuation due to adverse events. Qualitative reporting of evidence will be based on the 21 items listed in the standards for reporting qualitative research (SRQR), focusing on study quality (assessed using the Jadad score, allocation concealment and data analysis), risk of bias and effect size. Publication bias will be assessed using funnel plots. If sufficient data are available, meta-analysis will be performed with relative risk as the primary effect size, presented with 95% CIs. Sensitivity analysis, conditional on the number of studies and sample size, will be carried out on manic versus depressive symptoms and monotherapy versus adjunctive therapy.
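
As an illustration of the planned primary effect-size calculation, the sketch below pools a relative risk across studies by fixed-effect inverse-variance weighting on the log scale and reports a 95% CI; the study counts are hypothetical, and the protocol may of course specify a different pooling model.

```python
# Fixed-effect (inverse-variance) pooled relative risk with a 95% CI,
# computed on the log scale; the study counts below are hypothetical.
import math

# (events_treatment, n_treatment, events_control, n_control) for each study
studies = [(12, 100, 20, 100),
           (8,   80, 15,  85),
           (10, 120, 18, 115)]

weights, log_rrs = [], []
for a, n1, c, n2 in studies:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of the log relative risk
    weights.append(1 / var)
    log_rrs.append(log_rr)

pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```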

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a detailed numerical analysis, fabrication method and experimental investigation of 45º tilted fiber gratings (45º-TFGs) and excessively tilted fiber gratings (Ex-TFGs), and their applications in fiber laser and sensing systems. One of the most significant contributions of the work reported in this thesis is that 45º-TFGs with high polarization extinction ratio (PER) have been fabricated in single mode telecom and polarization maintaining (PM) fibers, with spectral responses covering three prominent optical communication wavelength ranges centred at 1060nm, 1310nm and 1550nm. The PERs achieved for the 45º-TFGs reach, and in some cases exceed, 35-50dB, matching or surpassing many commercial in-fiber polarizers. It is proposed that 45º-TFGs of high PER can be used as ideal in-fiber polarizers for a wide range of fiber systems and applications. In addition, in-depth theoretical models and analysis have been developed, and systematic experimental evaluation has been conducted, producing results in excellent agreement with the theoretical modeling. Another important outcome of the research is the proposal and demonstration of all-fiber Lyot filters (AFLFs), implemented by utilizing two (for a single-stage type) or more (for multi-stage) 45º-TFGs in a PM fiber cavity structure. Detailed theoretical analysis and modelling of such AFLFs have also been carried out, giving design guidance for practical implementation. The unique functional advantages of 45º-TFG based AFLFs have been revealed, showing high-finesse multi-wavelength transmission of single polarization and a wide range of tuneability. The temperature tuning results show that the AFLFs have 60 times higher thermal sensitivity than normal FBGs, permitting a thermal tuning rate of ~8nm/10ºC. By using an intra-cavity AFLF, an all-fiber soliton mode-locking laser with almost total suppression of soliton sidebands, single polarization output and single/multi-wavelength switchable operation has been demonstrated. The final significant contribution is the theoretical analysis and experimental verification of the design, fabrication and sensing application of Ex-TFGs. The Ex-TFG sensitivity model for the surrounding medium refractive index (SRI) has been developed for the first time, and the factors that affect the thermal and SRI sensitivity in relation to the wavelength range, tilt angle and cladding size have been investigated. As a practical SRI sensor, an 81º-TFG UV-inscribed in a fiber with a small (40μm) cladding radius has shown an SRI sensitivity of up to 1180nm/RIU in the refractive index range around 1.345. Finally, to ensure single polarization detection in such an SRI sensor, a hybrid configuration with a 45º-TFG and an 81º-TFG UV-inscribed closely on the same piece of fiber has been demonstrated as a more advanced SRI sensing system.
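
For orientation, the sketch below evaluates the textbook transmission of a single-stage Lyot filter, a birefringent element between two aligned in-line polarisers, T(λ) = cos²(πBL/λ); the birefringence and fibre length are assumed values, not those used in the thesis.

```python
# Transmission of a single-stage Lyot filter: a birefringent PM-fibre section
# of length L and birefringence B between two aligned in-line polarisers,
# T(wavelength) = cos^2(pi * B * L / wavelength). B and L are assumed values.
import numpy as np

B = 5e-4                                            # assumed PM-fibre birefringence
L = 2.0                                             # assumed fibre length (m)
wavelength = np.linspace(1540e-9, 1560e-9, 2001)    # wavelength grid (m)

T = np.cos(np.pi * B * L / wavelength) ** 2         # comb-like transmission spectrum

fsr = (1550e-9) ** 2 / (B * L)                      # approximate free spectral range
print(f"approximate FSR near 1550 nm: {fsr * 1e9:.2f} nm")
print(f"transmission range on grid:   {T.min():.3f} to {T.max():.3f}")
```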

Relevance:

100.00%

Publisher:

Abstract:

* Work supported by the Lithuanian State Science and Studies Foundation.

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 26A33; 33E12, 33E30, 44A15, 45J05

Relevance:

100.00%

Publisher:

Abstract:

Fuzzy data envelopment analysis (DEA) models emerge as another class of DEA models to account for imprecise inputs and outputs of decision-making units (DMUs). Although several approaches for solving fuzzy DEA models have been developed, they have some drawbacks, ranging from the inability to provide satisfactory discrimination power to simplistic numerical examples that handle only triangular or symmetrical fuzzy numbers. To address these drawbacks, this paper proposes using the concept of expected value in a generalized DEA (GDEA) model. This allows the unification of three models (fuzzy expected CCR, fuzzy expected BCC and fuzzy expected FDH models) and enables them to handle both symmetrical and asymmetrical fuzzy numbers. We also explore the role of the fuzzy GDEA model as a ranking method and compare it to existing super-efficiency evaluation models. Our proposed model is always feasible, while infeasibility problems remain in certain cases under existing super-efficiency models. In order to illustrate the performance of the proposed method, it is first tested using two established numerical examples and compared with the results obtained from alternative methods. A third example on energy dependency among 23 European Union (EU) member countries is then used to validate and demonstrate the efficacy of our approach under asymmetrical fuzzy numbers.
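
For illustration, the sketch below applies one standard expected-value operator (the midpoint of the expected interval) to triangular and trapezoidal fuzzy numbers; the paper's own definition may differ in detail, but the role is the same: reducing a fuzzy quantity to a crisp value before the GDEA programme is solved, for symmetrical and asymmetrical shapes alike.

```python
# Expected value of trapezoidal / triangular fuzzy numbers via the midpoint
# of the expected interval (one standard definition, assumed here); it turns
# a fuzzy quantity into a crisp one and works for asymmetrical shapes too.
def expected_value_trapezoidal(a, b, c, d):
    """Trapezoidal fuzzy number with support [a, d] and core [b, c]."""
    return (a + b + c + d) / 4.0

def expected_value_triangular(a, b, c):
    """Triangular fuzzy number: a trapezoid whose core degenerates to b."""
    return expected_value_trapezoidal(a, b, b, c)

print(expected_value_triangular(2.0, 4.0, 6.0))   # symmetrical  -> 4.0
print(expected_value_triangular(2.0, 4.0, 9.0))   # asymmetrical -> 4.75
```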

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: primary: 60J80, 60J85, secondary: 62M09, 92D40

Relevance:

100.00%

Publisher:

Abstract:

Spamming has been a widespread problem for social networks. In recent years there has been increasing interest in the analysis of anti-spamming for microblogs, such as Twitter. In this paper we present systematic research on the analysis of spamming on the Sina Weibo platform, which is currently a dominant microblogging service provider in China. Our research objectives are to understand the specific spamming behaviors in Sina Weibo and to find approaches to identify and block spammers in Sina Weibo based on spamming behavior classifiers. To start the analysis of spamming behaviors, we devise several effective methods to collect a large set of spammer samples, including the use of proactive honeypots and crawlers, keyword-based searching, and buying spammer samples directly from online merchants. We processed the database associated with these spammer samples and, interestingly, found three representative spamming behaviors: aggressive advertising, repeated duplicate reposting and aggressive following. We extract various features and compare the behaviors of spammers and legitimate users with regard to these features. It is found that spamming behaviors and normal behaviors have distinct characteristics. Based on these findings we design an automatic online spammer identification system. Tests with real data demonstrate that the system can effectively detect spamming behaviors and identify spammers in Sina Weibo.
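
As an illustration of the classifier-based identification step, the sketch below trains a small behaviour-feature classifier; the feature set and the labelled accounts are hypothetical stand-ins for the features extracted in the paper, not its actual data.

```python
# Toy behaviour-feature classifier for spammer identification; the features
# (following/follower ratio, duplicate-repost ratio, ad-keyword frequency,
# posts per day) and the labelled accounts are hypothetical.
from sklearn.ensemble import RandomForestClassifier

X_train = [
    [25.0, 0.80, 0.60, 40.0],   # spammer-like account
    [18.0, 0.65, 0.45, 55.0],   # spammer-like account
    [0.8,  0.05, 0.01,  3.0],   # legitimate-like account
    [1.2,  0.02, 0.00,  5.0],   # legitimate-like account
]
y_train = [1, 1, 0, 0]          # 1 = spammer, 0 = legitimate

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

unknown_account = [[12.0, 0.70, 0.30, 30.0]]
print("spammer" if clf.predict(unknown_account)[0] == 1 else "legitimate")
```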

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 65M06, 65M12.