937 results for Optimal matching analysis.
Abstract:
SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts web form inputs and the query strings used in web requests with malicious intent to compromise the confidential data stored in the organisation's back-end database. The database is the most valuable data source, and intruders are therefore unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is consequently a need for an automated, scalable methodology for pre-processing SQLIA features into a form fit for a supervised learning model. However, obtaining a ready-made, scalable, feature-engineered dataset of numerical attributes with which to train Artificial Neural Network (ANN) and Machine Learning (ML) models is a known obstacle to applying artificial intelligence effectively against ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (of both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML-based SQLIA detection and prevention model. In the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA), combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be blocked from reaching the target back-end database.
This paper evaluates the performance metrics of a dataset obtained by this numerical feature-encoding ontology in Microsoft Azure Machine Learning (MAML) Studio, using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology forms the subject of the empirical evaluation.
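As a minimal sketch of the kind of numerical feature encoding the abstract describes (the token classes and regular expressions below are illustrative stand-ins, not the paper's actual ontology or NFA):

```python
import re

# Hypothetical encoding ontology: each suspicious SQL token class maps to
# one numeric attribute (an illustrative stand-in for the paper's ontology).
TOKEN_CLASSES = {
    "quote":     r"['\"]",
    "comment":   r"(--|#|/\*)",
    "union":     r"\bunion\b",
    "tautology": r"\bor\b\s+\d+\s*=\s*\d+",
    "keyword":   r"\b(select|insert|update|delete|drop)\b",
}

def encode_request(param: str) -> list:
    """Encode one request parameter as per-class token counts."""
    lowered = param.lower()
    return [len(re.findall(p, lowered)) for p in TOKEN_CLASSES.values()]

legit = encode_request("alice")            # all-zero feature vector
attack = encode_request("' OR 1=1 -- ")    # quote, comment, tautology fire
```

Feature vectors of this shape could then be labelled and fed to a binary classifier such as a two-class SVM.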
Abstract:
The graph Laplacian operator is widely studied in spectral graph theory, largely due to its importance in modern data analysis. Recently, the Fourier transform and other time-frequency operators have been defined on graphs using Laplacian eigenvalues and eigenvectors. We extend these results and prove that the translation operator to the i-th node is invertible if and only if all eigenvectors are nonzero on the i-th node. Because of this dependency on the support of eigenvectors, we study the characteristic set of Laplacian eigenvectors. We prove that the Fiedler vector of a planar graph cannot vanish on large neighborhoods and then explicitly construct a family of non-planar graphs that do exhibit this property. We then prove original results in modern analysis on graphs. We extend results on spectral graph wavelets to create vertex-dynamic spectral graph wavelets whose support depends on both scale and translation parameters. We prove that Spielman's Twice-Ramanujan graph sparsifying algorithm cannot outperform his conjectured optimal sparsification constant. Finally, we present numerical results on graph conditioning, in which edges of a graph are rescaled to best approximate the complete graph and reduce average commute time.
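The invertibility criterion can be checked numerically. The sketch below (assuming NumPy) builds the Laplacian of a star graph; by symmetry its eigenvalue-1 eigenvectors vanish on the center node, so translation to the center fails the criterion:

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(adj.sum(axis=1)) - adj

# Star graph K_{1,3}: center node 0 joined to leaves 1..3.
adj = np.zeros((4, 4))
adj[0, 1:] = adj[1:, 0] = 1.0

eigvals, eigvecs = np.linalg.eigh(laplacian(adj))  # eigenvectors in columns

def translation_invertible(eigvecs, i, tol=1e-10):
    """Invertible iff every Laplacian eigenvector is nonzero on node i."""
    return bool(np.all(np.abs(eigvecs[i, :]) > tol))
```

For this graph, `translation_invertible(eigvecs, 0)` is False: the eigenvalue-1 eigenspace is supported entirely on the leaves.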
Abstract:
Cultivation of chilling-tolerant ornamental crops at lower temperature could reduce the energy demands of heated greenhouses. To provide a better understanding of how sub-optimal temperatures (12 °C vs. 16 °C) affect growth of the sensitive Petunia hybrida cultivar 'SweetSunshine Williams', the transcriptome, carbohydrate metabolism, and phytohormone homeostasis were monitored in aerial plant parts over 4 weeks using a microarray, enzymatic assays and GC-MS/MS. The data revealed three consecutive phases of chilling response. The first days were marked by a strong accumulation of sugars, particularly in source leaves, preferential up-regulation of genes in the same tissue, and down-regulation of several genes in the shoot apex, especially those involved in the abiotic stress response. The midterm phase featured a partial normalization of carbohydrate levels and gene expression. After 3 weeks of chilling exposure, a new stabilized balance was established. Reduced hexose levels in the shoot apex, reduced ratios of sugar levels between the apex and source leaves, and a higher apical sucrose/hexose ratio, associated with decreased activity and expression of cell wall invertase, indicate that prolonged chilling induced sugar accumulation in source leaves at the expense of reduced sugar transport to, and reduced sucrose utilization in, the shoot. This was associated with reduced levels of indole-3-acetic acid and abscisic acid in the apex and high numbers of differentially expressed, particularly up-regulated, genes, especially in the source leaves, including those regulating histones, ethylene action, transcription factors, and a jasmonate-ZIM-domain protein. Transcripts of one Jumonji C domain-containing protein and one expansin accumulated in source leaves throughout the chilling period.
The results reveal a dynamic and complex disturbance of plant function in response to mild chilling, opening new perspectives for the comparative analysis of differently tolerant cultivars.
Abstract:
This thesis is an examination of the ASEAN's prospects for establishing regional competition policy in the Southeast Asian region, a topic of contemporary relevance in light of the ASEAN's recent foray into economic integration on 31 December 2015. It asks whether the current approach undertaken by the ASEAN could contribute to an effective regional competition policy under regional market integration. In answering this question, the thesis first critically surveys the current terrain of regional competition laws and policies in order to determine the possible existence of an optimal template. It argues that although the EU model is often used as a source of inspiration, each regional organisation conceives different configurations of the model in order to best adjust to the local regional context. The thesis inquires into the narratives of the ASEAN's competition policy, as well as the ASEAN's specific considerations in the development of competition policy, before comparing the findings to the actual approaches taken by the ASEAN in its pursuit of regional competition policy. The thesis reveals that the actual approach taken by the ASEAN demonstrates an important discrepancy with the economic integration goal. The ASEAN applies a soft harmonisation approach to substantive competition law while refraining from establishing a centralised or representative institution. The sole organ with regard to competition policy at the regional level is an expert organ. The thesis also investigates the reception of the ASEAN's regional policy by the member states in order to ascertain the possibility of achieving the ASEAN's aspiration of regional competition policy. The study reveals that despite some shared similarities in the broad principles of competition law amongst the member states, the various competition law regimes are not harmonised, thus creating a challenging obstacle to the ASEAN's ambition.
The thesis then concludes that the ASEAN’s approach to regional competition law is unlikely to be effective.
Abstract:
Cranial cruciate ligament (CCL) deficiency is the leading cause of lameness affecting the stifle joints of large-breed dogs, especially Labrador Retrievers. Although CCL disease has been studied extensively, its exact pathogenesis and the primary cause leading to CCL rupture remain controversial. However, weakening secondary to repetitive microtrauma is currently believed to cause the majority of CCL instabilities diagnosed in dogs. Techniques of gait analysis have become the most productive tools to investigate normal and pathological gait in human and veterinary subjects. The inverse dynamics analysis approach models the limb as a series of connected linkages and integrates morphometric data to yield information about the net joint moment, patterns of muscle power and joint reaction forces. The results of these studies have greatly advanced our understanding of the pathogenesis of joint diseases in humans. A muscular imbalance between the hamstring and quadriceps muscles has been suggested as a cause for anterior cruciate ligament rupture in female athletes. Based on these findings, neuromuscular training programs leading to a relative risk reduction of up to 80% have been designed. In spite of the cost and morbidity associated with CCL disease and its management, very few studies have focused on the inverse dynamics gait analysis of this condition in dogs. The general goals of this research were (1) to further define gait mechanisms in Labrador Retrievers with and without CCL deficiency, (2) to identify individual dogs that are susceptible to CCL disease, and (3) to characterize their gait. The mass, location of the center of mass (COM), and mass moment of inertia of hind limb segments were calculated using a noninvasive method based on computed tomography of normal and CCL-deficient Labrador Retrievers.
Regression models were developed to determine predictive equations for estimating body segment parameters on the basis of simple morphometric measurements, providing a basis for nonterminal studies of the inverse dynamics of the hind limbs in Labrador Retrievers. Kinematic, ground reaction force (GRF) and morphometric data were combined in an inverse dynamics approach to compute hock, stifle and hip net moments, powers and joint reaction forces (JRF) during trotting in normal, CCL-deficient or sound contralateral limbs. Reductions in joint moment, power, and loads observed in CCL-deficient limbs were interpreted as modifications adopted to reduce or avoid painful mobilization of the injured stifle joint. Lameness resulting from CCL disease predominantly affected reaction forces during the braking phase and extension during push-off. Kinetics also identified a greater joint moment and power in the contralateral limbs compared with normal limbs, particularly of the stifle extensor muscle group, which may correlate with the lameness observed, but also with the predisposition of contralateral limbs to CCL deficiency in dogs. For the first time, surface EMG patterns of major hind limb muscles during the trotting gait of healthy Labrador Retrievers were characterized and compared with kinetic and kinematic data of the stifle joint. The use of surface EMG highlighted the co-contraction patterns of the muscles around the stifle joint, which were documented during transition periods between flexion and extension of the joint, but also during the flexion observed in the weight-bearing phase. Identification of possible differences in EMG activation characteristics between healthy patients and dogs with, or predisposed to, orthopedic and neurological disease may help in understanding the neuromuscular abnormalities and gait mechanics of such disorders in the future.
Conformation parameters, obtained from femoral and tibial radiographs, hind limb CT images, and dual-energy X-ray absorptiometry, of hind limbs predisposed to CCL deficiency were compared with the conformation parameters of hind limbs at low risk. A combination of the tibial plateau angle and femoral anteversion angle measured on radiographs was determined to be optimal for discriminating predisposed and non-predisposed limbs for CCL disease in Labrador Retrievers, using a receiver operating characteristic curve analysis method. In the future, the tibial plateau angle (TPA) and femoral anteversion angle (FAA) may be used to screen dogs suspected of being susceptible to CCL disease. Lastly, kinematics and kinetics across the hock, stifle and hip joints in Labrador Retrievers presumed to be at low risk based on their radiographic TPA and FAA were compared to gait data from dogs presumed to be predisposed to CCL disease, for both overground and treadmill trotting gait. For overground trials, the extensor moment at the hock and the energy generated around the hock and stifle joints were increased in predisposed limbs compared to non-predisposed limbs. For treadmill trials, dogs qualified as predisposed to CCL disease held their stifle at a greater degree of flexion, extended their hock less, and generated more energy around the stifle joints while trotting on a treadmill compared with dogs at low risk. This characterization of the gait mechanics of Labrador Retrievers at low risk of or predisposed to CCL disease may help in developing and monitoring preventive exercise programs to decrease gastrocnemius dominance and strengthen the hamstring muscle group.
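The net-joint-moment computation at the heart of inverse dynamics can be illustrated with a minimal 2-D Newton-Euler step for a single limb segment (illustrative quantities, not the study's canine morphometric data):

```python
def net_joint_moment(mass, inertia, com, distal, proximal,
                     acc, alpha, f_distal, m_distal, g=9.81):
    """Net force and moment at the proximal joint of one segment.

    com, distal, proximal: 2-D points (x, y); acc: COM linear acceleration;
    alpha: angular acceleration; f_distal/m_distal: force and moment applied
    at the distal joint by the adjacent segment.
    """
    # Force balance: F_prox + F_dist - m*g*e_y = m*a
    f_prox = (mass * acc[0] - f_distal[0],
              mass * acc[1] + mass * g - f_distal[1])

    def cross(r, f):            # 2-D cross product (scalar)
        return r[0] * f[1] - r[1] * f[0]

    # Moment balance about the COM:
    # M_prox + M_dist + r_d x F_dist + r_p x F_prox = I * alpha
    r_d = (distal[0] - com[0], distal[1] - com[1])
    r_p = (proximal[0] - com[0], proximal[1] - com[1])
    m_prox = (inertia * alpha - m_distal
              - cross(r_d, f_distal) - cross(r_p, f_prox))
    return f_prox, m_prox
```

As a sanity check, a static vertical segment with no distal load yields a proximal force equal to the segment's weight and a zero net moment.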
Abstract:
Researchers frequently have to analyze scales in which some participants have failed to respond to some items. In this paper we focus on the exploratory factor analysis of multidimensional scales (i.e., scales that consist of a number of subscales) where each subscale is made up of a number of Likert-type items, and the aim of the analysis is to estimate participants' scores on the corresponding latent traits. We propose a new approach to deal with missing responses in such a situation that is based on (1) multiple imputation of non-responses and (2) simultaneous rotation of the imputed datasets. We applied the approach to a real dataset, where missing responses were artificially introduced following a real pattern of non-responses, and to a simulation study based on artificial datasets. The results show that our approach (specifically, Hot-Deck multiple imputation followed by Consensus Promin rotation) was able to successfully compute factor score estimates even for participants who have missing data.
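A minimal sketch of the hot-deck multiple-imputation step, assuming a simple best-agreement donor rule (the actual donor-selection procedure and the Consensus Promin rotation are not reproduced here):

```python
import random

def hot_deck_impute(data, n_imputations=5, seed=0):
    """Hot-deck multiple imputation for Likert-type responses.

    data: list of respondent records; None marks a non-response.  Each
    missing entry is filled from a donor who answered that item and agrees
    most with the recipient on the jointly observed items (ties broken at
    random).  Returns n_imputations completed copies of the data.
    """
    rng = random.Random(seed)

    def agreement(a, b):
        pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
        return -sum(abs(x - y) for x, y in pairs) if pairs else float("-inf")

    completed = []
    for _ in range(n_imputations):
        copy = [row[:] for row in data]
        for i, row in enumerate(copy):
            for j, val in enumerate(row):
                if val is None:
                    donors = [r for k, r in enumerate(data)
                              if k != i and r[j] is not None]
                    best = max(agreement(data[i], d) for d in donors)
                    pool = [d for d in donors if agreement(data[i], d) == best]
                    row[j] = rng.choice(pool)[j]
        completed.append(copy)
    return completed

scale = [[5, 4, None], [5, 4, 5], [1, 2, 1]]   # respondent 0 skipped item 3
imputed = hot_deck_impute(scale)
```

Each completed copy would then be factor-analyzed, with the resulting loading matrices rotated simultaneously.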
Abstract:
This dissertation research points out major challenges with current Knowledge Organization (KO) systems, such as subject gateways and web directories: (1) the current systems use traditional knowledge organization schemes based on controlled vocabulary, which is not well suited to web resources, and (2) information is organized by professionals rather than by users, which means it does not reflect users' intuitively and instantaneously expressed current needs. In order to explore users' needs, I examined social tags, which are user-generated uncontrolled vocabulary. As investment in professionally developed subject gateways and web directories diminishes (support for both BUBL and Intute, examined in this study, is being discontinued), understanding the characteristics of social tagging becomes even more critical. Several researchers have discussed social tagging behavior and its usefulness for classification or retrieval; however, further research is needed to qualitatively and quantitatively investigate social tagging in order to verify its quality and benefit. This research particularly examined the indexing consistency of social tagging in comparison to professional indexing to examine the quality and efficacy of tagging. The data analysis was divided into three phases: analysis of indexing consistency, analysis of tagging effectiveness, and analysis of tag attributes. Most indexing consistency studies have been conducted with a small number of professional indexers, and they tended to exclude users. Furthermore, these studies have mainly focused on physical library collections. This dissertation research bridged these gaps by (1) extending the scope of resources to various web documents indexed by users and (2) employing the Information Retrieval (IR) Vector Space Model (VSM) based indexing consistency method, since it is suitable for dealing with a large number of indexers.
As a second phase, an analysis of tagging effectiveness in terms of tagging exhaustivity and tag specificity was conducted to ameliorate the drawbacks of consistency analysis based only on quantitative measures of vocabulary matching. Finally, to investigate tagging patterns and behaviors, a content analysis of tag attributes was conducted based on the FRBR model. The findings revealed that there was greater consistency over all subjects among taggers compared to that for the two groups of professionals. Examination of the exhaustivity and specificity of social tags provided insights into particular characteristics of tagging behavior and its variation across subjects. To further investigate the quality of tags, a Latent Semantic Analysis (LSA) was conducted to determine to what extent tags are conceptually related to professionals' keywords; it was found that tags of higher specificity tended to have a higher semantic relatedness to professionals' keywords. This leads to the conclusion that a term's power as a differentiator is related to its semantic relatedness to documents. The findings on tag attributes identified important bibliographic attributes of tags beyond describing the subjects or topics of a document. The findings also showed that tags have essential attributes matching those defined in FRBR. Furthermore, in terms of specific subject areas, the findings identified, for the first time, that taggers exhibited different tagging behaviors, with distinctive features and tendencies, on web documents characterizing digital heterogeneous media resources.
These results have led to the conclusion that there should be an increased awareness of diverse user needs by subject in order to improve metadata in practical applications. This dissertation research is the first necessary step to utilize social tagging in digital information organization by verifying the quality and efficacy of social tagging. This dissertation research combined both quantitative (statistics) and qualitative (content analysis using FRBR) approaches to vocabulary analysis of tags which provided a more complete examination of the quality of tags. Through the detailed analysis of tag properties undertaken in this dissertation, we have a clearer understanding of the extent to which social tagging can be used to replace (and in some cases to improve upon) professional indexing.
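The VSM-based indexing-consistency measure reduces, in its simplest form, to the cosine between two indexers' term vectors; a sketch (hypothetical tag sets):

```python
import math
from collections import Counter

def vsm_consistency(terms_a, terms_b):
    """Indexing consistency as the cosine between two term-frequency
    vectors, the VSM-style measure suited to large indexer groups."""
    va, vb = Counter(terms_a), Counter(terms_b)
    vocab = set(va) | set(vb)
    dot = sum(va[t] * vb[t] for t in vocab)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

tagger = ["python", "tutorial", "programming"]
professional = ["python", "programming", "education"]
consistency = vsm_consistency(tagger, professional)   # 2/3 here
```

Averaging this cosine over many tagger pairs, or between each tagger and a professional vector, gives a group-level consistency score.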
Abstract:
Kidney transplantation has been recognised as the optimal treatment choice for most end-stage renal disease patients, and increases in allograft survival rates are achieved through the refinement of novel immunosuppressive agents. Chronic Graft Disease (CGD) is a multifactorial process that likely includes a combination of immunological, apoptotic and inflammatory factors. The application of individualised immunosuppressive therapies will also depend on the identification of risk factors that can influence chronic disease. Despite being the subject of several independent studies, investigations of the relationship between transforming growth factor-β1 (TGF-β1) polymorphisms and kidney graft outcome continue to be plagued by contradictory conclusions.
Abstract:
Marketization has changed the education system. If we say that education is a market, this transforms the understanding of education and influences how people act. In this paper, adult-education school-leaders’ talk is analysed and seven metaphors for education are found: education as administration, market, matching, democracy, policy work, integration and learning. Exploring empirical metaphors provides a rich illustration of coinciding meanings. In line with studies on policy texts, economic metaphors are found to dominate. This should be understood not only as representing liberal ideology, as is often discussed in analyses of policy papers, but also as representing economic theory. In other words, contemporary adult education can be understood as driven by economic theories. The difference and relation between ideology and theory should be further examined since they have an impact on our society and on our everyday lives. (DIPF/Orig.)
Abstract:
The selection of optimal operating conditions for an industrial acrylonitrile recovery unit was conducted by systematic application of the response surface methodology, with minimum energy consumption as the objective and product specifications as process constraints. Unit models and the plant simulation were validated against operating data and information. A sensitivity analysis was carried out in order to identify the set of parameters that most strongly affect the trajectories of the system while keeping products within specification. The results suggest that energy savings of up to 10% are possible by systematically adjusting operating conditions.
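A toy one-variable response-surface step of the kind described: fit a quadratic to simulated energy readings and take its stationary point as the candidate optimum (illustrative data; the study fits multivariate surfaces to plant simulations under product-specification constraints):

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the 3x3 normal
    equations, solved by Gaussian elimination."""
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sy(0), sy(1), sy(2)]
    for i in range(3):                       # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, 3))) / A[i][i]
    return coef                              # a, b, c

xs = [0.8, 0.9, 1.0, 1.1, 1.2]                   # e.g. normalised reflux ratio
ys = [(x - 1.05) ** 2 * 40 + 12 for x in xs]     # synthetic energy surface
a, b_, c = fit_quadratic(xs, ys)
x_opt = -b_ / (2 * c)                            # stationary point: 1.05
```

In practice the surface is fitted over several factors from a designed set of simulation runs, and the optimum is searched subject to the specification constraints.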
Abstract:
The square root velocity framework is a method in shape analysis to define a distance between curves and functional data. Identifying two curves if they differ by a reparametrization leads to the quotient space of unparametrized curves. In this paper we study analytical and topological aspects of this construction for the class of absolutely continuous curves. We show that the square root velocity transform is a homeomorphism and that the action of the reparametrization semigroup is continuous. We also show that, given two $C^1$-curves, there exist optimal reparametrizations realising the minimal distance between the unparametrized curves represented by them.
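The square root velocity transform itself, q = c'/sqrt(|c'|), is easy to sketch on discretized curves (finite differences stand in for the derivative; the points and step size below are illustrative):

```python
import math

def srvt(curve, dt):
    """Discrete square root velocity transform of a planar curve:
    q = c' / sqrt(|c'|), with c' approximated by forward differences."""
    q = []
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        speed = math.hypot(vx, vy)
        if speed == 0:
            q.append((0.0, 0.0))        # SRVT of a constant piece is 0
        else:
            r = math.sqrt(speed)
            q.append((vx / r, vy / r))
    return q

# Unit-speed straight line: |c'| = 1, so q coincides with c'.
n = 10
line = [(i / n, 0.0) for i in range(n + 1)]
q = srvt(line, 1.0 / n)
```

The distance between two curves in this framework is then the L2 distance between their SRVTs, minimized over reparametrizations.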
Abstract:
PV energy is the direct conversion of solar radiation into electricity. In this paper, an analysis of the influence of parameters such as global irradiance and temperature on the performance of a PV installation has been carried out. A PV module was installed in a building at the University of Málaga, and these parameters were experimentally determined for different days and different conditions of irradiance and temperature. Moreover, I-V curves were obtained under these conditions to determine the open-circuit voltage and the short-circuit current of the module. With this information, and using the first law of thermodynamics, an energy analysis was performed to determine the energy efficiency of the installation. Similarly, using the second law of thermodynamics, an exergy analysis was used to obtain the exergy efficiency. The results show that the energy efficiency varies between 10% and 12% and the exergy efficiency between 14% and 17%. It was concluded that exergy analysis is more suitable for studying the performance, and that only electric exergy should be considered as useful exergy. This exergy efficiency can be improved if heat is removed from the PV module surface and an optimal temperature is reached.
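The first- and second-law efficiencies can be sketched as follows (illustrative formulas and numbers, assuming electric power counts as pure exergy and a Carnot-type factor for the exergy of solar radiation; the paper's measured definitions and values may differ):

```python
def energy_efficiency(p_max, irradiance, area):
    """First-law efficiency: electrical output over incident solar power."""
    return p_max / (irradiance * area)

def exergy_efficiency(p_max, irradiance, area, t_amb, t_sun=5777.0):
    """Second-law efficiency: electric power (pure exergy) divided by the
    exergy content of the incident radiation, approximated with a
    Carnot-type factor 1 - T_amb/T_sun."""
    solar_exergy = irradiance * area * (1.0 - t_amb / t_sun)
    return p_max / solar_exergy

p_max = 180.0        # W at the maximum power point (hypothetical module)
G, A = 1000.0, 1.6   # irradiance in W/m^2, module area in m^2
eta = energy_efficiency(p_max, G, A)           # first-law efficiency
psi = exergy_efficiency(p_max, G, A, 298.15)   # second-law efficiency
```

Because the exergy of solar radiation is smaller than its energy, the exergy efficiency computed this way always exceeds the energy efficiency for the same electrical output.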
Abstract:
Libraries, since their inception 4,000 years ago, have been in a process of constant change. Although changes came slowly for centuries, in recent decades academic libraries have been continuously striving to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource-allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. Firstly, a holistic structure and the required toolset to holistically assess academic libraries are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: analyzing library performance and the costs incurred and resources consumed by library services. The second quadrant evaluates the external perspective of the library system: users' perception of service quality is assessed in this quadrant. The third quadrant analyses the external perspective of the library collection: evaluating the impact of the current library collection on its users.
Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns followed in manipulating the library collection are analyzed. With a complete framework for data collection in place, these data, which come from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. A data warehousing approach is secondly designed and implemented to integrate, process, and store the holistically collected data. Ultimately, strategic data stored in the data warehouse are analyzed and used for different purposes, including the following: 1) Data visualization and reporting are proposed to allow library managers to publish library indicators in a simple and quick manner by using online reporting tools. 2) Sophisticated data analysis is recommended through the use of data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These data mining techniques have been applied to the case study in the following manner: predicting future investment in library development; finding clusters of users that share common interests and similar profiles but belong to different faculties; and predicting library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) Input for optimization models: early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study. Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model for the problem is defined with the objective of maximizing the usage of the digital collection over all library divisions subject to a single collection budget.
By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers to make economic decisions based on an “as realistic as possible” perspective of the library situation.
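In its continuous relaxation, the digital-collection allocation problem described above reduces to funding divisions in decreasing order of expected usage per monetary unit; a greedy sketch (hypothetical figures, assuming usage scales linearly with funding):

```python
def allocate_budget(divisions, budget):
    """Greedy continuous-relaxation solution: fund divisions in decreasing
    order of expected usage per monetary unit until the single collection
    budget is exhausted.  divisions: list of (name, cost, expected_usage)."""
    ranked = sorted(divisions, key=lambda d: d[2] / d[1], reverse=True)
    allocation, total_usage, remaining = {}, 0.0, budget
    for name, cost, usage in ranked:
        spend = min(cost, remaining)
        allocation[name] = spend
        total_usage += usage * (spend / cost)   # usage scales with funding
        remaining -= spend
        if remaining <= 0:
            break
    return allocation, total_usage

divisions = [("Engineering", 40000, 12000),
             ("Medicine",    60000, 21000),
             ("Humanities",  30000,  6000)]
alloc, usage = allocate_budget(divisions, 80000)
```

The study's actual model is an explicit optimization program; the greedy rule above only illustrates the budget/usage trade-off in the simplest linear case.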
Abstract:
This paper presents the results of a research project that aimed to identify optimal performance standards for Brazilian public and philanthropic hospitals. To carry out the analysis, a model based on Data Envelopment Analysis (DEA) was developed. We collected financial data from hospitals' financial statements available on the internet, as well as operational data from the Information Technology Department of the Brazilian Public Health Care System – SUS (DATASUS). Data from 18 hospitals from 2007 to 2011 were analyzed. Our DEA model used both operational and financial indicators (variables). In this model, two indicators were considered inputs: the value (in Brazilian Reais) of Fixed Assets, and Planned Capacity. The following indicators were considered outputs: Net Margin, Return on Assets and Institutional Mortality Rate. Under the proposed model, five hospitals showed optimal performance and four hospitals were considered inefficient over the analyzed period. Analysis of the weights indicated the most relevant variables for determining efficiency and the scale-variable values, an important tool to aid decision-making by hospital managers. Finally, the scale variables determined the returns to scale of production, indicating that 14 hospitals operate under scale diseconomies. This may indicate inefficiency in the resource management of the Brazilian public health-care system, as judged by this set of proposed variables.
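In the single-input, single-output special case, the DEA (CCR) efficiency of each decision-making unit reduces to its output/input ratio scaled by the best ratio in the sample; a sketch (hypothetical hospitals; the study's multi-input, multi-output model requires solving one linear program per unit):

```python
def dea_single_ratio(units):
    """DEA efficiency in the 1-input/1-output special case.

    units: dict name -> (input, output).  Each unit's efficiency is its
    output/input ratio divided by the best ratio, so frontier units get 1.
    """
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

hospitals = {"H1": (100.0, 80.0),
             "H2": (120.0, 60.0),
             "H3": (90.0, 81.0)}
eff = dea_single_ratio(hospitals)   # H3 defines the efficient frontier here
```

With multiple inputs and outputs, the same idea becomes a weighted ratio whose weights are chosen per unit by a linear program, which is what makes DEA able to combine operational and financial indicators.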
Abstract:
The main goal of this paper is to analyse the sensitivity of a vector convex optimization problem to variations in the right-hand side. We study the quantitative behavior of a certain set of Pareto optimal points, characterized as those that become minima when the objective function is composed with a positive function. Their behavior is analysed quantitatively using the circatangent derivative for set-valued maps. In particular, it is shown that the sensitivity is closely related to a Lagrange multiplier solution of a dual program.
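For reference, the circatangent derivative is the standard notion from set-valued analysis: the set-valued map whose graph is the Clarke (circatangent) tangent cone to the graph of the map (a recalled textbook definition, not a result of the paper):

```latex
% Circatangent (Clarke) tangent cone to a set S at \bar{x}:
T_C(S,\bar{x}) = \{ v : \forall\, x_n \to \bar{x},\ x_n \in S,\ \forall\, t_n \downarrow 0,\
\exists\, v_n \to v \text{ such that } x_n + t_n v_n \in S \}

% Circatangent derivative of a set-valued map F at (\bar{x},\bar{y}) \in \operatorname{graph} F:
\operatorname{graph} CF(\bar{x},\bar{y}) = T_C\bigl(\operatorname{graph} F,\ (\bar{x},\bar{y})\bigr)
```

Because the Clarke cone is always convex, this derivative is well suited to quantitative sensitivity statements of the kind the abstract describes.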