951 results for vector quantization based Gaussian modeling


Relevance: 40.00%

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user who is in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to those actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices, and in the analysis of network traffic.
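The n-gram monitoring idea described above can be sketched in a few lines: build a frequency profile of action n-grams from a user's historical web-log sessions, then score a new session by the fraction of its n-grams never seen in the profile. This is a minimal illustration under assumed inputs, not the Intruder Detector implementation; the action names are hypothetical.

```python
from collections import Counter

def ngram_profile(actions, n=2):
    """Build a relative-frequency profile of n-grams over a user's action sequence."""
    grams = [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def deviation_score(profile, actions, n=2):
    """Fraction of n-grams in a new session that are unseen in the trained profile."""
    grams = [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]
    unseen = sum(1 for g in grams if g not in profile)
    return unseen / len(grams) if grams else 0.0

# Hypothetical web-log action sequences
train = ["login", "search", "view", "search", "view", "logout"]
profile = ngram_profile(train)
normal_score = deviation_score(profile, ["login", "search", "view", "logout"])
odd_score = deviation_score(profile, ["login", "export", "export", "delete"])
# A large deviation score flags possibly malicious or unintended behavior
```

A real deployment would compare the score against a per-role or per-user threshold learned from held-out sessions.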

Relevance: 40.00%

Abstract:

Reliability and dependability modeling can be employed during many stages of analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially with the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition, whereby precomputed subpaths are composed to compute whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths; evaluating the important ones helps to compute tight bounds efficiently and quickly.
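A minimal sketch of the uniformization idea the dissertation builds on: the CTMC is converted to a discrete-time chain at a uniform rate, and the transient distribution is a Poisson-weighted sum over path lengths; truncating the sum at K terms yields a lower bound on each transient probability. The two-state model and its rates are hypothetical, chosen only so the result can be checked against the analytic solution.

```python
import math

def uniformize(Q, pi0, t, K):
    """Transient distribution of a CTMC by uniformization, truncated at K terms.

    Q: generator matrix (list of lists); pi0: initial distribution.
    The truncated Poisson-weighted sum is a lower bound on each probability."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) or 1.0
    # Uniformized DTMC: P = I + Q / lam
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)] for i in range(n)]
    v = list(pi0)
    out = [0.0] * n
    for k in range(K + 1):
        w = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
        for i in range(n):
            out[i] += w * v[i]
        # advance one DTMC step: v = v P
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
    return out

# Hypothetical 2-state repairable component: failure rate 1.0, repair rate 2.0
Q = [[-1.0, 1.0], [2.0, -2.0]]
lower = uniformize(Q, [1.0, 0.0], t=1.0, K=40)
```

The path-based method in the dissertation avoids the explicit solution vector `v`; this sketch keeps it for clarity on a tiny state space.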

Relevance: 40.00%

Abstract:

Normal grain growth of calcite was investigated by combining grain size analysis of calcite across the contact aureole of the Adamello pluton with grain growth modeling based on a thermal model of the surroundings of the pluton. In an unbiased model system, i.e., with location-dependent variations in the temperature-time path, 2/3 of grain growth occurs during prograde and 1/3 during retrograde metamorphism at all locations. In contrast to this idealized situation, three groups can be distinguished in the field example, characterized by variations in their grain size versus temperature relationships: Group I occurs at low temperatures, and the grain size remains constant because nano-scale second-phase particles of organic origin inhibit grain growth in the calcite aggregates under these conditions. In the presence of an aqueous fluid, these second phases decay at a temperature of about 350 °C, enabling the onset of grain growth in calcite. In the following growth period, fluid-enhanced group II and slower group III growth occurs. Group II shows a continuous and intense grain size increase with T, while grain growth decreases with T for group III. None of the observed trends correlate with experimentally based grain growth kinetics, probably due to differences between nature and experiment which have not yet been investigated (e.g., porosity, second phases). Therefore, grain growth modeling was used to iteratively improve the correlation between measured and modeled grain sizes by optimizing the activation energy (Q), pre-exponential factor (k0) and grain size exponent (n). For n = 2, Q of 350 kJ/mol with k0 of 1.7 × 10^21 μm^n/s, and Q of 35 kJ/mol with k0 of 2.5 × 10^-5 μm^n/s, were obtained for groups II and III, respectively. With respect to future work, field-data-based grain growth modeling might be a promising tool for investigating the influences of secondary effects like porosity and second phases on grain growth in nature, and for unraveling differences between nature and experiment.
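The fitted parameters plug into the standard normal grain growth law, d^n − d0^n = k0·exp(−Q/(R·T))·t. A small sketch using the group II values from the abstract; the starting grain size and annealing duration are hypothetical, for illustration only.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def grown_size(d0_um, t_s, T_K, n, Q, k0):
    """Normal grain growth law: d^n - d0^n = k0 * exp(-Q / (R * T)) * t.

    d0_um: initial grain size (μm); t_s: time (s); T_K: temperature (K);
    Q in J/mol, k0 in μm^n/s. Returns the final grain size in μm."""
    return (d0_um ** n + k0 * math.exp(-Q / (R * T_K)) * t_s) ** (1.0 / n)

# Group II parameters: n = 2, Q = 350 kJ/mol, k0 = 1.7e21 μm^n/s.
# Hypothetical scenario: 10 μm starting size, ~1 Myr at 350 °C (623.15 K).
d_final = grown_size(d0_um=10.0, t_s=3.15e13, T_K=623.15, n=2, Q=350e3, k0=1.7e21)
```

Isothermal holding is a simplification; the study integrates this law along the modeled temperature-time path of each sample location.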

Relevance: 40.00%

Abstract:

The vapor-liquid equilibrium of water + ionic liquid systems is relevant for a wide range of applications of these compounds. It is usually measured by ebulliometric techniques, but these are time-consuming and expensive. In this work it is shown that the activity coefficients of water in a series of cholinium-based ionic liquids can be reliably and quickly estimated at 298.15 K using a humidity meter instrument. The cholinium-based ionic liquids were chosen to test this experimental methodology because data for the water activities of quaternary ammonium salts are available in the literature, allowing validation of the proposed technique. The COSMO-RS method provides a reliable description of the data and was also used to understand the molecular interactions occurring in these binary systems. The estimated excess enthalpies indicate that hydrogen bonding between water and the ionic liquid anion is the dominant interaction governing the behavior of water + cholinium-based ionic liquid systems, while electrostatic-misfit and van der Waals forces make a minor contribution to the total excess enthalpies.
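The measurement principle can be stated compactly: at equilibrium, the water activity over the mixture equals the fractional relative humidity read by the meter, and the activity coefficient follows as γw = aw/xw. The RH reading and mole fraction below are hypothetical values for illustration.

```python
def water_activity_from_rh(rh_percent):
    """At equilibrium the water activity equals the fractional relative humidity."""
    return rh_percent / 100.0

def activity_coefficient(a_w, x_w):
    """gamma_w = a_w / x_w for water in a water + ionic liquid mixture."""
    return a_w / x_w

# Hypothetical reading: 85% RH over a mixture with water mole fraction 0.90
a_w = water_activity_from_rh(85.0)
gamma_w = activity_coefficient(a_w, 0.90)  # gamma_w < 1 indicates negative deviation
```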

Relevance: 40.00%

Abstract:

Master's dissertation, Universidade de Brasília, Departamento de Administração, Programa de Pós-graduação em Administração, 2016.

Relevance: 40.00%

Abstract:

Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. In the first phase of the study, a proper model, including all details of the components, was required. Therefore, advances in EMC modeling were studied, classifying analytical and numerical models. The selected approach was finite element (FE) modeling, coupled with the distributed network method, to generate models of the converter's components and obtain the frequency behavioral model of the converter. The method has the ability to reveal the behavior of parasitic elements and higher resonances, which have critical impacts in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovation in equivalent source modeling was performed to decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based PSO method was used as the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. The identification was implemented using an ANN for seventy different faulty cases. The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using the stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
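The identification step rests on comparing amplitudes of stray-field harmonics. A minimal, hypothetical sketch (synthetic signals, not the measured fields): extract one harmonic's amplitude with a single-bin DFT and compare a healthy case against a faulty one that has a raised harmonic.

```python
import math

def harmonic_amplitude(signal, fs, f):
    """Amplitude of frequency f in a uniformly sampled signal via a single-bin DFT."""
    N = len(signal)
    re = sum(s * math.cos(2 * math.pi * f * k / fs) for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * k / fs) for k, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / N

fs = 10_000.0  # sampling rate, Hz
t = [k / fs for k in range(10_000)]  # 1 s of hypothetical stray-field samples
healthy = [math.sin(2 * math.pi * 50 * x) for x in t]
faulty = [math.sin(2 * math.pi * 50 * x) + 0.3 * math.sin(2 * math.pi * 150 * x) for x in t]
# A raised 150 Hz harmonic amplitude distinguishes the faulty case
amp_faulty = harmonic_amplitude(faulty, fs, 150)
amp_healthy = harmonic_amplitude(healthy, fs, 150)
```

In the thesis the comparison is performed over many harmonics and fed to an ANN; this sketch shows only the amplitude feature itself.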

Relevance: 40.00%

Abstract:

Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce the study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVM for HCI is discussed, and critical comparisons with other classifiers are reported.

Relevance: 40.00%

Abstract:

Today, the contribution of the transportation sector to greenhouse gas emissions is evident. The fast consumption of fossil fuels and its impact on the environment has given a strong impetus to the development of vehicles with better fuel economy. Hybrid electric vehicles fit into this context with different targets, ranging from the reduction of emissions and fuel consumption to performance and comfort enhancement. Vehicles exist with various missions; super sport cars usually aim to reach peak performance and to guarantee a great driving experience, but great attention must also be paid to fuel consumption. Depending on the vehicle mission, hybrid vehicles can differ in the powertrain configuration and the choice of the energy storage system. Lamborghini has recently invested in the development of hybrid super sport cars, for performance and comfort reasons, with the possibility of reducing fuel consumption. This research activity has been conducted as a joint collaboration between the University of Bologna and the sports car manufacturer, to analyze the impact of innovative energy storage solutions on hybrid vehicle performance. Capacitors have been studied and modeled to analyze the pros and cons of such a solution with respect to batteries. To this aim, a full simulation environment has been developed and validated to provide a concept design tool capable of precise results and able to predict the longitudinal performance on regulated emission cycles and in real driving conditions, with a focus on fuel consumption. In addition, the target of the research activity is to deepen the study of hybrid electric super sports cars in the concept development phase, focusing on defining the control strategies and the energy storage technology that best suit the needs of the vehicles. This dissertation covers the key steps that have been carried out in the research project.
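The kind of concept-phase simulation described can be illustrated by a backward-facing energy estimate over a speed trace: sum the positive wheel power from aerodynamic, rolling and inertial forces. The vehicle parameters and trace below are hypothetical, not manufacturer data.

```python
def cycle_energy(speeds_mps, dt, mass, cd_a, crr, rho=1.2, g=9.81):
    """Backward-facing estimate of traction energy (J) over a speed trace.

    Sums positive wheel power (aero + rolling + inertia) over the cycle;
    braking phases contribute zero here, though a hybrid could recuperate part."""
    e = 0.0
    for v0, v1 in zip(speeds_mps, speeds_mps[1:]):
        v = 0.5 * (v0 + v1)              # mean speed over the step
        a = (v1 - v0) / dt               # acceleration
        force = 0.5 * rho * cd_a * v * v + crr * mass * g + mass * a
        p = force * v
        if p > 0:
            e += p * dt
    return e

# Hypothetical super-sport-car parameters: 1600 kg, CdA = 0.7 m^2, Crr = 0.012
trace = [0, 5, 10, 15, 20, 20, 20, 15, 10, 5, 0]  # m/s, 1 s steps
E = cycle_energy(trace, dt=1.0, mass=1600.0, cd_a=0.7, crr=0.012)
```

Dividing `E` by an assumed powertrain efficiency and fuel energy density would give a rough fuel-consumption figure for the cycle.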

Relevance: 40.00%

Abstract:

Vaults are architectural elements that, throughout construction history, have been built with a great variety of materials, shapes, and sizes. The shape of these structural elements was often dictated by the necessity to cover complex spaces, by the required loading capacity, or by architectural aesthetics. Within this complex scenario, masonry patterns also generate different effects on the loading capacity, load percolation, and stiffness of the structure. These effects have been extensively investigated, both with empirical observations and with modern numerical methods. While most studies focus on analyzing the load-bearing capacity or the texture effect on vaulted structures, the aim of this analysis is to investigate the effects of the variation of a single structural characteristic on the load percolation in the vault. An additional purpose of the work is the coding of a parametric model for generating different masonry vaulted structures. The proposed script can generate different typologies of vaulted structures based on structural characteristics such as the span, the length to cover, and the dimensions of the blocks.
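A parametric generator of the kind described might, for a semicircular barrel vault, lay out block centroids from a handful of structural characteristics (span, length, block counts). The geometry below is a hypothetical sketch, not the thesis script.

```python
import math

def barrel_vault_blocks(span, length, n_arch, n_rows):
    """Centroid coordinates of blocks for a semicircular barrel vault.

    span: clear width; length: vault depth; n_arch: voussoirs per arch ring;
    n_rows: rings along the length. Returns a list of (x, y, z) centroids."""
    r = span / 2.0
    blocks = []
    for row in range(n_rows):
        y = (row + 0.5) * length / n_rows          # position along the vault axis
        for i in range(n_arch):
            theta = math.pi * (i + 0.5) / n_arch   # 0..pi across the arch
            blocks.append((r * math.cos(theta), y, r * math.sin(theta)))
    return blocks

# Hypothetical vault: 4 m span, 6 m length, 9 voussoirs per ring, 12 rings
blocks = barrel_vault_blocks(4.0, 6.0, 9, 12)
```

Staggering `theta` on alternate rows would produce a running-bond pattern, the kind of single-characteristic variation whose effect on load percolation the analysis studies.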

Relevance: 40.00%

Abstract:

The coastal ocean is a complex environment with extremely dynamic processes that require a high-resolution, cross-scale modeling approach in which all hydrodynamic fields and scales are considered integral parts of the overall system. In the last decade, unstructured-grid models have been used to advance seamless modeling between scales. On the other hand, data assimilation methodologies to improve unstructured-grid models in the coastal seas have been developed only recently and need significant advancements. Here, we link unstructured-grid ocean modeling to variational data assimilation methods. In particular, we show results from the modeling system SANIFS, based on the SHYFEM fully-baroclinic unstructured-grid model interfaced with OceanVar, a state-of-the-art variational data assimilation scheme adopted for several systems based on a structured grid. OceanVar implements a 3DVar DA scheme. The background error covariance matrix is modeled by the combination of three linear operators. The vertical part is represented using multivariate EOFs for temperature, salinity, and sea level anomaly. The horizontal part is assumed to be Gaussian and isotropic, and is modeled using a first-order recursive filter algorithm designed for structured and regular grids. Here we introduce a novel recursive filter algorithm for unstructured grids. A local hydrostatic adjustment scheme models the rapidly evolving part of the background error covariance. We designed two data assimilation experiments using the SANIFS implementation interfaced with OceanVar over the period 2017-2018: one with only temperature and salinity assimilation from Argo profiles, and a second also including sea level anomaly. The results showed a successful implementation of the approach and the added value of the assimilation for the active tracer fields. Over the broad basin, no significant improvements are highlighted for the sea level, requiring future investigation. Furthermore, a machine learning methodology based on an LSTM network has been used to predict the model SST increments.
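The structured-grid building block being generalized, the first-order recursive filter, can be sketched in 1D: a forward and a backward sweep with coefficient alpha approximate Gaussian smoothing of an increment field. The coefficient and the field below are illustrative, not OceanVar code.

```python
def recursive_filter_1d(field, alpha, passes=2):
    """First-order recursive filter approximating Gaussian smoothing.

    alpha in (0, 1) sets the correlation length; each pass runs a forward
    and a backward sweep so the combined response is symmetric."""
    x = list(field)
    for _ in range(passes):
        # forward sweep
        for i in range(1, len(x)):
            x[i] = alpha * x[i - 1] + (1 - alpha) * x[i]
        # backward sweep
        for i in range(len(x) - 2, -1, -1):
            x[i] = alpha * x[i + 1] + (1 - alpha) * x[i]
    return x

# Smoothing a unit impulse spreads it into a Gaussian-shaped increment
impulse = [0.0] * 21
impulse[10] = 1.0
smoothed = recursive_filter_1d(impulse, alpha=0.5)
```

The regular index-based recursion above is exactly what breaks on an unstructured grid, which is why the paper introduces a novel recursive filter for that case.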

Relevance: 40.00%

Abstract:

Rhodamine B (RB) has been successfully exploited in the synthesis of light-harvesting systems, but since RB is prone to forming dimers that act as quenchers of the fluorescence, high energy transfer efficiencies can be reached only when using bulky and hydrophobic counterions acting as spacers between RBs. In this PhD thesis, a multiscale theoretical study aimed at providing insights into the structural, photophysical and optical properties of RB and its aggregates is presented. At the macroscopic level (no atomistic details), a phenomenological model describing the fluorescence decay of RB networks in the presence of both quenching from dimers and exciton-exciton annihilation is presented and analysed, showing that quenching from dimers affects the decay only at long times, a feature that can be exploited in global fitting analysis to determine relevant chemical and photophysical information. At the mesoscopic level (atomistic details but no electronic structure), the aggregation of RB in water in the presence of different counterions is studied with molecular dynamics (MD) simulations. A new force field has been parametrized to describe the flexibility of RB and the RB-RB interaction driving dimerization. The simulations correctly predict RB/counterion aggregation only in the presence of the bulky and hydrophobic counterion, and its ability to prevent dimerization. Finally, at the microscopic level, DFT calculations are performed to demonstrate the spacing action of bulky counterions, but standard TDDFT calculations are shown to fail in correctly describing the excited states of RB and its dimers. Moreover, standard procedures proposed in the literature for obtaining ad hoc functionals are also shown not to work properly. A detailed analysis of the effect of exact exchange shows that its short-range contribution is the crucial quantity for improving the results, and a new functional containing a proper amount of such exchange is proposed and successfully tested.
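A phenomenological decay of the kind described can be sketched as a rate equation with a linear term (intrinsic decay plus dimer quenching) and a quadratic exciton-exciton annihilation term, dn/dt = −(kf + kq)·n − γ·n². The rate constants below are hypothetical; a simple Euler integration shows the annihilation term accelerating the early decay, while the linear quenching dominates at long times.

```python
def excited_population(n0, k_f, k_q, gamma, t_end, dt=1e-3):
    """Euler integration of dn/dt = -(k_f + k_q) * n - gamma * n**2.

    k_f: intrinsic decay rate, k_q: quenching rate from dimer traps,
    gamma: exciton-exciton annihilation constant (matters only at high n)."""
    n = n0
    t = 0.0
    while t < t_end:
        n += dt * (-(k_f + k_q) * n - gamma * n * n)
        t += dt
    return n

# With annihilation active (high initial density), decay is faster than exponential
with_ann = excited_population(n0=1.0, k_f=1.0, k_q=0.2, gamma=5.0, t_end=1.0)
no_ann = excited_population(n0=1.0, k_f=1.0, k_q=0.2, gamma=0.0, t_end=1.0)
```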

Relevance: 30.00%

Abstract:

The El Niño-Southern Oscillation (ENSO) is a climatic phenomenon related to the inter-annual variability of global meteorological patterns, influencing sea surface temperature and rainfall variability. It influences human health indirectly through extreme temperature and moisture conditions that may accelerate the spread of some vector-borne viral diseases, like dengue fever (DF). This work examines the spatial distribution of the association between ENSO and DF in the countries of the Americas during 1995-2004, a period which includes the 1997-1998 El Niño, one of the most important climatic events of the 20th century. Data on the Southern Oscillation Index (SOI), indicating El Niño-La Niña activity, were obtained from the Australian Bureau of Meteorology. The annual DF incidence (AIy) by country was computed using Pan American Health Organization data. SOI and AIy values were standardised as deviations from the mean and plotted in bar-line graphs. The regression coefficients between SOI and AIy (rSOI,AI) were calculated and spatially interpolated by an inverse distance weighted algorithm. The results indicate that among the five years registering the highest numbers of cases (1998, 2002, 2001, 2003 and 1997), four had El Niño activity. In the southern hemisphere, the annual spatial weighted mean centre of epidemics moved southward, from 6° 31' S in 1995 to 21° 12' S in 1999, and the rSOI,AI values were negative in Cuba, Belize, Guyana and Costa Rica, indicating synchrony between higher DF incidence rates and higher El Niño activity. The rSOI,AI map allows visualisation of a graded surface with higher values of ENSO-DF association for Mexico, Central America, the northern Caribbean islands and the extreme north-northwest of South America.
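The statistical core of the analysis, standardizing SOI and incidence as deviations from the mean and computing a regression coefficient between them, can be sketched as follows. The yearly values are invented for illustration (negative SOI marks El Niño conditions, here paired with high incidence, so the coefficient comes out negative as in the synchronous countries).

```python
def standardize(xs):
    """Deviations from the mean in units of standard deviation (z-scores)."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - m) / sd for x in xs]

def regression_coefficient(x, y):
    """Slope of the least-squares fit of y on x."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Hypothetical yearly SOI and dengue incidence for one country
soi = [1.2, 0.5, -1.8, -0.9, 0.8]
incidence = [40.0, 55.0, 160.0, 120.0, 50.0]
r = regression_coefficient(standardize(soi), standardize(incidence))
```

Repeating this per country and interpolating the coefficients over a grid (e.g., with inverse distance weighting) would reproduce the mapping step of the paper.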

Relevance: 30.00%

Abstract:

In this work, all publicly accessible published findings on the heat resistance of Alicyclobacillus acidoterrestris in fruit beverages, as affected by temperature and pH, were compiled. Study characteristics (protocols, fruit and variety, °Brix, pH, temperature, heating medium, culture medium, inactivation method, strains, etc.) were then extracted from the primary studies, and some of them were incorporated into a meta-analysis mixed-effects linear model based on the basic Bigelow equation describing the heat resistance parameters of this bacterium. The model estimated mean D* values (the time needed for a one-log reduction at a temperature of 95 °C and a pH of 3.5) of Alicyclobacillus in beverages of different fruits, of two different concentration types, with and without bacteriocins, and with and without clarification. The zT values (the temperature change needed to cause a one-log reduction in D-values) estimated by the meta-analysis model were compared to the 'observed' zT values reported in the primary studies, and in all cases they were within the confidence intervals of the model. The model was capable of predicting the heat resistance parameters of Alicyclobacillus in fruit beverages beyond the types available in the meta-analytical data. It is expected that this compilation of the thermal resistance of Alicyclobacillus in fruit beverages will be of use to food quality managers in determining or validating the lethality of their current heat treatment processes.
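One common pH-extended form of the Bigelow model underlying such meta-analyses is log10 D = log10 D* − (T − T*)/zT − (pH − pH*)/zpH, with reference conditions T* = 95 °C and pH* = 3.5 as in the abstract. The parameter values below, and the sign convention of the pH term, are assumptions for illustration, not the fitted estimates of the study.

```python
import math

def d_value(D_star, T, pH, zT, zpH, T_star=95.0, pH_star=3.5):
    """Extended Bigelow model: log10 D = log10 D* - (T - T*)/zT - (pH - pH*)/zpH.

    D_star: D-value (min) at the reference T* and pH*; zT in degrees C,
    zpH in pH units. Returns the predicted D-value in minutes."""
    log_d = math.log10(D_star) - (T - T_star) / zT - (pH - pH_star) / zpH
    return 10.0 ** log_d

# Hypothetical parameters: D* = 2.5 min, zT = 10 degrees C, zpH = 3 pH units
d_ref = d_value(2.5, 95.0, 3.5, 10.0, 3.0)    # at reference conditions: D = D*
d_hot = d_value(2.5, 105.0, 3.5, 10.0, 3.0)   # 10 degrees hotter: one log lower D
```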

Relevance: 30.00%

Abstract:

Below-cloud scavenging processes have been investigated by combining a numerical simulation, local atmospheric conditions, and particulate matter (PM) concentrations at different sites in Germany. The below-cloud scavenging model was coupled with a bulk particulate matter counter (TSI Portacounter) dataset, allowing prediction of the variability of particulate air concentrations during chosen rain events. The TSI samples and meteorological parameters were obtained during three winter campaigns: at Deuselbach, March 1994, consisting of three different events; Sylt, April 1994; and Freiburg, March 1995. The results show a good agreement between modeled and observed air concentrations, emphasizing the quality of the conceptual model used in the below-cloud scavenging numerical modeling. The comparisons between modeled and observed data also yielded squared Pearson correlation coefficients above 0.7, all significant, except for the Freiburg campaign event. The differences between the numerical simulations and the observed dataset are explained by changes in wind direction and, perhaps, the absence of mass advection terms in the model. These results validate previous works based on the same conceptual model.
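Below-cloud scavenging is commonly idealized as first-order washout, dC/dt = −Λ·C, so that C(t) = C0·e^(−Λt) during a rain event. The scavenging coefficient and event duration below are hypothetical, chosen only to illustrate the magnitude of the effect.

```python
import math

def washout(c0, lam, t):
    """Below-cloud scavenging as first-order washout: C(t) = C0 * exp(-lam * t).

    c0: initial particulate concentration; lam: scavenging coefficient (1/s);
    t: elapsed rain duration (s)."""
    return c0 * math.exp(-lam * t)

# Hypothetical event: scavenging coefficient 1e-4 s^-1 over a 30-minute rain
c_after = washout(100.0, 1e-4, 1800.0)  # roughly a 17% reduction
```

In practice Λ depends on rain intensity and the particle size distribution, which is what the coupled model-counter comparison in the study evaluates.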