58 results for Computer forensic analysis
Abstract:
Thanks to recent advances in molecular biology, combined with an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem is how to validate such approaches and their results. This work presents an objective approach for the validation of gene network modeling and identification which comprises three main aspects: (1) Artificial Gene Network (AGN) model generation through theoretical models of complex networks, which is used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, founded on a feature selection approach in which a target gene is fixed and the expression profiles of all other genes are observed in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used to simulate temporal expression data. The results of the network identification method can then be compared with the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks have been assessed: the uniformly random Erdos-Renyi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabasi-Albert (BA), and geographical (GG) networks. The experimental results indicate that the inference method was sensitive to variations in the average degree k, its network recovery rate decreasing as k increased. The signal size was important for the accuracy of the network identification, with very good results obtained even for small expression profiles. However, the adopted inference method was not able to distinguish different structures of interaction among genes, presenting similar behavior when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for the validation of the inferred networks, identifying several properties of the evaluated method, and can be extended to other inference methods.
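As an illustration of the validation loop described above (not the authors' actual implementation), the following Python sketch generates an artificial Erdos-Renyi gene network with networkx, simulates a simple linear expression dynamics over it, infers edges with a naive correlation-based feature selection for each target gene, and scores the recovered network against the original; all parameter values are arbitrary assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# 1) Artificial gene network (AGN): directed Erdos-Renyi graph (assumed parameters).
n_genes, avg_degree, timesteps = 50, 3, 30
g = nx.gnp_random_graph(n_genes, avg_degree / (n_genes - 1), seed=0, directed=True)
true_edges = set(g.edges())

# 2) Simulate temporal expression data with a toy linear dynamics x(t+1) = W x(t) + noise.
weights = np.zeros((n_genes, n_genes))
for src, dst in true_edges:
    weights[dst, src] = rng.uniform(-1.0, 1.0)
weights /= max(1.0, np.max(np.abs(np.linalg.eigvals(weights))))  # keep the dynamics stable
expr = np.zeros((timesteps, n_genes))
expr[0] = rng.normal(size=n_genes)
for t in range(1, timesteps):
    expr[t] = weights @ expr[t - 1] + 0.1 * rng.normal(size=n_genes)

# 3) Naive inference: for each target gene, keep the k predictors whose lagged
#    expression correlates most strongly with the target (a stand-in for the
#    feature-selection criterion actually used in the work).
k_predictors = avg_degree
inferred_edges = set()
for target in range(n_genes):
    scores = [abs(np.corrcoef(expr[:-1, src], expr[1:, target])[0, 1])
              for src in range(n_genes)]
    scores[target] = -1.0  # exclude self-loops
    for src in np.argsort(scores)[-k_predictors:]:
        inferred_edges.add((int(src), target))

# 4) Validation: compare the inferred edges with the original AGN.
tp = len(inferred_edges & true_edges)
precision = tp / len(inferred_edges)
recall = tp / len(true_edges)
print(f"precision={precision:.2f} recall={recall:.2f}")
```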
Abstract:
This paper analyses an optical network architecture composed of nodes equipped with multi-granular optical cross-connects (MG-OXCs) in addition to the usual optical cross-connects (OXCs). Selected network nodes can therefore perform both waveband and traffic grooming operations, and our goal is to assess the improvement in network performance brought by these additional capabilities. Specifically, the influence of MG-OXC multi-granularity on the blocking probability is evaluated for 16 classes of service over a network based on the NSFNet topology. A bandwidth-capacity fairness mechanism is also added to the connection admission control to manage the blocking probabilities of all kinds of bandwidth requirements. Comprehensive computational simulations are carried out to compare eight distinct node architectures, showing that an adequate combination of waveband and single-wavelength ports in the MG-OXCs and OXCs allows a more efficient operation of a WDM optical network carrying multi-rate traffic.
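For readers unfamiliar with how blocking probabilities are typically estimated in such studies, the sketch below runs a minimal Monte Carlo simulation of a single WDM link offered Poisson connection requests (an Erlang-loss toy, not the paper's NSFNet multi-granular simulator); the arrival rate, holding time and wavelength counts are assumed values.

```python
import random

def blocking_probability(n_wavelengths=8, arrival_rate=5.0, mean_holding=1.0,
                         n_requests=200_000, seed=1):
    """Estimate the blocking probability of a single WDM link.

    Requests arrive as a Poisson process; each occupies one wavelength for an
    exponentially distributed holding time and is blocked if all wavelengths
    are busy (Erlang loss model).
    """
    rng = random.Random(seed)
    clock, blocked = 0.0, 0
    release_times = []  # departure instants of connections currently in progress
    for _ in range(n_requests):
        clock += rng.expovariate(arrival_rate)                    # next arrival
        release_times = [t for t in release_times if t > clock]   # free finished calls
        if len(release_times) >= n_wavelengths:
            blocked += 1                                          # no free wavelength: block
        else:
            release_times.append(clock + rng.expovariate(1.0 / mean_holding))
    return blocked / n_requests

if __name__ == "__main__":
    for w in (4, 8, 16):
        print(f"{w} wavelengths -> blocking ~ {blocking_probability(n_wavelengths=w):.4f}")
```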
Abstract:
Inverse analysis is currently an important subject of study in several fields of science and engineering. The identification of physical and geometric parameters from experimental measurements is required in many applications. In this work, a boundary element formulation is proposed to identify boundary and interface values as well as material properties. In particular, the formulation is dedicated to identifying material parameters when a cohesive crack model is assumed for 2D problems. A computer code is developed and implemented using the BEM multi-region technique and regularisation methods to perform the inverse analysis. Several examples are shown to demonstrate the efficiency of the proposed model.
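Inverse problems of this kind are typically ill-conditioned, which is why regularisation is mentioned above. As a generic illustration (not the paper's BEM code), the following sketch recovers parameters from noisy indirect measurements by Tikhonov-regularised least squares; the system matrix and noise level are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Forward model y = A @ x: A maps the unknown parameters to measurable quantities.
# Here A is a deliberately ill-conditioned random matrix standing in for the
# boundary-element influence matrix of the real problem.
n_params, n_meas = 20, 40
A = rng.normal(size=(n_meas, n_params)) @ np.diag(np.logspace(0, -6, n_params))
x_true = rng.normal(size=n_params)
y_obs = A @ x_true + 1e-4 * rng.normal(size=n_meas)   # noisy "experimental" data

def tikhonov(A, y, alpha):
    """Solve min ||A x - y||^2 + alpha ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

x_naive = np.linalg.lstsq(A, y_obs, rcond=None)[0]     # unregularised solution
x_reg = tikhonov(A, y_obs, alpha=1e-6)                 # regularised solution

print("error without regularisation:", np.linalg.norm(x_naive - x_true))
print("error with regularisation:   ", np.linalg.norm(x_reg - x_true))
```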
Abstract:
Leakage reduction in water supply systems and distribution networks has become an increasingly important issue in the water industry, since leaks and ruptures result in major physical and economic losses. Hydraulic transient solvers can be used in system operational diagnosis, namely for leak detection purposes, due to their capability to describe the dynamic behaviour of the systems and to provide substantial amounts of data. In this research work, the association of hydraulic transient analysis with an optimisation model, through inverse transient analysis (ITA), has been used for leak detection and location in an experimental facility containing PVC pipes. Observed transient pressure data have been used for testing ITA. A key factor for the success of the leak detection technique is the accurate calibration of the transient solver, namely adequate boundary conditions and the description of energy dissipation effects, since PVC pipes are characterised by a viscoelastic mechanical response. Results have shown that leaks were located with an accuracy of between 4% and 15% of the total length of the pipeline, depending on the discretisation of the system model.
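ITA couples a hydraulic forward solver with an optimiser that adjusts the leak parameters until simulated and observed pressures match. The sketch below shows only that inverse-fitting structure: the forward model is a crude steady-state head-loss model (not a viscoelastic transient solver), and the pipeline data, sensor positions and "observed" pressures are all fabricated for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model: a pipeline of length L fed from a reservoir with head H0,
# delivering a demand Q_D at the downstream end, with a leak of unknown outflow
# q at unknown position x_leak.  Head loss is taken as proportional to flow^2
# per unit length -- a crude stand-in for the transient solver used in real ITA.
L, H0, Q_D, R = 100.0, 30.0, 0.01, 500.0     # assumed pipeline data
SENSORS = np.array([20.0, 50.0, 80.0])        # assumed sensor positions [m]

def heads_at_sensors(x_leak, q_leak):
    upstream_flow = Q_D + q_leak
    heads = []
    for x in SENSORS:
        if x <= x_leak:
            h = H0 - R * upstream_flow**2 * x
        else:
            h = (H0 - R * upstream_flow**2 * x_leak
                 - R * Q_D**2 * (x - x_leak))
        heads.append(h)
    return np.array(heads)

# Synthetic "observations": leak of 0.002 m^3/s at x = 63 m, plus sensor noise.
rng = np.random.default_rng(7)
observed = heads_at_sensors(63.0, 0.002) + rng.normal(0.0, 0.01, SENSORS.size)

# Inverse problem: find (x_leak, q_leak) minimising the pressure misfit.
def residuals(params):
    x_leak, q_leak = params
    return heads_at_sensors(x_leak, q_leak) - observed

fit = least_squares(residuals, x0=[50.0, 0.001],
                    bounds=([0.0, 0.0], [L, 0.01]),
                    x_scale=[10.0, 0.001])    # parameters live on very different scales
print("estimated leak position [m]:", fit.x[0])
print("estimated leak outflow [m3/s]:", fit.x[1])
```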
Abstract:
Swallowing dynamics involves the coordination and interaction of several muscles and nerves which allow correct food transport from mouth to stomach without laryngotracheal penetration or aspiration. Clinical swallowing assessment depends on the evaluator's knowledge of the anatomic structures and neurophysiological processes involved in swallowing. Any alteration in those steps is termed oropharyngeal dysphagia, which may have many causes, such as neurological or mechanical disorders. Videofluoroscopy of swallowing is presently considered the best exam to objectively assess the dynamics of swallowing, but it must be conducted under certain restrictions due to the patient's exposure to radiation, which limits periodical repetition for monitoring swallowing therapy. Another method, cervical auscultation, is a promising new diagnostic tool for the assessment of swallowing disorders. The potential to diagnose dysphagia in a noninvasive manner by assessing the sounds of swallowing is a highly attractive option for the dysphagia clinician. Even so, the captured sound contains a certain amount of noise, which can hamper the evaluator's decision. Therefore, the present paper proposes the use of a filter to improve the quality of the audible sound and facilitate the interpretation of the examination. A wavelet denoising approach is used to decompose the noisy signal. The signal-to-noise ratio was evaluated to demonstrate the quantitative results of the proposed methodology.
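A minimal wavelet-denoising pipeline of the kind described above can be put together with the PyWavelets package; the sketch below decomposes a noisy test signal, soft-thresholds the detail coefficients with the universal threshold, reconstructs the signal, and reports the SNR gain. The wavelet family, decomposition level and test signal are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=5):
    """Soft-threshold wavelet denoising with the universal (VisuShrink) threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

def snr_db(clean, noisy):
    """Signal-to-noise ratio in decibels."""
    noise = noisy - clean
    return 10.0 * np.log10(np.sum(clean**2) / np.sum(noise**2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 4096)
    clean = np.sin(2 * np.pi * 40 * t) * np.exp(-5 * t)   # stand-in for a swallowing sound burst
    noisy = clean + 0.2 * rng.normal(size=t.size)
    restored = wavelet_denoise(noisy)
    print(f"SNR before: {snr_db(clean, noisy):.1f} dB, after: {snr_db(clean, restored):.1f} dB")
```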
Abstract:
One of the goals of e-learning environments is to meet the individual needs of students during the learning process. The adaptation of contents, activities and tools to different visualizations or to a variety of content types is an important feature of such environments, giving users the sensation that the system offers workplaces suited to their profiles. Nevertheless, to achieve an efficient personalization process it is important to investigate aspects of student behaviour, considering the context in which the interaction happens. The goal of this paper is to present an approach to identify the student learning profile by analyzing the context of interaction. Moreover, analyzing the learning profile along different dimensions allows the system to deal with different focuses of learning.
Abstract:
In this paper, a comparative analysis of the long-term electric power forecasting methodologies used in some South American countries is presented. The purpose of this study is to compare these methodologies, observe whether they have similarities, and examine the behavior of the results when they are applied to the Brazilian electric market. The power forecasts were performed for the four main consumption classes (residential, industrial, commercial and rural), which are responsible for approximately 90% of the national consumption. The tool used in this analysis was the SAS(c) program. The outcome of this study allowed the identification of various methodological similarities, mainly those related to the econometric variables used by these methods, a fact which strongly conditioned the comparative results obtained.
Abstract:
Most post-processors for boundary element (BE) analysis use an auxiliary domain mesh to display domain results, working against the advantageous modelling process of a pure boundary discretization. This paper introduces a novel visualization technique which preserves the basic properties of boundary element methods. The proposed algorithm does not require any domain discretization and is based on the direct and automatic identification of isolines. Another critical aspect of the visualization of domain results in BE analysis is the effort required to evaluate results at interior points. In order to tackle this issue, the present article also provides a comparison between the performance of two different BE formulations (conventional and hybrid). In addition, this paper presents an overview of the most common post-processing and visualization techniques in BE analysis, such as the classical scan-line algorithms and interpolation over a domain discretization. The results presented herein show that the proposed algorithm offers very high performance compared with other visualization procedures.
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion about alternatives for the evolution of this architecture, and within this study area the present work presents the Title Model, which aims to support application needs through the use of a cross-layer ontology and horizontal addressing in a next-generation Internet. From a practical viewpoint, the reduction in network cost is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the enhancement provided by the Title Model, a network analysis is presented for a message passing interface application that sends a vector of integers and returns its sum. This analysis confirms that the current proposal allows, in this environment, a reduction of 15.23% in the total network traffic, in bytes.
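The benchmark mentioned above (sending an integer vector and returning its sum) corresponds to a textbook MPI scatter/reduce pattern. The sketch below reproduces that pattern with mpi4py over the conventional TCP/IP stack; it is only the reference workload, not the Title Model itself, and the vector size is an assumed value.

```python
# Run with e.g.:  mpiexec -n 4 python vector_sum.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 1_000_000                                  # assumed vector length (must divide evenly by `size`)
chunk = np.empty(n // size, dtype="i")

if rank == 0:
    vector = np.arange(n, dtype="i")           # the vector of integers to be summed
else:
    vector = None

comm.Scatter(vector, chunk, root=0)            # distribute equal slices to all ranks
local_sum = np.array([chunk.sum(dtype="int64")], dtype="int64")
total = np.zeros(1, dtype="int64")
comm.Reduce(local_sum, total, op=MPI.SUM, root=0)   # gather the partial sums at rank 0

if rank == 0:
    print("sum =", int(total[0]))
```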
Abstract:
This work presents a method for predicting resource availability in opportunistic grids by means of use pattern analysis (UPA), a technique based on non-supervised learning methods. The prediction method is based on the assumption that there exist several classes of computational resource use patterns, which can be used to predict resource availability. Trace-driven simulations validate this basic assumption and also provide the parameter settings for the accurate learning of resource use patterns. Experiments made with an implementation of the UPA method show the feasibility of its use in the scheduling of grid tasks with very little overhead. The experiments also demonstrate the method's superiority over other predictive and non-predictive methods. An adaptive prediction method is suggested to deal with the lack of training data at initialization. Further adaptive behaviour is motivated by experiments which show that, in some special environments, reliable resource use patterns may not always be detected.
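Non-supervised learning of use patterns as described above is commonly done by clustering daily availability traces; the sketch below clusters synthetic hourly availability profiles with k-means and predicts a machine's availability from the centroid of its cluster. The trace generator, number of clusters and prediction rule are illustrative assumptions, not the UPA implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic daily availability traces (fraction of each hour the resource was idle):
# an "office" pattern (busy 9h-18h) and a "night-batch" pattern (busy overnight).
hours = np.arange(24)
office = np.clip(1.0 - ((hours >= 9) & (hours < 18)) * 0.8, 0.0, 1.0)
batch = np.clip(1.0 - ((hours < 6) | (hours >= 22)) * 0.7, 0.0, 1.0)
traces = np.vstack([p + 0.05 * rng.normal(size=24)
                    for p in [office] * 60 + [batch] * 40]).clip(0.0, 1.0)

# Learn the use-pattern classes without supervision.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(traces)

# Predict the availability of a newly observed machine from its cluster centroid.
new_trace = (office + 0.05 * rng.normal(size=24)).clip(0.0, 1.0)
cluster = kmeans.predict(new_trace.reshape(1, -1))[0]
predicted_profile = kmeans.cluster_centers_[cluster]
print("predicted availability at 10h and 23h:",
      round(predicted_profile[10], 2), round(predicted_profile[23], 2))
```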
Abstract:
Background: The presence of the periodontal ligament (PDL) makes it possible to absorb and distribute loads produced during masticatory function and other tooth contacts into the alveolar process via the alveolar bone proper. However, several factors affect the integrity of periodontal structures, causing the destruction of the connective matrix and cells, the loss of fibrous attachment, and the resorption of alveolar bone. Methods: The purpose of this study was to evaluate, by finite element analysis, the stress distribution in the PDL in three-dimensional models of the upper central incisor under three different load conditions: 100 N occlusal loading at 45 degrees (model 1: masticatory load); 500 N at the incisal edge at 45 degrees (model 2: parafunctional habit); and 800 N at the buccal surface at 90 degrees (model 3: trauma case). The models were built from computed tomography scans. Results: The stress distribution was quite different among the models. The most significant (harmful) tensile and compressive stress values were observed in models 2 and 3, with similarly distinct patterns of stress distribution along the PDL. Tensile stresses were observed along the internal and external aspects of the PDL, mostly at the cervical and middle thirds. Conclusions: The stresses generated in these models may affect the integrity of periodontal structures. A better understanding of the biomechanical behavior of the PDL under physiologic and traumatic loading conditions might enhance the understanding of the biologic reaction of the PDL in health and disease. J Periodontol 2009;80:1859-1867.
Abstract:
This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and also present some ways to perform global influence analysis. Moreover, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set using log-modified Weibull regression models. A diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models.
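To make the estimation step concrete, the sketch below fits the three-parameter modified Weibull distribution (assuming the common parameterisation with survival function S(t) = exp(-a t^b e^{lambda t})) to right-censored synthetic data by maximum likelihood with scipy; the regression structure, jackknife and influence diagnostics of the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# Modified Weibull (assumed parameterisation): S(t) = exp(-a * t**b * exp(lam * t)),
# hazard h(t) = a * (b + lam * t) * t**(b - 1) * exp(lam * t),  with a > 0, b > 0, lam >= 0.

def neg_log_likelihood(theta, t, delta):
    """Negative log-likelihood for right-censored data.

    theta holds log(a), log(b) and log(lam) so the optimisation is unconstrained;
    delta is 1 for an observed failure and 0 for a censored time.
    """
    a, b, lam = np.exp(theta)
    cum_hazard = a * t**b * np.exp(lam * t)
    log_hazard = np.log(a) + np.log(b + lam * t) + (b - 1.0) * np.log(t) + lam * t
    return -(np.sum(delta * log_hazard) - np.sum(cum_hazard))

# Synthetic data: failure times drawn by inverting the survival function numerically,
# with administrative censoring at t = 2.0 (all values are illustrative).
rng = np.random.default_rng(5)
a_true, b_true, lam_true, n = 0.5, 0.8, 0.6, 500
u = rng.uniform(size=n)
grid = np.linspace(1e-6, 20.0, 20000)
surv = np.exp(-a_true * grid**b_true * np.exp(lam_true * grid))
times = np.interp(u, surv[::-1], grid[::-1])    # S(t) is decreasing, so reverse for interp
censor = 2.0
delta = (times <= censor).astype(float)
t_obs = np.minimum(times, censor)

fit = minimize(neg_log_likelihood, x0=np.log([1.0, 1.0, 0.1]),
               args=(t_obs, delta), method="Nelder-Mead")
print("estimated (a, b, lambda):", np.round(np.exp(fit.x), 3))
```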
Abstract:
In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, with the times grouped into k intervals so that ties are eliminated. Thus, the data are modeled using discrete lifetime regression models. The model parameters are estimated using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, referred to as global influence, and measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to those measures, the local influence and the total influential estimate are also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed using the proposed regression models.
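A common way to fit grouped (life-table) survival data with a chosen link is to expand each subject into one record per interval survived and maximise a Bernoulli likelihood for the interval hazard. The sketch below does this for a complementary log-log link (one typical choice for grouped data, not necessarily one of the four links compared in the paper) using synthetic data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)

# Synthetic grouped survival data: the interval 1..K in which each subject failed
# or was censored, one binary covariate, and an event indicator.
n, K = 400, 5
x = rng.binomial(1, 0.5, size=n).astype(float)
base = np.array([0.10, 0.15, 0.20, 0.25, 0.30])           # true per-interval hazards for x = 0
hazard = 1.0 - (1.0 - base) ** np.exp(0.7 * x[:, None])    # covariate raises the hazard
u = rng.uniform(size=(n, K))
fail = u < hazard
interval = np.where(fail.any(axis=1), fail.argmax(axis=1) + 1, K)
event = fail.any(axis=1).astype(float)                     # 0 = censored at the last interval

# Person-period expansion: one row per interval each subject was at risk in.
rows_j, rows_x, rows_y = [], [], []
for i in range(n):
    for j in range(1, interval[i] + 1):
        rows_j.append(j)
        rows_x.append(x[i])
        rows_y.append(1.0 if (j == interval[i] and event[i] == 1.0) else 0.0)
J = np.eye(K)[np.array(rows_j) - 1]                        # interval dummy variables
X = np.array(rows_x)
y = np.array(rows_y)

def neg_log_lik(params):
    """Discrete-time hazard with complementary log-log link:
    h_ij = 1 - exp(-exp(alpha_j + beta * x_i))."""
    alpha, beta = params[:K], params[K]
    eta = J @ alpha + beta * X
    h = np.clip(1.0 - np.exp(-np.exp(eta)), 1e-12, 1.0 - 1e-12)
    return -np.sum(y * np.log(h) + (1.0 - y) * np.log(1.0 - h))

fit = minimize(neg_log_lik, x0=np.zeros(K + 1), method="BFGS")
print("interval effects alpha:", np.round(fit.x[:K], 2))
print("covariate effect beta: ", round(fit.x[K], 2))
```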
Abstract:
Symptoms resembling giant calyx, a graft-transmissible disease, were observed on 1-5% of eggplant (aubergine; Solanum melongena L.) plants in production fields in Sao Paulo state, Brazil. Phytoplasmas were detected in 12 of 12 samples from symptomatic plants that were analysed by a nested PCR assay employing 16S rRNA gene primers R16mF2/R16mR1 followed by R16F2n/R16R2. RFLP analysis of the resulting rRNA gene products (1.2 kb) indicated that all plants contained similar phytoplasmas, each closely resembling strains previously classified as members of RFLP group 16SrIII (X-disease group). Virtual RFLP and phylogenetic analyses of sequences derived from the PCR products identified the phytoplasmas infecting eggplant crops grown in Piracicaba as a lineage of subgroup 16SrIII-J, whereas phytoplasmas detected in plants grown in Braganca Paulista were tentatively classified as members of a novel subgroup, 16SrIII-U. These findings confirm eggplant as a new host of group 16SrIII-J phytoplasmas and extend the known diversity of strains belonging to this group in Brazil.
Abstract:
In this preliminary study, eighteen p-substituted benzoic acid [(5-nitro-thiophen-2-yl)-methylene]-hydrazides with antimicrobial activity were evaluated against multidrug-resistant Staphylococcus aureus, correlating the three-dimensional characteristics of the ligands with their respective bioactivities. The computer programs Sybyl and CORINA were used, respectively, for the design and three-dimensional conversion of the ligands. Molecular interaction fields were calculated using the GRID program. Calculations using Volsurf resulted in a statistically consistent model with 48 structural descriptors, showing that hydrophobicity is a fundamental property for the analyzed biological response.
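Descriptor-based 3D-QSAR models of this kind are commonly fitted by partial least squares (PLS) regression; the sketch below shows such a fit with scikit-learn on a synthetic descriptor matrix standing in for the 48 Volsurf descriptors (the compound count, descriptor values and activities are all fabricated for illustration).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(8)

# Hypothetical data set: 18 compounds x 48 molecular-interaction-field descriptors,
# with the activity driven mainly by a "hydrophobicity-like" descriptor (column 0).
n_compounds, n_descriptors = 18, 48
X = rng.normal(size=(n_compounds, n_descriptors))
y = 1.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * rng.normal(size=n_compounds)

# PLS model with a small number of latent variables, validated by leave-one-out.
pls = PLSRegression(n_components=2)
y_pred = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
q2 = 1.0 - np.sum((y - y_pred) ** 2) / np.sum((y - y.mean()) ** 2)

pls.fit(X, y)
loadings = np.abs(pls.x_weights_[:, 0])          # weight of each descriptor on the first latent variable
print(f"leave-one-out Q2 = {q2:.2f}")
print("most influential descriptor index:", int(np.argmax(loadings)))
```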