887 results for Computer forensic analysis
Abstract:
In this paper we address the "skull-stripping" problem in 3D MR images. We propose a new method that employs an efficient and unique histogram analysis. A fundamental component of this analysis is an algorithm for partitioning a histogram based on the position of the maximum deviation from a Gaussian fit. In our experiments we use a comprehensive image database, including both synthetic and real MRI, and compare our method with two other well-known methods, namely BSE and BET. For all datasets we achieved superior results. Our method is also highly independent of parameter tuning and very robust across considerable variations in noise ratio.
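A minimal sketch of the histogram-partitioning idea described in this abstract, assuming a 1-D intensity histogram; the moment-based Gaussian fit, bin count, and synthetic data are illustrative choices, not the authors' exact algorithm.

```python
# Minimal sketch (not the authors' exact algorithm): fit a Gaussian to a 1-D
# intensity histogram and split it at the bin of maximum deviation from the fit.
import numpy as np

def partition_histogram(intensities, bins=256):
    counts, edges = np.histogram(intensities, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Moment-based Gaussian fit: mean and std weighted by the bin counts.
    mean = np.average(centers, weights=counts)
    std = np.sqrt(np.average((centers - mean) ** 2, weights=counts))
    expected = counts.sum() * np.diff(edges) * (
        np.exp(-0.5 * ((centers - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))
    )
    # Partition point: bin where the histogram deviates most from the Gaussian fit.
    split_bin = int(np.argmax(np.abs(counts - expected)))
    return centers[split_bin]

# Synthetic example: a bright "brain-like" mode plus a darker background mode.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(100, 10, 50000), rng.normal(30, 5, 20000)])
print(partition_histogram(data))
```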
Abstract:
The aim of the present study was to determine clinical parameters for the use of the Er,Cr:YSGG laser in the treatment of dentine hypersensitivity. Two antagonist areas were defined as control and experimental irradiation areas in 90 premolar roots. Each surface was conditioned with 24% EDTA (sub-group 1) or 35% phosphoric acid (sub-group 2) and irradiated with the following settings: group 1) Er:YAG, 60 mJ, 2 Hz, defocused; groups 2 to 9) Er,Cr:YSGG laser, 20 Hz, Z6 tip, 0% air and water, at 2) 0.25 W; 3) 0.5 W; 4) 0.75 W; 5) 1.0 W; 6) 1.25 W; 7) 1.50 W; 8) 2 W; 9) 2 W. After irradiation, samples were immersed in methylene blue solution and embedded in epoxy resin to obtain longitudinal cuts. The images were digitized and analyzed with computer software. In sub-group 1, although the samples irradiated with the Er:YAG laser showed less microleakage, there were differences between the groups, with the Er:YAG group differing statistically from groups 3, 6, and 9. In sub-group 2, the mean values of the Er:YAG samples showed a negative trend; however, no differences were detected between the groups. For scanning electron microscopy analysis, dentine squares were obtained and prepared to evaluate the surface morphology. Partial closure of dentinal tubules was observed after irradiation with the Er:YAG laser and with the Er,Cr:YSGG laser in the 0.25 and 0.50 W protocols. As the energy densities rose, open dentinal tubules, carbonization and cracks were observed. It can be concluded that none of the parameters was capable of eliminating microleakage; however, clinical studies with Er:YAG and Er,Cr:YSGG lasers should be conducted with the lowest-power protocols in order to determine the most satisfactory setting for dentine hypersensitivity.
Abstract:
The aims of this study were to investigate work conditions, to estimate the prevalence and to describe risk factors associated with Computer Vision Syndrome among operators of two call centers in Sao Paulo (n = 476). The methods included a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms reported as always, often or sometimes. The multiple logistic regression model was built using the stepwise forward likelihood method, retaining the variables with significance levels below 5% (p < 0.05). The operators were mainly female and young (from 15 to 24 years old). The call center operated 24 hours a day, and the operators' weekly workload was 36 hours, with break time from 21 to 35 minutes per day. The symptoms reported were eye fatigue (73.9%), "weight" in the eyes (68.2%), "burning" eyes (54.6%), tearing (43.9%) and weakening of vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. The associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). Work organization and psychosocial factors should be included in programs for the prevention of Computer Vision Syndrome among call center operators.
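A hedged sketch of how odds ratios and 95% confidence intervals of the kind reported above can be obtained from a logistic regression; the data are synthetic and statsmodels is an assumed tool choice, not the one used in the study.

```python
# Synthetic illustration of odds ratios and 95% CIs from a logistic regression;
# the variables and data are made up, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 476
female = rng.integers(0, 2, n)
high_demand = rng.integers(0, 2, n)
logit = -0.5 + 0.95 * female + 0.1 * high_demand        # assumed "true" model
cvs_case = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # Computer Vision Syndrome yes/no

X = sm.add_constant(np.column_stack([female, high_demand]))
fit = sm.Logit(cvs_case, X).fit(disp=False)

odds_ratios = np.exp(fit.params)     # exponentiated coefficients
conf_int = np.exp(fit.conf_int())    # 95% confidence intervals on the OR scale
print(odds_ratios)
print(conf_int)
```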
Abstract:
This qualitative, exploratory, descriptive study was performed with the objective of understanding the perception of nurses working in the medical-surgical units of a university hospital regarding the strategies developed for a pilot test of the PROCEnf-USP electronic system, whose purpose is to computerize clinical nursing documentation. Eleven nurses from a theoretical-practical training program were interviewed, and the data obtained were analyzed using the Content Analysis Technique. The following categories were discussed based on the frameworks of participative management and planned change: favorable aspects for the implementation; unfavorable aspects for the implementation; and expectations regarding the implementation. According to the nurses' perceptions, the preliminary use of the electronic system allowed them to show their potential and to propose improvements, encouraging them to become partners of the group manager in disseminating the system to other nurses of the institution.
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC(max) algorithm runs in linear time with respect to the variable M=|C|+|Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M)=O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1,∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, referred to often as the Graph Cut algorithm. We notice that a minimization problem for ‖F_P‖_q, q ∈ [1,∞), is identical to that for ‖F_P‖_1, when the original weight function w is replaced by w^q. Thus, any algorithm GC(sum) solving the ‖F_P‖_1 minimization problem solves also one for ‖F_P‖_q with q ∈ [1,∞), so just two algorithms, GC(sum) and GC(max), are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖_∞-minimization problem (the identity ‖F_P‖_∞ = lim_{q→∞} ‖F_P‖_q alone is not enough to deduce that). An experimental comparison of the performance of GC(max) and GC(sum) algorithms is included. This concentrates on comparing the actual (as opposed to provable worst scenario) algorithms' running time, as well as the influence of the choice of the seeds on the output.
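A small numeric illustration of the energy functions discussed in this abstract: the ℓ_1 energy is the sum of boundary-edge weights, the ℓ_q energy is the q-norm, and the ℓ∞ energy is the maximum weight; minimizing the q-norm with weights w is equivalent to minimizing the 1-norm with weights w^q. The example values are arbitrary.

```python
# Numeric illustration with arbitrary boundary-edge weights of an object P.
import numpy as np

w = np.array([3.0, 1.0, 2.0, 5.0])   # weights w(e) on the boundary of P

def energy(weights, q):
    return np.sum(weights ** q) ** (1.0 / q)

print(energy(w, 1))      # ||F_P||_1: classic Graph Cut (min-cut/max-flow) energy
print(energy(w, 4))      # an intermediate q-norm energy
print(energy(w, 50))     # already close to the maximum for large q
print(w.max())           # ||F_P||_inf: the GC(max) / IRFC energy
```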
Abstract:
Electrochemical lead analyses of gunshot residues (GSRs) were performed using an acidic solution with a bare gold microelectrode in the presence of chloride ions. GSRs from four different guns (0.38 in. revolver, 12 caliber pump-action shotgun, 0.38 repeating rifle, and a 0.22 caliber semi-automatic rifle) and six different types of ammunition (CleanRange®, normal, semi-jacketed, especial 24g®, 3T®, CBC®, and Eley®) were analyzed. Results obtained with the proposed methodology were compared with those from an atomic absorption spectrometry analysis, and a paired Student's t-test indicated that there was no significant difference between them at the 95% confidence level. With this methodology, a detection limit of 1.7 nmol L-1 (3 sigma/slope), a linear range between 10 and 100 nmol L-1, and a relative standard deviation of 2.5% from 10 measurements were obtained. © 2011 Elsevier B.V. All rights reserved.
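A small sketch of the figure-of-merit arithmetic mentioned above, computing the detection limit as 3*sigma/slope with the slope taken from a linear calibration fit; the calibration points and blank standard deviation are made-up values, not the paper's measurements.

```python
# Illustrative calibration data (not the paper's measurements).
import numpy as np

conc = np.array([10.0, 25.0, 50.0, 75.0, 100.0])                  # nmol L-1
signal = 0.8 * conc + np.random.default_rng(2).normal(0, 0.5, 5)  # peak current (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration fit
sigma_blank = 0.45                               # assumed std. dev. of the blank signal
lod = 3 * sigma_blank / slope                    # detection limit = 3*sigma/slope
print(f"detection limit ~ {lod:.2f} nmol L-1")
```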
Abstract:
The Primary Care Information System (SIAB) concentrates basic healthcare information from all regions of Brazil. The information is collected by primary care teams through a paper-based procedure that degrades the quality of the information provided to healthcare authorities and slows down the decision-making process. To overcome these problems, we propose a new data-gathering application that uses a mobile device connected to a 3G network and equipped with GPS, to be used by the primary care teams for collecting the families' data. A prototype was developed in which a digital version of one SIAB form is made available on the mobile device. The prototype was tested in a basic healthcare unit located in a suburb of Sao Paulo. The results obtained so far have shown that the proposed process is a better alternative for data collection in primary care, both in terms of data quality and of shorter delivery time to healthcare authorities.
Abstract:
Background: Although the molecular pathogenesis of pituitary adenomas has been assessed by several different techniques, it still remains partially unclear. Ribosomal proteins (RPs) have recently been related to human tumorigenesis, but they have not yet been evaluated in pituitary tumorigenesis. Objective: The aim of this study was to introduce serial analysis of gene expression (SAGE), a high-throughput method, into pituitary research in order to compare differential gene expression. Methods: Two SAGE cDNA libraries were constructed, one using a pool of mRNA obtained from five GH-secreting pituitary tumors and another from three normal pituitaries. Genes differentially expressed between the libraries were further validated by real-time PCR in 22 GH-secreting pituitary tumors and in 15 normal pituitaries. Results: Computer-generated genomic analysis tools identified 13,722 and 14,993 exclusive genes in the normal and adenoma libraries respectively. The libraries shared 6497 genes, of which 2188 were underexpressed and 4309 overexpressed in the tumoral library. In the adenoma library, 33 genes encoding RPs were underexpressed. Among these, RPSA, RPS3, RPS14, and RPS29 were validated by real-time PCR. Conclusion: We report the first SAGE libraries from normal pituitary tissue and GH-secreting pituitary tumor, which provide a quantitative assessment of the cellular transcriptome. We also validated some downregulated genes encoding RPs. Altogether, the present data suggest that the underexpression of the studied RP genes possibly collaborates, directly or indirectly, with other genes to modify cell cycle arrest, DNA repair, and apoptosis, leading to an environment that might have a putative role in tumorigenesis, introducing new perspectives for further studies on the molecular genesis of somatotrophinomas.
Abstract:
Facial reconstruction is a method that seeks to recreate a person's facial appearance from his/her skull. This technique can be the last resource used in a forensic investigation, when identification techniques such as DNA analysis, dental records, fingerprints and radiographic comparison cannot be used to identify a body or skeletal remains. To perform facial reconstruction, data on facial soft tissue thickness are necessary. The scientific literature has described differences in the thickness of facial soft tissue between ethnic groups, and several databases of soft tissue thickness have been published. There are no literature records of facial reconstructions carried out with soft tissue data obtained from samples of Brazilian subjects, nor are there reports of digital forensic facial reconstruction performed in Brazil. Two databases of soft tissue thickness have been published for the Brazilian population: one obtained from measurements performed on fresh cadavers (fresh cadavers' pattern), and another from measurements using magnetic resonance imaging (Magnetic Resonance pattern). This study aims to perform three different characterized digital forensic facial reconstructions (with hair, eyelashes and eyebrows) of a Brazilian subject, based on an international pattern and two Brazilian patterns for soft facial tissue thickness, and to evaluate the digital forensic facial reconstructions by comparing them to photos of the individual and of nine other subjects. The DICOM data of the Computed Tomography (CT) donated by a volunteer were converted into stereolithography (STL) files and used for the creation of the digital facial reconstructions. Once the three reconstructions were performed, they were compared to photographs of the subject whose face had been reconstructed and of nine other subjects. Thirty examiners participated in this recognition process. The target subject was recognized by 26.67% of the examiners in the reconstruction performed with the Brazilian Magnetic Resonance pattern, by 23.33% in the reconstruction performed with the Brazilian Fresh Cadavers pattern and by 20.00% in the reconstruction performed with the International pattern; the target subject was the most recognized subject in the first two patterns. The rate of correct recognitions of the target subject indicates that digital forensic facial reconstruction, conducted with the parameters used in this study, may be a useful tool. © 2011 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Dimensionality reduction is employed for visual data analysis as a way of obtaining reduced spaces for high-dimensional data or of mapping data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach to handling both dimensionality reduction and visualization of high-dimensional data, taking the user's input into account. It employs Partial Least Squares (PLS), a statistical tool, to retrieve latent spaces that focus on the discriminability of the data. The method employs a training set to build a highly precise model that can then be applied very effectively to a much larger data set. The reduced data set can be exhibited using various existing visualization techniques. The training data are important for encoding the user's knowledge into the loop. However, this work also devises a strategy for calculating PLS reduced spaces when no training data are available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and is capable of working with small and unbalanced training sets.
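A hedged sketch of the general workflow described above, using scikit-learn's PLSRegression as a stand-in for the authors' implementation: a small labeled training set drives a 2-D latent space, which is then used to project a much larger data set for visualization.

```python
# Sketch with synthetic data; PLSRegression stands in for the authors' PLS step.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X_train = rng.normal(size=(100, 50))                      # small labeled training set
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(float)
X_full = rng.normal(size=(10000, 50))                     # much larger data set to map

pls = PLSRegression(n_components=2)
pls.fit(X_train, y_train)             # latent space driven by class discriminability
coords_2d = pls.transform(X_full)     # 2-D coordinates to feed a visualization technique
print(coords_2d.shape)                # (10000, 2)
```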
Abstract:
Xylanases (EC 3.2.1.8, endo-1,4-glycosyl hydrolases) catalyze the hydrolysis of xylan, an abundant hemicellulose of plant cell walls. Access to the catalytic site of GH11 xylanases is regulated by movement of a short beta-hairpin, the so-called thumb region, which can adopt open or closed conformations. A crystallographic study has shown that the D11F/R122D mutant of the GH11 xylanase A from Bacillus subtilis (BsXA) displays a stable "open" conformation, and here we report a molecular dynamics simulation study comparing this mutant with the native enzyme over a range of temperatures. The mutant open conformation was stable at 300 and 328 K; however, it showed a transition to the closed state at 338 K. Analysis of dihedral angles identified thumb region residues Y113 and T123 as key hinge points which determine the open-closed transition at 338 K. Although the D11F/R122D mutations result in a reduction in local inter- and intramolecular hydrogen bonding, the global energies of the open and closed conformations in the native enzyme are equivalent, suggesting that the two conformations are equally accessible. These results indicate that the thumb region shows a broader degree of energetically permissible conformations which regulate access to the active site region. The R122D mutation contributes to the stability of the open conformation, but is not essential for thumb dynamics, i.e., the wild-type enzyme can also adopt the open conformation.
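For readers unfamiliar with the dihedral-angle analysis mentioned above, a minimal sketch of how a torsion angle is computed from four atomic positions; the coordinates are made up and the routine is generic, not tied to the simulation described.

```python
# Generic torsion-angle routine; coordinates are made up, not from the simulation.
import numpy as np

def dihedral(p0, p1, p2, p3):
    b1, b2, b3 = p1 - p0, p2 - p1, p3 - p2
    n1, n2 = np.cross(b1, b2), np.cross(b2, b3)     # normals of the two planes
    m1 = np.cross(n1, b2 / np.linalg.norm(b2))
    return np.degrees(np.arctan2(np.dot(m1, n2), np.dot(n1, n2)))

p = [np.array(a, dtype=float) for a in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (2, 1, 0.5)]]
print(dihedral(*p))   # torsion angle in degrees for the four example positions
```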
Abstract:
The solution of structural reliability problems by the First Order method requires optimization algorithms to find the smallest distance between a limit state function and the origin of standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on the HLRF, but uses a new differentiable merit function with Wolfe conditions to select the step length in the line search. It is shown in the article that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. The performance and robustness of the new algorithms are compared to those of the classic augmented Lagrangian method, of HLRF and of the improved HLRF (iHLRF) algorithm, in the solution of 25 benchmark problems from the literature. The new proposed HLRF algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as the iHLRF algorithm. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
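A minimal sketch of the classic HLRF iteration referenced above: starting from the origin of standard Gaussian space, it iterates toward the point on the limit-state surface g(u) = 0 closest to the origin. The limit-state function here is illustrative, not one of the 25 benchmark problems.

```python
# Illustrative limit-state function in standard Gaussian space (u1, u2).
import numpy as np

def g(u):
    return 3.0 - u[0] - 0.5 * u[1] ** 2

def grad_g(u, h=1e-6):
    # Central finite-difference gradient.
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h) for e in np.eye(len(u))])

u = np.zeros(2)
for _ in range(100):
    grad = grad_g(u)
    # HLRF recursion: u_{k+1} = [(grad . u_k - g(u_k)) / ||grad||^2] * grad
    u_next = (grad @ u - g(u)) / (grad @ grad) * grad
    if np.linalg.norm(u_next - u) < 1e-8:
        u = u_next
        break
    u = u_next

beta = np.linalg.norm(u)   # reliability index: distance from origin to the design point
print(beta, u)
```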
Abstract:
This work presents major results from a novel dynamic model intended to deterministically represent the complex relation between HIV-1 and the human immune system. The novel structure of the model extends previous work by representing different host anatomic compartments under a more in-depth cellular and molecular immunological phenomenology. Recently identified mechanisms related to HIV-1 infection, as well as other well-known relevant mechanisms typically ignored in mathematical models of HIV-1 pathogenesis and immunology, such as cell-cell transmission, are also addressed. © 2011 Elsevier Ltd. All rights reserved.
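A hedged sketch of a standard basic within-host HIV ODE model (target cells, infected cells, free virus), much simpler than the multi-compartment model described above; parameter values are illustrative and SciPy is an assumed tool choice.

```python
# Basic target-cell / infected-cell / virus model; parameters are illustrative.
from scipy.integrate import solve_ivp

def hiv(t, y, lam=10.0, d=0.01, beta=5e-5, delta=0.5, p=100.0, c=5.0):
    T, I, V = y
    dT = lam - d * T - beta * T * V    # production, natural death, infection of target cells
    dI = beta * T * V - delta * I      # new infections, death of infected cells
    dV = p * I - c * V                 # virion production and clearance
    return [dT, dI, dV]

sol = solve_ivp(hiv, (0.0, 200.0), [1000.0, 0.0, 1e-3])
print(sol.y[:, -1])                    # (T, I, V) after 200 days
```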
Abstract:
This paper presents a technique for performing analog design synthesis at circuit level, providing feedback to the designer through the exploration of the Pareto frontier. A modified simulated annealing algorithm, able to perform crossover with past anchor points when a local minimum is found, is used as the optimization algorithm in the initial synthesis procedure. After all specifications are met, the algorithm searches for the extreme points of the Pareto frontier in order to obtain a non-exhaustive exploration of the Pareto front. Finally, multi-objective particle swarm optimization is used to spread the results and to find a more accurate frontier. Piecewise linear functions are used as single-objective cost functions to produce a smooth and equal convergence of all measurements to the desired specifications during the composition of the aggregate objective function. To verify the presented technique, two circuits were designed: a Miller amplifier with 96 dB voltage gain, 15.48 MHz unity gain frequency and a slew rate of 19.2 V/µs, with a supply current of 385.15 µA; and a complementary folded cascode with 104.25 dB voltage gain, 18.15 MHz unity gain frequency and a slew rate of 13.370 MV/µs. These circuits were synthesized in a 0.35 µm technology. The results show that the method provides a fast approach to good solutions using the modified SA, and further good exploration of the Pareto front through its connection to the particle swarm optimization algorithm.
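A sketch of the piecewise-linear cost idea described above, in an assumed form (zero once a specification is met, growing linearly with the violation), aggregated by summation; the specification names and targets are illustrative, not the paper's exact functions.

```python
# Assumed form of the per-specification cost; names and targets are illustrative.
def pw_linear_cost(value, target, larger_is_better=True, slope=1.0):
    violation = (target - value) if larger_is_better else (value - target)
    return slope * max(0.0, violation)   # zero once the spec is met, linear otherwise

def aggregate_objective(measured):
    specs = {                            # (target, larger_is_better)
        "gain_db":   (90.0, True),
        "ugf_mhz":   (10.0, True),
        "supply_ua": (400.0, False),
    }
    return sum(pw_linear_cost(measured[name], target, better)
               for name, (target, better) in specs.items())

print(aggregate_objective({"gain_db": 96.0, "ugf_mhz": 15.48, "supply_ua": 385.15}))  # 0.0, all met
print(aggregate_objective({"gain_db": 80.0, "ugf_mhz": 15.48, "supply_ua": 450.0}))   # 60.0, violations penalized
```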
Abstract:
Visual analysis of social networks is usually based on graph drawing algorithms and tools. However, social networks are a special kind of graph in the sense that the interpretation of displayed relationships is heavily dependent on context. Context, in its turn, is given by attributes associated with graph elements, such as individual nodes, edges, and groups of edges, as well as by the nature of the connections between individuals. In most systems, attributes of individuals and communities are not taken into consideration during graph layout, except to derive weights for force-based placement strategies. This paper proposes a set of novel tools for displaying and exploring social networks based on attribute and connectivity mappings. These properties are employed to lay out nodes on the plane via multidimensional projection techniques. For the attribute mapping, we show that node proximity in the layout corresponds to similarity in attributes, making it easy to locate similar groups of nodes. The projection based on connectivity yields an initial placement that forgoes force-based or graph analysis algorithms, reaching a meaningful layout in one pass. When a force algorithm is then applied to this initial mapping, the final layout presents better properties than conventional force-based approaches. Numerical evaluations show a number of advantages of pre-mapping points via projections. User evaluation demonstrates that these tools promote ease of manipulation as well as fast identification of concepts and associations which cannot be easily expressed by conventional graph visualization alone. In order to allow better space usage for complex networks, a graph mapping on the surface of a sphere is also implemented.
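A hedged sketch of the attribute-mapping idea: node attribute vectors are projected to 2-D (PCA here as a simple stand-in for the multidimensional projection techniques in the paper) to give an initial placement, which is then refined by a force-based layout seeded with those positions.

```python
# Attribute-driven initial placement refined by a force-based layout.
import numpy as np
import networkx as nx
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
G = nx.karate_club_graph()
attrs = rng.normal(size=(G.number_of_nodes(), 10))        # synthetic node attributes

init = PCA(n_components=2).fit_transform(attrs)           # attribute similarity -> 2-D proximity
init_pos = {node: init[i] for i, node in enumerate(G.nodes())}

pos = nx.spring_layout(G, pos=init_pos, iterations=30)    # force refinement of the pre-mapping
print(len(pos), pos[0])
```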