16 results for 190202 Computer Gaming and Animation

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Publisher:

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of buildings, transportation systems, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to decreased reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, computer virus propagation dynamics is modeled and related to other notable events occurring in the network, permitting preventive policies to be established for network management. Data from three different viruses are collected from the Internet, and two different identification techniques, autoregressive and Fourier analyses, are applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
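The autoregressive identification used above can be summarized as a least-squares AR fit followed by iterated one-step prediction. A minimal sketch, assuming a synthetic infection-count series rather than the paper's actual virus data:

```python
import numpy as np

def fit_ar(series, order):
    """Least-squares fit of AR coefficients: x[t] ~ sum_k a_k * x[t-k]."""
    X = np.column_stack([series[order - k - 1 : len(series) - k - 1]
                         for k in range(order)])
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(series, coeffs, steps):
    """Iterate the fitted AR model forward to extrapolate the series."""
    hist = list(series)
    for _ in range(steps):
        hist.append(sum(c * hist[-k - 1] for k, c in enumerate(coeffs)))
    return hist[len(series):]
```

Fitting the model to data from one virus and forecasting with it is exactly the transfer step the abstract describes: the coefficients summarize the propagation dynamics, independently of which virus produced them.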

Relevance:

100.00%

Publisher:

Abstract:

A multi-pumping flow system exploiting prior assay is proposed for the sequential turbidimetric determination of sulphate and chloride in natural waters. Both methods are implemented in the same manifold, which provides facilities for: in-line sample clean-up with a Bio-Rex 70 mini-column with fluidized beads; addition of low amounts of sulphate or chloride ions to the reaction medium to improve supersaturation; analyte precipitation with Ba(2+) or Ag(+); and real-time decision on the need for the next assay. The sample is initially run for chloride determination, and the analytical signal is compared with a preset value. If higher, the sample is run again, now for sulphate determination. The strategy may lead to an increased sample throughput. The proposed system is computer-controlled and presents enhanced figures of merit. About 10 samples are run per hour (about 60 measurements), and results are reproducible and unaffected by the presence of potentially interfering ions at concentration levels usually found in natural waters. Accuracy was assessed against ion chromatography. (C) 2008 Elsevier B.V. All rights reserved.
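The prior-assay logic amounts to a simple real-time rule. A minimal sketch, with a hypothetical `preset` threshold standing in for the system's calibrated value:

```python
def assay_sequence(chloride_signal, preset):
    """Prior-assay decision: chloride is always determined first; only if
    its turbidimetric signal exceeds the preset value is the sample run
    again for sulphate (hypothetical threshold, illustrative only)."""
    assays = ["chloride"]
    if chloride_signal > preset:
        assays.append("sulphate")
    return assays
```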

Relevance:

100.00%

Publisher:

Abstract:

The design, construction, and characterization of a portable opto-coupled potentiostat are presented. The potentiostat is battery-powered and managed by a microcontroller, which implements cyclic voltammetry (CV) using suitable sensor electrodes. Its opto-coupling permits a wide range of current measurements, from mA to nA. Two software interfaces were developed to perform the CV measurement: a virtual instrument for a personal computer (PC) and a C-based interface for a personal digital assistant (PDA). The potentiostat has been evaluated by detection of potassium ferrocyanide in KCl medium, with both macro- and microelectrodes. There was good agreement between the instrumental results and those from commercial equipment.

Relevance:

100.00%

Publisher:

Abstract:

Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple systems identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (c) 2008 Elsevier Ltd. All rights reserved.
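The SAIC model's actual equations and fitted parameters are in the paper; purely as an illustration of the compartment idea, here is a four-compartment sketch with hypothetical flow terms and rate constants, integrated by forward Euler:

```python
def saic_step(s, a, i, c, beta, delta, gamma, dt):
    """One Euler step of a hypothetical SAIC-style compartment model:
    susceptible machines are infected on contact, infectious machines
    become contaminated, and contaminated machines are disinfected and
    become antidotal. Flows and rates are illustrative, not the paper's."""
    infection = beta * s * i * dt
    breakdown = delta * i * dt
    disinfect = gamma * c * dt
    return (s - infection,
            a + disinfect,
            i + infection - breakdown,
            c + breakdown - disinfect)

def simulate(s0, a0, i0, c0, beta=0.5, delta=0.2, gamma=0.1,
             dt=0.01, steps=1000):
    """Integrate the sketch model from an initial population split."""
    state = (s0, a0, i0, c0)
    for _ in range(steps):
        state = saic_step(*state, beta, delta, gamma, dt)
    return state
```

Because every flow is subtracted from one compartment and added to another, the total population is conserved, which is the basic sanity check for any compartment model of this kind.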

Relevance:

100.00%

Publisher:

Abstract:

The diffusion of astrophysical magnetic fields in conducting fluids in the presence of turbulence depends on whether magnetic fields can change their topology via reconnection in highly conducting media. Recent progress in understanding fast magnetic reconnection in the presence of turbulence shows that the magnetic field behavior in computer simulations and turbulent astrophysical environments is similar, as far as magnetic reconnection is concerned. This makes it meaningful to perform MHD simulations of turbulent flows in order to understand the diffusion of magnetic field in astrophysical environments. Our studies of magnetic field diffusion in turbulent medium reveal interesting new phenomena. First of all, our three-dimensional MHD simulations initiated with anti-correlating magnetic field and gaseous density exhibit at later times a de-correlation of the magnetic field and density, which corresponds well to the observations of the interstellar medium. While earlier studies stressed the role of either ambipolar diffusion or time-dependent turbulent fluctuations for de-correlating magnetic field and density, we get the effect of permanent de-correlation with a one-fluid code, i.e., without invoking ambipolar diffusion. In addition, in the presence of gravity and turbulence, our three-dimensional simulations show the decrease of the magnetic flux-to-mass ratio as the gaseous density at the center of the gravitational potential increases. We observe this effect both in the situations when we start with equilibrium distributions of gas and magnetic field and when we follow the evolution of collapsing dynamically unstable configurations. Thus, the process of turbulent magnetic field removal should be applicable both to quasi-static subcritical molecular clouds and cores and to violently collapsing supercritical entities.
The increase of the gravitational potential as well as the magnetization of the gas increases the segregation of the mass and magnetic flux in the saturated final state of the simulations, supporting the notion that the reconnection-enabled diffusivity relaxes the magnetic field + gas system in the gravitational field to its minimal energy state. This effect is expected to play an important role in star formation, from its initial stages of concentrating interstellar gas to the final stages of accretion onto the forming protostar. In addition, we benchmark our codes by studying the heat transfer in magnetized compressible fluids and confirm the high rates of turbulent advection of heat obtained in an earlier study.

Relevance:

100.00%

Publisher:

Abstract:

Object selection refers to the mechanism of extracting objects of interest while ignoring other objects and the background in a given visual scene. It is a fundamental issue for many computer vision and image analysis techniques, and it is still a challenging task for artificial visual systems. Chaotic phase synchronization takes place in cases involving almost identical dynamical systems and means that the phase difference between the systems is kept bounded over time, while their amplitudes remain chaotic and may be uncorrelated. Instead of complete synchronization, phase synchronization is believed to be a mechanism for neural integration in the brain. In this paper, an object selection model is proposed. Oscillators in the network representing the salient object in a given scene are phase synchronized, while no phase synchronization occurs for background objects. In this way, the salient object can be extracted. In this model, a shift mechanism is also introduced to change attention from one object to another. Computer simulations show that the model produces results similar to those observed in natural vision systems.
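The selection-by-synchronization idea can be illustrated with a much simpler system than the paper's chaotic oscillators: Kuramoto phase oscillators, where coupling restricted to the "object" group makes only that group phase-lock while uncoupled "background" oscillators drift apart. This is a hedged stand-in, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_phases(n_obj, n_bg, coupling=2.0, dt=0.01, steps=5000):
    """Kuramoto-style sketch: the first n_obj oscillators are mutually
    coupled (the salient object); the remaining n_bg are uncoupled
    (the background), so only the object group synchronizes."""
    n = n_obj + n_bg
    omega = rng.normal(1.0, 0.1, n)        # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)   # random initial phases
    for _ in range(steps):
        dtheta = omega.copy()
        obj = theta[:n_obj]
        # all-to-all coupling inside the object group only
        dtheta[:n_obj] += coupling * np.sin(obj[None, :] - obj[:, None]).mean(axis=1)
        theta = theta + dt * dtheta
    return theta

def order_parameter(phases):
    """Kuramoto order parameter: 1 means perfect phase synchrony."""
    return abs(np.exp(1j * phases).mean())
```

Reading out which oscillators have a high pairwise order parameter is the extraction step: the synchronized group labels the salient object.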

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes an improved voice activity detection (VAD) algorithm using wavelets and a support vector machine (SVM) for the European Telecommunications Standards Institute (ETSI) adaptive multi-rate (AMR) narrow-band (NB) and wide-band (WB) speech codecs. First, based on the wavelet transform, the original IIR filter bank and pitch/tone detector are implemented, respectively, via a wavelet filter bank and a wavelet-based pitch/tone detection algorithm. The wavelet filter bank divides the input speech signal into several frequency bands so that the signal power level at each sub-band can be calculated. In addition, the background noise level can be estimated in each sub-band by using the wavelet de-noising method. The wavelet filter bank is also derived to detect correlated complex signals like music. Then the proposed algorithm applies the SVM to train an optimized non-linear VAD decision rule involving the sub-band power, noise level, pitch period, tone flag, and complex-signals warning flag of input speech signals. By the use of the trained SVM, the proposed VAD algorithm produces more accurate detection results. Various experimental results on the Aurora speech database with different noise conditions show that the proposed algorithm yields VAD performance considerably superior to that of the AMR-NB VAD Options 1 and 2 and the AMR-WB VAD. (C) 2009 Elsevier Ltd. All rights reserved.
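The sub-band power feature can be illustrated with a cascade of two-tap Haar filters. The paper's filter bank is matched to the AMR codec bands; this generic sketch only shows the split-decimate-measure structure that produces one power value per band:

```python
import numpy as np

def haar_subband_powers(signal, levels=3):
    """Split a signal into sub-bands with a two-tap Haar analysis filter
    bank (lowpass/highpass + downsample by 2 at each level) and return
    the average power of each band, finest detail first."""
    powers = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        m = min(len(even), len(odd))
        low = (even[:m] + odd[:m]) / np.sqrt(2)    # approximation band
        high = (even[:m] - odd[:m]) / np.sqrt(2)   # detail band
        powers.append(np.mean(high ** 2))
        approx = low
    powers.append(np.mean(approx ** 2))            # residual low band
    return powers
```

A constant signal puts all its power in the final approximation band, while a maximally alternating signal puts it all in the first detail band, which is the behavior a VAD feature extractor relies on to separate low-frequency voiced energy from high-frequency noise.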

Relevance:

100.00%

Publisher:

Abstract:

The focus of study in this paper is the class of packing problems. More specifically, it deals with the placement of a set of N circular items of unitary radius inside an object with the aim of minimizing its dimensions. Differently shaped containers are considered, namely circles, squares, rectangles, strips, and triangles. By means of the resolution of systems of non-linear equations through the Newton-Raphson method, the algorithm presented herein succeeds in improving the accuracy of previous results attained by continuous optimization approaches up to numerical machine precision. The computer implementation and the data sets are available at http://www.ime.usp.br/~egbirgin/packing/. (C) 2009 Elsevier Ltd. All rights reserved.
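Newton-Raphson on a system of non-linear equations is the iteration x ← x − J(x)⁻¹ f(x), which converges quadratically near a root and so can polish a solution to machine precision. A generic sketch on a hypothetical 2×2 stand-in system, not the paper's actual tangency equations:

```python
import numpy as np

def newton_raphson(f, jac, x0, tol=1e-12, max_iter=50):
    """Solve f(x) = 0 by Newton-Raphson: x <- x - J(x)^-1 f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        x = x - np.linalg.solve(jac(x), fx)
    return x

# stand-in system (illustrative only): x^2 + y^2 = 4 and exp(x) + y = 1
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, np.exp(v[0]) + v[1] - 1.0])
jac = lambda v: np.array([[2 * v[0], 2 * v[1]], [np.exp(v[0]), 1.0]])
root = newton_raphson(f, jac, [1.0, -1.0])
```

In the packing context the unknowns would be item coordinates and the container dimension, and f would collect the tangency/contact conditions of an optimization solution being refined.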

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: With the arrival of new contemporary technologies, the Internet and electronic games have become tools of broad and unrestricted use, turning into one of the greatest worldwide phenomena of the last decade. Several studies attest to the benefits of these resources, but their healthy and adaptive use has progressively given way to abuse and lack of control, creating severe impacts on the everyday life of millions of users. The aim of this study was to systematically review the articles examining Internet and electronic game addiction in the general population. We therefore sought to assess the evolution of these concepts over the last decade, as well as to contribute to a better understanding of the condition and its comorbidities. METHOD: A systematic literature review was carried out in MedLine, Lilacs, SciELO, and Cochrane using the terms "Internet addiction", "pathological Internet use", "problematic Internet use", "Internet abuse", "videogame", "computer games", and "electronic games". The electronic search was conducted up to December 2007. DISCUSSION: Studies conducted in different countries report widely varying prevalences, probably owing to the lack of consensus and the use of different denominations, which allows the adoption of distinct diagnostic criteria. Many patients reporting abusive use and dependence come to show significant impairment in their professional, academic (school), social, and family lives. CONCLUSIONS: Further investigation is needed to determine whether this abusive use of the Internet and electronic games can be understood as one of the newest psychiatric classifications of the 21st century or merely a substrate of other disorders.

Relevance:

100.00%

Publisher:

Abstract:

Shallow subsurface layers of gold nanoclusters were formed in polymethylmethacrylate (PMMA) polymer by very low energy (49 eV) gold ion implantation. The ion implantation process was modeled by computer simulation and accurately predicted the layer depth and width. Transmission electron microscopy (TEM) was used to image the buried layer and individual nanoclusters; the layer width was ~6-8 nm and the cluster diameter was ~5-6 nm. Surface plasmon resonance (SPR) absorption effects were observed by UV-visible spectroscopy. The TEM and SPR results were related to prior measurements of electrical conductivity of Au-doped PMMA, and excellent consistency was found with a model of electrical conductivity in which either at low implantation dose the individual nanoclusters are separated and do not physically touch each other, or at higher implantation dose the nanoclusters touch each other to form a random resistor network (percolation model). (C) 2009 American Vacuum Society. [DOI: 10.1116/1.3231449]
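The percolation picture, in which conduction requires an unbroken path of touching clusters, can be illustrated with a minimal site-percolation check on a grid. This is an assumption-laden stand-in for the random resistor network, not the paper's actual conductivity model:

```python
def percolates(occupied, n):
    """Site-percolation check on an n x n grid: do occupied sites form a
    4-connected path from the left edge (column 0) to the right edge
    (column n-1)? Depth-first search from every left-edge site."""
    seen = set()
    stack = [(r, 0) for r in range(n) if (r, 0) in occupied]
    while stack:
        r, c = stack.pop()
        if (r, c) in seen:
            continue
        seen.add((r, c))
        if c == n - 1:
            return True          # reached the opposite electrode
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (r + dr, c + dc)
            if nb in occupied and nb not in seen:
                stack.append(nb)
    return False
```

Below the percolation threshold (low dose, isolated clusters) no spanning path exists and the film is insulating; above it (high dose, touching clusters) a spanning path appears and the film conducts.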

Relevance:

100.00%

Publisher:

Abstract:

Background Homozygous familial hypercholesterolaemia is a rare genetic disorder in which both LDL-receptor alleles are defective, resulting in very high concentrations of LDL cholesterol in plasma and premature coronary artery disease. This study investigated whether an antisense inhibitor of apolipoprotein B synthesis, mipomersen, is effective and safe as an adjunctive agent to lower LDL cholesterol concentrations in patients with this disease. Methods This randomised, double-blind, placebo-controlled, phase 3 study was undertaken in nine lipid clinics in seven countries. Patients aged 12 years and older with clinical diagnosis or genetic confirmation of homozygous familial hypercholesterolaemia, who were already receiving the maximum tolerated dose of a lipid-lowering drug, were randomly assigned to mipomersen 200 mg subcutaneously every week or placebo for 26 weeks. Randomisation was computer-generated and stratified by weight (<50 kg vs >= 50 kg) in a centralised blocked randomisation, implemented with a computerised interactive voice response system. All clinical, medical, and pharmacy personnel, and patients were masked to treatment allocation. The primary endpoint was percentage change in LDL cholesterol concentration from baseline. Analysis was by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00607373. Findings 34 patients were assigned to mipomersen and 17 to placebo; data for all patients were analysed. 45 patients completed the 26-week treatment period (28 mipomersen, 17 placebo). Mean concentrations of LDL cholesterol at baseline were 11.4 mmol/L (SD 3.6) in the mipomersen group and 10.4 mmol/L (3.7) in the placebo group. The mean percentage change in LDL cholesterol concentration was significantly greater with mipomersen (-24.7%, 95% CI -31.6 to -17.7) than with placebo (-3.3%, -12.1 to 5.5; p=0.0003).
The most common adverse events were injection-site reactions (26 [76%] patients in mipomersen group vs four [24%] in placebo group). Four (12%) patients in the mipomersen group but none in the placebo group had increases in concentrations of alanine aminotransferase of three times or more the upper limit of normal. Interpretation Inhibition of apolipoprotein B synthesis by mipomersen represents a novel, effective therapy to reduce LDL cholesterol concentrations in patients with homozygous familial hypercholesterolaemia who are already receiving lipid-lowering drugs, including high-dose statins.

Relevance:

100.00%

Publisher:

Abstract:

Increasing efforts exist in integrating different levels of detail in models of the cardiovascular system. For instance, one-dimensional representations are employed to model the systemic circulation. In this context, effective and black-box-type decomposition strategies for one-dimensional networks are needed, so as to: (i) employ domain decomposition strategies for large systemic models (1D-1D coupling) and (ii) provide the conceptual basis for dimensionally-heterogeneous representations (1D-3D coupling, among various possibilities). The strategy proposed in this article works for both of these two scenarios, though the several applications shown to illustrate its performance focus on the 1D-1D coupling case. A one-dimensional network is decomposed in such a way that each coupling point connects two (and not more) of the sub-networks. At each of the M connection points two unknowns are defined: the flow rate and pressure. These 2M unknowns are determined by 2M equations, since each sub-network provides one (non-linear) equation per coupling point. It is shown how to build the 2M x 2M non-linear system with arbitrary and independent choice of boundary conditions for each of the sub-networks. The idea is then to solve this non-linear system until convergence, which guarantees strong coupling of the complete network. In other words, if the non-linear solver converges at each time step, the solution coincides with what would be obtained by monolithically modeling the whole network. The decomposition thus imposes no stability restriction on the choice of the time step size. Effective iterative strategies for the non-linear system that preserve the black-box character of the decomposition are then explored. Several variants of matrix-free Broyden's and Newton-GMRES algorithms are assessed as numerical solvers by comparing their performance on sub-critical wave propagation problems which range from academic test cases to realistic cardiovascular applications.
A specific variant of Broyden's algorithm is identified and recommended on the basis of its computational cost and reliability. (C) 2010 Elsevier B.V. All rights reserved.
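Broyden's method suits the black-box setting above because it only evaluates the residual function, maintaining a secant approximation of the Jacobian instead of assembling it. A minimal dense-storage sketch (the paper's variants are matrix-free; the test system below is an arbitrary illustration, not a haemodynamic network):

```python
import numpy as np

def broyden_solve(f, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method: quasi-Newton root finding that only
    evaluates f, updating a secant approximation B of the Jacobian."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    B = np.eye(len(x))                     # initial Jacobian guess
    for _ in range(max_iter):
        if np.linalg.norm(fx) < tol:
            break
        dx = -np.linalg.solve(B, fx)
        x_new = x + dx
        f_new = f(x_new)
        # secant condition: the updated B must map dx to the residual change
        B += np.outer(f_new - fx - B @ dx, dx) / (dx @ dx)
        x, fx = x_new, f_new
    return x
```

In the decomposition described above, `f` would gather the 2M interface residuals by calling each sub-network solver as a black box, so converging this iteration at each time step recovers the monolithic solution.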

Relevance:

100.00%

Publisher:

Abstract:

In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
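One common parameterization of the negative binomial cure-rate model gives the population survival S_pop(t) = (1 + φθF(t))^(−1/φ), whose plateau as t → ∞ is the cured fraction. A small sketch under that assumption (θ as the mean number of competing causes, φ as the dispersion; the paper's own parameterization in terms of the cured fraction may differ):

```python
import math

def population_survival(F_t, theta, phi):
    """Population survival when the number of competing causes is
    negative binomial (mean theta, dispersion phi) and each cause has
    time-to-event cdf value F_t = F(t). Illustrative parameterization."""
    return (1.0 + phi * theta * F_t) ** (-1.0 / phi)

def cured_fraction(theta, phi):
    """Long-term plateau of the survival curve (F -> 1 as t -> infinity):
    the probability that no competing cause is present."""
    return population_survival(1.0, theta, phi)
```

As φ → 0 the count distribution tends to a Poisson and the cured fraction to exp(−θ), recovering the classical promotion-time cure model as a particular case, which is the kind of nesting the abstract refers to.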

Relevance:

100.00%

Publisher:

Abstract:

In 2006 the Route load balancing algorithm was proposed and compared to other techniques aiming at optimizing process allocation in grid environments. This algorithm schedules tasks of parallel applications considering computer neighborhoods (where distance is defined by network latency). Route presents good results for large environments, although there are cases where neighbors have neither enough computational capacity nor a communication system capable of serving the application. In those situations Route migrates tasks until they stabilize in a grid area with enough resources. This migration may take a long time, which reduces the overall performance. In order to improve this stabilization time, this paper proposes RouteGA (Route with Genetic Algorithm support), which considers historical information on parallel application behavior as well as computer capacities and loads to optimize the scheduling. This information is extracted by using monitors and summarized in a knowledge base used to quantify the occupation of tasks. Afterwards, this information is used to parameterize a genetic algorithm responsible for optimizing the task allocation. Results confirm that RouteGA outperforms the load balancing carried out by the original Route, which had previously outperformed other scheduling algorithms from the literature.
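A genetic algorithm for task allocation, in the spirit of (but far simpler than) RouteGA, can be sketched with tournament selection, one-point crossover, and point mutation; the fitness here is just the makespan, and all parameters are illustrative:

```python
import random

def makespan(assign, task_cost, n_machines):
    """Load of the busiest machine for a task -> machine assignment."""
    load = [0.0] * n_machines
    for task, machine in enumerate(assign):
        load[machine] += task_cost[task]
    return max(load)

def ga_schedule(task_cost, n_machines, pop_size=40, generations=200, seed=1):
    """Tiny GA: each chromosome maps every task to a machine; tournament
    selection of two, one-point crossover, low-rate point mutation.
    (Illustrative only; RouteGA's encoding and fitness are richer.)"""
    rng = random.Random(seed)
    n = len(task_cost)
    pop = [[rng.randrange(n_machines) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)   # binary tournament
            return a if (makespan(a, task_cost, n_machines)
                         <= makespan(b, task_cost, n_machines)) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:      # mutation
                child[rng.randrange(n)] = rng.randrange(n_machines)
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda a: makespan(a, task_cost, n_machines))
```

In RouteGA the fitness would additionally weigh the monitored computer capacities, loads, and the application's historical behavior drawn from the knowledge base, rather than raw task costs alone.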

Relevance:

100.00%

Publisher:

Abstract:

In this work, we deal with the problem of packing (orthogonally and without overlapping) identical rectangles in a rectangle. This problem appears in different logistics settings, such as the loading of boxes on pallets, the arrangement of pallets in trucks and the stowing of cargo in ships. We present a recursive partitioning approach combining improved versions of a recursive five-block heuristic and an L-approach for packing rectangles into larger rectangles and L-shaped pieces. The combined approach is able to rapidly find the optimal solutions of all instances of the pallet loading problem sets Cover I and II (more than 50 000 instances). It is also effective for solving the instances of problem set Cover III (almost 100 000 instances) and practical examples of a woodpulp stowage problem, compared with other methods from the literature. Some theoretical results are also discussed and, based on them, efficient computer implementations are introduced. The computer implementation and the data sets are available for benchmarking purposes. Journal of the Operational Research Society (2010) 61, 306-320. doi: 10.1057/jors.2008.141 Published online 4 February 2009
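The block-heuristic idea, covering the pallet with homogeneous blocks of rectangles in the two orthogonal orientations, can be sketched in a simplified two-block form; the paper's five-block recursion generalizes this by allowing up to five sub-rectangles and recursing into each:

```python
def homogeneous(L, W, l, w):
    """Count of l x w rectangles packed in an L x W area, one orientation."""
    return (L // l) * (W // w)

def two_block(L, W, l, w):
    """Best of: both homogeneous orientations, and every vertical cut
    splitting the pallet into two homogeneous blocks of opposite
    orientation (a simplification of the five-block scheme)."""
    best = max(homogeneous(L, W, l, w), homogeneous(L, W, w, l))
    for cut in range(0, L + 1):
        best = max(best,
                   homogeneous(cut, W, l, w)
                   + homogeneous(L - cut, W, w, l))
    return best
```

Even this crude version already beats single-orientation packing on small instances (e.g. 2x1 boxes on a 3x3 pallet), which is why the recursive multi-block refinement closes so many instances optimally.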