9 results for Electrical and Computer Engineering

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (Digital Library of Intellectual Production of the University of São Paulo)


Relevance: 100.00%

Abstract:

Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated against previously published benchmark results and then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly reducing running time.
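The FTCS relaxation step for the Poisson equation can be illustrated with a minimal sketch (not the authors' implementation; the grid, units, and the function name `relax_poisson` are illustrative). Marching the diffusion equation in pseudo-time at the largest stable step, dt = h²/4, reduces each sweep to averaging the four neighbours plus the source term:

```python
def relax_poisson(rho, phi, h, n_iters=2000):
    """FTCS-style relaxation for the 2D Poisson equation
    phi_xx + phi_yy = -rho with fixed (Dirichlet) boundary values.
    At dt = h^2/4 the pseudo-time update is a neighbour average
    plus the source term, i.e. a Jacobi sweep."""
    ny, nx = len(phi), len(phi[0])
    for _ in range(n_iters):
        new = [row[:] for row in phi]  # copy keeps boundary values fixed
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                new[i][j] = 0.25 * (phi[i - 1][j] + phi[i + 1][j]
                                    + phi[i][j - 1] + phi[i][j + 1]
                                    + h * h * rho[i][j])
        phi = new
    return phi
```

Iterating until the pseudo-time transient dies out leaves the steady state, which is the solution of the Poisson equation with the prescribed boundary values.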

Relevance: 100.00%

Abstract:

We study the firing rate properties of a cellular automaton model for a neuronal network with chemical synapses. We propose a simple mechanism in which nonlocal connections are included through electrical and chemical synapses. In the latter case, we introduce a time delay which produces self-sustained activity. Nonlocal connections, or shortcuts, are randomly introduced according to a specified connection probability. There is a range of connection probabilities for which neuron firing occurs, as well as a critical probability at which firing ceases in the absence of time delay. The critical probability for nonlocal shortcuts depends on the network size according to a power law. We also compute the firing rate amplification factor by varying both the connection probability and the time delay for different network sizes. (C) 2011 Elsevier B.V. All rights reserved.
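A minimal sketch of this kind of setup is an excitable (Greenberg-Hastings-style) cellular automaton on a ring with random nonlocal shortcuts. The parameters, update rule, and function name below are illustrative, and the chemical-synapse time delay of the paper is omitted for brevity:

```python
import random

def simulate(n=100, p=0.1, refractory=2, steps=200, seed=1):
    """Excitable cellular automaton on a ring with nonlocal shortcuts
    added with probability p. States: 0 = resting, 1 = firing,
    2..refractory+1 = refractory. A resting cell fires if any
    neighbour is firing. Returns the mean firing rate."""
    rng = random.Random(seed)
    # local ring neighbours plus random (directed) nonlocal shortcuts
    neigh = [{(i - 1) % n, (i + 1) % n} for i in range(n)]
    for i in range(n):
        for j in range(n):
            if j != i and rng.random() < p:
                neigh[i].add(j)
    state = [0] * n
    state[0] = 1  # seed a single firing neuron
    fired = 0
    for _ in range(steps):
        new = []
        for i in range(n):
            if state[i] == 0:
                new.append(1 if any(state[j] == 1 for j in neigh[i]) else 0)
            elif state[i] < refractory + 1:
                new.append(state[i] + 1)  # advance through refractory states
            else:
                new.append(0)             # back to resting
        state = new
        fired += state.count(1)
    return fired / (n * steps)
```

With p = 0 the two wavefronts propagate around the ring and annihilate when they collide; adding shortcuts changes how long and how strongly activity persists, which is the quantity the firing rate probes.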

Relevance: 100.00%

Abstract:

An ultrasonometric and computed-tomographic study of bone healing was undertaken using a model of a transverse mid-shaft osteotomy of sheep tibiae fixed with a semi-flexible external fixator. Fourteen sheep were operated on and divided into two groups of seven according to osteotomy type, either regular or by segmental resection. The animals were killed on the 90th postoperative day and the tibiae resected for in vitro direct-contact transverse and axial measurement of ultrasound propagation velocity (UV), followed by quantitative computer-aided tomography (callus density and volume) through the osteotomy site. The intact left tibiae, examined in a symmetrical diaphyseal segment, served as controls. Regular osteotomies healed with a smaller and more mature callus than resection osteotomies. Axial UV was consistently and significantly higher (p = 0.01) than transverse UV, and both transverse and axial UV were significantly higher for the regular than for the segmental resection osteotomy. Transverse UV did not differ significantly between the intact and operated tibiae (p = 0.20 for regular osteotomy; p = 0.02 for resection osteotomy), but axial UV was significantly higher for the intact tibiae. Tomographic callus density was significantly higher for the regular than for the resection osteotomy, and higher for both than for the intact tibiae, presenting a strong positive correlation with UV. Callus volume presented the opposite behavior, with a negative correlation with UV. We conclude that UV is at least as precise as quantitative tomography for providing information about the healing state of both regular and resection osteotomies. (C) 2011 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 30:1076-1082, 2012

Relevance: 100.00%

Abstract:

Semi-supervised learning is one of the important topics in machine learning, concerned with pattern classification where only a small subset of the data is labeled. In this paper, a new network-based (or graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model spreads labels in a local fashion, i.e., each particle only visits the portion of nodes potentially belonging to it, and is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model can achieve a good classification rate while exhibiting a low computational complexity order in comparison to other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance.
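For contrast, the sketch below is the classic majority-vote label-propagation baseline that graph-based semi-supervised methods build on, not the particle-competition model itself (which adds random-greedy walks, competition, and cooperation). The function name and tie-handling rule are illustrative:

```python
def propagate_labels(adj, seeds, max_sweeps=100):
    """Baseline label propagation on a graph.
    adj: node -> iterable of neighbours; seeds: node -> fixed class label.
    Unlabeled nodes repeatedly adopt the strict-majority label among
    their labeled neighbours; ties leave a node unchanged, and seed
    labels never change."""
    labels = dict(seeds)
    for _ in range(max_sweeps):
        changed = False
        for node in sorted(adj):          # fixed order for determinism
            if node in seeds:
                continue
            votes = {}
            for nb in adj[node]:
                if nb in labels:
                    votes[labels[nb]] = votes.get(labels[nb], 0) + 1
            if not votes:
                continue                  # no labeled neighbour yet
            top = max(votes.values())
            winners = [c for c, v in votes.items() if v == top]
            if len(winners) == 1 and labels.get(node) != winners[0]:
                labels[node] = winners[0]
                changed = True
        if not changed:
            break                         # converged
    return labels
```

On two communities joined by a single bridge edge, one seed per community is enough for the labels to flood their own community and stop at the bridge.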

Relevance: 100.00%

Abstract:

This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill, which manufactures paper for cardboard out of produced pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and a waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine. Here, due to significant sequence-dependent setups in paper-type changeovers, sizing and sequencing of lots have to be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is then to synchronize the material flow as it moves through the pulp and paper mills and the energy plant, maximizing the fulfillment of customer demand (backlogging is allowed) and minimizing operating costs. Because P&P production is capital-intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, to tackle this problem we propose a new model that, for the first time, integrates the critical production units of the pulp and paper mills and the energy plant. Simple stochastic mixed-integer-programming-based local search heuristics are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies to increase their competitiveness and responsiveness in dealing with oscillations in demand patterns. (C) 2012 Elsevier Ltd. All rights reserved.
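The sequencing side of the problem can be made concrete with a toy local-search move: a 2-opt search over the order of paper-type lots under a sequence-dependent setup-cost matrix. This is only an illustrative sketch in the spirit of the paper's local-search heuristics, not the authors' stochastic MIP-based method; the cost matrix and function names are made up:

```python
def setup_cost(seq, s):
    """Total sequence-dependent setup cost of running lots in order seq,
    where s[a][b] is the changeover cost from paper type a to b."""
    return sum(s[a][b] for a, b in zip(seq, seq[1:]))

def two_opt(seq, s):
    """2-opt local search: keep reversing subsequences while that
    lowers the total setup cost; stop at a local optimum."""
    best = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(best) - 1):
            for j in range(i + 1, len(best)):
                cand = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
                if setup_cost(cand, s) < setup_cost(best, s):
                    best, improved = cand, True
    return best
```

A 2-opt pass only guarantees a local optimum, which is why the paper embeds such moves inside a larger (stochastic, MIP-based) search.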

Relevance: 100.00%

Abstract:

Creating high-quality quad meshes from triangulated surfaces is a highly nontrivial task that necessitates consideration of various application-specific quality metrics. In our work, we follow the premise that automatic reconstruction techniques may not generate outputs meeting all the subjective quality expectations of the user. Instead, we put the user at the center of the process by providing a flexible, interactive approach to quadrangulation design. By combining scalar field topology and combinatorial connectivity techniques, we present a new framework, following a coarse-to-fine design philosophy, which allows for explicit control of the subjective quality criteria on the output quad mesh, at interactive rates. Our quadrangulation framework uses the new notion of Reeb atlas editing to define, with a small number of interactions, a coarse quadrangulation of the model that captures the main features of the shape, with user-prescribed extraordinary vertices and alignment. Fine-grain tuning is easily achieved with the notion of connectivity texturing, which allows for the specification of additional extraordinary vertices and explicit feature alignment, to capture high-frequency geometry. Experiments demonstrate the interactivity and flexibility of our approach, as well as its ability to generate quad meshes of arbitrary resolution with high-quality statistics, while meeting the user's own subjective requirements.

Relevance: 100.00%

Abstract:

Ubiquitous computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep coherent behavior and a proper appearance, and must adapt to a wide range of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. To overcome these obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation - the implementation of static Web interfaces - and dynamic adaptation - the alteration, at execution time, of static interfaces so as to adapt to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies. Along these lines, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.

Relevance: 100.00%

Abstract:

Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations that use the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures.
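The boundary-to-boundary idea can be made concrete in its simplest setting: for a constant vector field inside a single triangle, the exit point of a streamline entering at a boundary point is an exact ray-edge intersection, with no numerical integration. This is an illustrative toy, not the paper's edge-map construction (which handles piecewise-linear fields, consistency, and error quantification); the function name is made up:

```python
def edge_map_constant(tri, v, p, eps=1e-9):
    """Map an entry point p on the boundary of triangle tri to the exit
    point of the streamline of a constant field v, by intersecting the
    ray p + t*v (t > 0) with the three edges and taking the nearest hit.
    tri: three (x, y) vertices; v, p: (x, y) tuples."""
    best_t, exit_pt = None, None
    for k in range(3):
        a, b = tri[k], tri[(k + 1) % 3]
        ex, ey = b[0] - a[0], b[1] - a[1]   # edge direction
        det = ex * v[1] - ey * v[0]
        if abs(det) < eps:
            continue                         # ray parallel to this edge
        dx, dy = a[0] - p[0], a[1] - p[1]
        # solve p + t*v = a + s*(b - a) for t (along ray), s (along edge)
        t = (ex * dy - ey * dx) / det
        s = (v[0] * dy - v[1] * dx) / det
        if t > eps and -eps <= s <= 1 + eps and (best_t is None or t < best_t):
            best_t = t
            exit_pt = (p[0] + t * v[0], p[1] + t * v[1])
    return exit_pt
```

Composing such per-triangle maps along a mesh is what lets boundary maps stand in for integrated streamlines.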

Relevance: 100.00%

Abstract:

This study evaluated the functional and quantitative differences between early and delayed use of phototherapy on crushed median nerves. After a crush injury, low-level laser therapy (GaAs) was applied transcutaneously at the injury site for 3 min daily, five treatments per week for 2 weeks. In the early group, the first laser treatment started immediately after surgery; in the delayed group, after 7 days. The grasping test was used for functional evaluation of the median nerve before surgery and at 10 and 21 days after surgery, when the rats were killed. Three segments of the median nerve were analyzed histomorphometrically by light microscopy and computer analysis. The following features were measured: myelinated fiber and axon diameters, myelin sheath area, g-ratio, density and number of myelinated fibers, and area and number of capillaries. In the proximal segment (the crush site), the nerves of animals submitted to early and delayed treatment showed a significantly larger myelinated fiber diameter and myelin sheath area than the untreated group. In the distal segment, the myelin sheath area was significantly smaller in the untreated animals than in the delayed group. The untreated, early, and delayed groups presented 50%, 57%, and 81% functional recovery, respectively, at 21 days after injury, with a significant difference between the untreated and delayed groups. The results suggest that nerves irradiated with low-power laser exhibit myelinated fibers of greater diameter and a better recovery of function.