18 results for Web Applications Engineering
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, using any device. In this scenario, new challenges arise for the practice of software development: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a variety of contextual usage requirements and hardware constraints. In particular, owing to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. To overcome these obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation, the implementation of static Web interfaces, with dynamic adaptation, the alteration of those static interfaces at run time so that they fit different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies. Along these lines, we designed and implemented UbiCon, a framework on which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces and to faster, less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in both industry and academia.
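As a rough illustration of the hybrid idea (hypothetical names and structures, not UbiCon's actual API), a static variant can be authored per device class ahead of time and then patched dynamically for the usage context at run time:

```python
# Illustrative sketch of hybrid interface adaptation. All names here are
# hypothetical; this is not the UbiCon framework.

# Static adaptation: interface variants implemented in advance.
STATIC_VARIANTS = {
    "desktop": {"layout": "two-column", "font_px": 14, "images": "full"},
    "mobile":  {"layout": "single-column", "font_px": 16, "images": "reduced"},
}

def adapt(device_class, context):
    """Pick a static variant, then apply dynamic adjustments for the context."""
    ui = dict(STATIC_VARIANTS[device_class])  # start from the static base
    # Dynamic adaptation: alter the static interface during execution.
    if context.get("low_bandwidth"):
        ui["images"] = "text-only"
    if context.get("large_text"):
        ui["font_px"] += 4
    return ui

print(adapt("mobile", {"low_bandwidth": True, "large_text": True}))
# {'layout': 'single-column', 'font_px': 20, 'images': 'text-only'}
```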
Abstract:
In this paper we address the "skull-stripping" problem in 3D MR images. We propose a new method that employs an efficient and unique histogram analysis. A fundamental component of this analysis is an algorithm for partitioning a histogram based on the position of the maximum deviation from a Gaussian fit. In our experiments we use a comprehensive image database, including both synthetic and real MRI, and compare our method with two other well-known methods, namely BSE and BET. For all datasets we achieved superior results. Our method is also largely independent of parameter tuning and very robust across considerable variations of noise ratio.
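The histogram-partitioning step can be sketched as follows. This is a simplified reading of the idea, assuming the Gaussian is fit with the sample mean and standard deviation and scaled to the histogram area; the abstract does not specify the fitting procedure:

```python
import numpy as np

def partition_threshold(values, bins=64):
    """Split a histogram at the bin of maximum deviation from a Gaussian fit.

    Sketch only: fit a Gaussian using the sample mean/std, scale it to the
    histogram's total count, and return the bin center where the observed
    histogram deviates most from the fit.
    """
    hist, edges = np.histogram(values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mu, sigma = values.mean(), values.std()
    gauss = np.exp(-0.5 * ((centers - mu) / sigma) ** 2)
    gauss *= hist.sum() / gauss.sum()        # match total counts
    split = np.argmax(np.abs(hist - gauss))  # position of maximum deviation
    return centers[split]
```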
Abstract:
This paper discusses aspects of Wireless Sensor Networks over the IEEE 802.15.4 standard and proposes, for the first time, a mesh network topology with geographic routing integrated into the open Freescale protocol SMAC (Simple Medium Access Control). To this end, a routing protocol for SMAC is proposed. Before this work, the SMAC protocol supported only one-hop communication; with the mechanisms developed here, multi-hop communication becomes possible. Performance results from the implemented protocol are presented and analyzed with respect to important requirements for wireless sensor networks, such as robustness, the self-healing property and low latency. (c) 2011 Elsevier Ltd. All rights reserved.
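The greedy geographic forwarding that such a routing layer builds on can be sketched as below. The helper names and data structures are hypothetical, not the SMAC implementation:

```python
import math

def next_hop(current, dest, neighbors, positions):
    """Greedy geographic forwarding: among one-hop neighbors, pick the one
    closest to the destination; return None if no neighbor improves on the
    current node's own distance (a local minimum)."""
    def dist(a, b):
        (ax, ay), (bx, by) = positions[a], positions[b]
        return math.hypot(ax - bx, ay - by)

    best, best_d = None, dist(current, dest)
    for n in neighbors[current]:
        d = dist(n, dest)
        if d < best_d:
            best, best_d = n, d
    return best

def route(src, dest, neighbors, positions, max_hops=32):
    """Follow greedy next hops from src to dest to build a multi-hop path."""
    path = [src]
    while path[-1] != dest and len(path) <= max_hops:
        nh = next_hop(path[-1], dest, neighbors, positions)
        if nh is None:  # greedy forwarding failed
            return None
        path.append(nh)
    return path
```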
Abstract:
Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite-difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated by taking previously published results as a benchmark, and is then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To this end, the authors implemented a straightforward calculation method for the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
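The relaxation step for the Poisson equation can be illustrated in one dimension. The paper treats the two-dimensional equation on a non-uniform grid; this sketch uses a uniform 1D grid with zero Dirichlet boundary conditions, where the FTCS-style relaxation reduces to a Jacobi iteration:

```python
import numpy as np

def poisson_relax_1d(rho, h, tol=1e-10, max_iter=50000):
    """Relaxation (Jacobi / FTCS-style) solution of phi'' = -rho on a uniform
    1D grid of spacing h, with phi = 0 at both ends. Iterates until successive
    updates differ by less than tol."""
    phi = np.zeros_like(rho)
    for _ in range(max_iter):
        new = phi.copy()
        # Jacobi update: fixed point of the discrete Poisson equation.
        new[1:-1] = 0.5 * (phi[:-2] + phi[2:] + h * h * rho[1:-1])
        if np.max(np.abs(new - phi)) < tol:
            return new
        phi = new
    return phi
```

In a full Schrödinger-Poisson loop, this solve would alternate with the Schrödinger step: the potential feeds the wavefunctions, the resulting carrier density feeds the next Poisson solve, until self-consistency.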
Abstract:
This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill, which manufactures paper for cardboard from the pulp it produces. During the cooking of wood chips in the digester, two by-products are obtained: the pulp itself (virgin fibers) and a waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine; here, because of significant sequence-dependent setups in paper-type changeovers, lot sizing and sequencing must be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is then to synchronize the material flow as it moves through the pulp mill, the paper mill and the energy plant, maximizing satisfied customer demand (backlogging is allowed) and minimizing operating costs. Because P&P production is capital intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, we propose a new model that, for the first time, integrates the critical production units of the pulp mill, the paper mill and the energy plant. Simple local search heuristics based on stochastic mixed integer programming are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies increase their competitiveness and responsiveness in dealing with oscillations in demand patterns. (C) 2012 Elsevier Ltd. All rights reserved.
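The flavor of such a local search can be illustrated with a minimal pairwise-swap improvement over a lot sequence with sequence-dependent setup costs. This is a toy sketch of the kind of move a heuristic could embed, not the authors' MIP-based algorithms:

```python
def setup_cost(seq, setup):
    """Total sequence-dependent setup cost of a lot sequence,
    where setup[a][b] is the cost of changing over from type a to type b."""
    return sum(setup[a][b] for a, b in zip(seq, seq[1:]))

def improve_by_swaps(seq, setup):
    """First-improvement pairwise-swap local search on the lot sequence.
    Repeatedly accepts any swap of two positions that lowers total setup
    cost, until no swap improves."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for j in range(i + 1, len(seq)):
                cand = seq[:]
                cand[i], cand[j] = cand[j], cand[i]
                if setup_cost(cand, setup) < setup_cost(seq, setup):
                    seq, improved = cand, True
    return seq
```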
Abstract:
According to recent research carried out in the foundry sector, one of the industries' most important concerns is to improve their production planning. A foundry production plan involves two dependent stages: (1) determining the alloys to be melted and (2) determining the lots to be produced. The purpose of this study is to draw up minimum-cost production plans for the lot-sizing problem faced by small foundries. As suggested in the literature, the proposed heuristic addresses the two stages hierarchically: first the alloys are determined and then the items to be produced from them. We propose a knapsack problem as a tool to determine the items to be produced from each furnace load, and a genetic algorithm to explore candidate sets of alloys and to determine the production plan for a small foundry. Our method attempts to overcome the difficulties in finding good production plans exhibited by the method previously proposed in the literature. The computational experiments show that the proposed methods yield better results than that earlier method. Furthermore, the proposed methods do not require commercial software, which is favorable for small foundries. (C) 2010 Elsevier Ltd. All rights reserved.
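The knapsack subproblem for a single furnace load can be sketched with the standard 0/1 dynamic program. Interpreting weights as the items' metal requirements and values as their contribution to the plan is an assumption for illustration; the paper's exact cost structure may differ:

```python
def knapsack(capacity, weights, values):
    """0/1 knapsack by dynamic programming: choose which items (lots) to cast
    from one furnace load of limited capacity, maximizing total value.
    Capacity and weights are assumed to be non-negative integers."""
    # best[c] = best value achievable with remaining capacity c
    best = [0] * (capacity + 1)
    for i in range(len(weights)):
        # Iterate capacity downward so each item is used at most once.
        for c in range(capacity, weights[i] - 1, -1):
            best[c] = max(best[c], best[c - weights[i]] + values[i])
    return best[capacity]
```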
Abstract:
In this paper, a new algebraic-graph method for identifying islanding in power system grids is proposed. The method identifies all possible cases of islanding due to the loss of a piece of equipment by means of a factorization of the bus-branch incidence matrix. Its main features include: (i) simple implementation, (ii) high speed, (iii) real-time adaptability, (iv) identification of all islanding cases and (v) identification of the buses that compose each island when islands form. The method was successfully tested on large-scale systems such as the reduced south Brazilian system (45 buses/72 branches) and the south-southeast Brazilian system (810 buses/1340 branches). (C) 2011 Elsevier Ltd. All rights reserved.
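Although the paper identifies islands through a factorization of the bus-branch incidence matrix, the quantity it computes, the set of islands after an outage, can be illustrated with a plain graph-traversal sketch:

```python
def find_islands(n_buses, branches, removed=()):
    """Return the islands (connected components of buses) that remain after
    removing the branches whose indices are in `removed`. Buses are numbered
    0..n_buses-1; each branch is a pair (i, j) of bus indices. A BFS/DFS
    sketch, not the paper's factorization-based method."""
    removed = set(removed)
    adj = {b: [] for b in range(n_buses)}
    for k, (i, j) in enumerate(branches):
        if k not in removed:
            adj[i].append(j)
            adj[j].append(i)
    seen, islands = set(), []
    for start in range(n_buses):
        if start in seen:
            continue
        comp, stack = [], [start]
        seen.add(start)
        while stack:                 # depth-first traversal of one island
            u = stack.pop()
            comp.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        islands.append(sorted(comp))
    return islands
```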
Abstract:
The growing demand for industrial products is imposing an increasingly intense level of competitiveness on industrial operations. Meanwhile, the convergence of information technology (IT) and automation technology (AT) is proving to be a tool of great potential for the modernization and improvement of industrial plants. However, for this technology to fully achieve its potential, several obstacles need to be overcome, including demonstrating the reasoning behind the estimates of benefits, investments and risks used to plan the implementation of corporate technology solutions. This article focuses on the evolutionary development of processes for planning and adopting IT & AT convergence. It proposes incorporating IT & AT convergence practices into Lean Thinking/Six Sigma via a method for planning technology-convergence activities known as the Smarter Operation Transformation (SOT) methodology. The article illustrates the SOT methodology through its application at a Brazilian company in the consumer goods sector. This application shows that IT & AT convergence is possible with low investment, reducing the risk of not achieving the targets of key indicators.
Abstract:
This work deals with MEH-PPV thin films to be used as gamma radiation sensors. Polymer thin films of two different thicknesses (30 and 100 nm) were irradiated at room temperature with different gamma radiation doses (up to 25 kGy). The optical properties of the material were investigated with FTIR and UV-Vis absorption spectroscopy. The results show that gamma radiation does not substantially degrade the thin-film material, suggesting that a crosslinking effect may be occurring. The characteristic absorption peak of MEH-PPV, around 500 nm, is shifted to shorter wavelengths as the gamma radiation dose increases, for samples of both thicknesses. The 30-nm-thick samples showed a larger absorbance variation at a given wavelength and a larger peak shift. These results indicate the films' potential for use in monitoring the radiation doses employed in the sterilization of health care products.
Abstract:
In this work we introduce a relaxed version of the constant positive linear dependence constraint qualification (CPLD) that we call RCPLD. This development is inspired by a recent generalization of the constant rank constraint qualification by Minchenko and Stakhovski that was called RCRCQ. We show that RCPLD is enough to ensure the convergence of an augmented Lagrangian algorithm and that it asserts the validity of an error bound. We also provide proofs and counter-examples that show the relations of RCRCQ and RCPLD with other known constraint qualifications. In particular, RCPLD is strictly weaker than CPLD and RCRCQ, while still stronger than Abadie's constraint qualification. We also verify that the second order necessary optimality condition holds under RCRCQ.
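The relations among the constraint qualifications stated above can be summarized as an implication chain (strict arrows: the reverse implications fail, as the counter-examples show):

```latex
% CPLD implies RCPLD, which implies Abadie's constraint qualification;
% RCRCQ also implies RCPLD. None of these implications is reversible.
\[
  \text{CPLD} \;\Longrightarrow\; \text{RCPLD} \;\Longrightarrow\; \text{Abadie CQ},
  \qquad
  \text{RCRCQ} \;\Longrightarrow\; \text{RCPLD}.
\]
```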
Abstract:
In recent years, different beta titanium alloys have been developed for biomedical applications, combining mechanical properties such as a low Young's modulus, high strength, fatigue resistance and good ductility with excellent corrosion resistance. From this perspective, a new metastable beta titanium alloy, Ti-12Mo-3Nb, was developed by replacing both the vanadium and the aluminum of the traditional Ti-6Al-4V alloy. This paper presents the microstructure, mechanical properties and corrosion resistance of the Ti-12Mo-3Nb alloy heat-treated at 950 °C for 1 h. The material was characterized by X-ray diffraction and scanning electron microscopy. Tensile tests were carried out at room temperature, and corrosion tests were performed in Ringer's solution at 25 °C. The results showed that this alloy could potentially be used for biomedical purposes owing to its good mechanical properties and spontaneous passivation. (c) 2011 Elsevier B.V. All rights reserved.
Abstract:
We developed cationic liposomes containing DNA through a conventional process involving steps of (i) preformation of liposomes, (ii) extrusion, (iii) drying and rehydration and (iv) DNA complexation. Owing to the formulation's high prophylactic potential against tuberculosis, already demonstrated in preclinical assays, we introduced modifications into the conventional process to obtain a simpler and more economical process for further scale-up. The modifications studied were eliminating the extrusion step, increasing the lipid concentration of the preformed liposomes (from 16 to 64 mM) and using good-manufacturing-practice bulk lipids (96-98% purity) instead of analytical-grade lipids (99.9-100%). Liposomes prepared through the modified process differed from control liposomes prepared through the conventional process in physico-chemical properties such as average diameter, zeta potential, melting point and morphology, but these differences did not significantly affect biological properties such as DNA loading on the cationic liposomes and the effective immune response in mice after immunisation. Beneficially, the modified process increased productivity by 22% and reduced the cost of raw material by 75%.
Abstract:
Objectives: In recent years it has become known that metal devices for biomedical applications present disadvantages in some cases, suggesting absorbable materials (natural or synthetic) as an alternative of choice. Here, our goal was to evaluate the biological response to a xenogenic pin, derived from bovine cortical bone, implanted intraosseously in the femur of rats. Material and methods: At 10, 14, 30 and 60 days after implantation, the animals (n = 5 per period) were killed and the femurs carefully collected and dissected for histological processing. To identify the level of osteoclastogenesis at 60 days, we performed immunohistochemistry using an antibody against RANKL. Results: Interestingly, neutrophils and leukocytes were observed only at the beginning (10 days). Clear evidence of pin degradation by host cells appeared at 14 days and was most intense at 60 days, when we detected the largest presence of giant multinucleated cells, very similar to osteoclasts, contacting the implanted pin. To check osteoclastogenesis at 60 days, we evaluated RANKL expression, which was positive for the resident multinucleated cells, while new bone deposition was verified surrounding the pins in all evaluated periods. Conclusions: Altogether, our results showed that pins made from fully processed bovine bone are biocompatible and absorbable, allowing bone neoformation, and constitute a promising device for biomedical applications.
Abstract:
Cone beam computed tomography (CBCT) can be considered a valuable imaging modality for improving diagnosis and treatment planning, achieving true guidance for several craniofacial surgical interventions. A highlight of current discussion in medical informatics is the new interactive imaging workflow. The aim of this article is to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications for craniofacial surgical assessment. The article describes the state of the art of CBCT improvements and medical workstations, along with perspectives on the dedicated hardware and software that can be used with the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities and independent software, some of it open source, should be pursued with this new imaging method. The perspectives, challenges and pitfalls of CBCT are delineated and evaluated along with the technological developments.
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, its applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree related structural and semantic similarities, which are not sufficiently addressed while comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework to deal with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees), and to allow the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operations costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
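The edit-distance core of such comparisons can be sketched for ordered labeled trees. The recursion below computes a simplified "constrained" edit distance in which matched subtrees are compared against each other as wholes; it is exponential-time and not the full tree edit distance algorithms used in the literature, but it shows the idea of insert/delete/rename costs on tree structures:

```python
def size(tree):
    """Number of nodes in a (label, children) tree, children being a tuple."""
    label, children = tree
    return 1 + sum(size(c) for c in children)

def forest_dist(f1, f2):
    """Edit distance between ordered forests (tuples of trees) with unit
    insert/delete/rename costs. Simplified 'constrained' variant: once two
    roots are matched, their subtrees are compared as wholes."""
    if not f1:
        return sum(size(t) for t in f2)  # insert everything remaining in f2
    if not f2:
        return sum(size(t) for t in f1)  # delete everything remaining in f1
    (l1, c1), r1 = f1[0], f1[1:]
    (l2, c2), r2 = f2[0], f2[1:]
    return min(
        1 + forest_dist(c1 + r1, f2),            # delete root of first tree
        1 + forest_dist(f1, c2 + r2),            # insert root of second tree
        (l1 != l2)                               # rename cost (0 if equal)
        + forest_dist(c1, c2) + forest_dist(r1, r2),
    )

def tree_dist(t1, t2):
    """Edit distance between two ordered labeled trees."""
    return forest_dist((t1,), (t2,))
```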