983 results for General applications
Abstract:
It is well known that a significant fraction of the bits in a program's variables are unused or carry no useful information during execution. Bit-width analysis aims to find the minimum number of bits needed for each variable while guaranteeing correct execution, thereby saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, one that approximates conservatively at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored to floating-point variables and, combined with precision analysis, could be extended to transform floating-point numbers into fixed-point numbers. We use more precise representations for the value ranges of both scalar and array variables, and carry out element-level analysis for arrays. We also suggest an alternative to the standard fixed-point iteration in bidirectional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and evaluated on a set of benchmarks.
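The core idea can be illustrated with a toy forward range propagation, a minimal sketch and not the paper's Trimaran implementation: each variable carries a conservative value interval, and a safe unsigned bit-width follows from the interval's upper bound.

```python
def bits_unsigned(hi):
    """Minimum bits needed to represent every unsigned value in [0, hi]."""
    return max(1, hi.bit_length())

def add_range(a, b):
    # Interval addition: [lo1 + lo2, hi1 + hi2]
    return (a[0] + b[0], a[1] + b[1])

def mul_range(a, b):
    # Interval multiplication: take extremes over all endpoint products.
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

# Toy example: x in [0, 100], y in [0, 3]; how wide must z = x*y + 7 be?
x, y = (0, 100), (0, 3)
z = add_range(mul_range(x, y), (7, 7))
print(z, bits_unsigned(z[1]))  # (7, 307) 9
```

A real analysis would iterate such transfer functions to a fixed point over the whole program; the abstract's "alternative to the standard fixed-point iteration" targets exactly that step.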
Abstract:
This paper develops a novel, fully analytic model for vibration analysis of solid-state electronic components. The model is as accurate as finite element models yet numerically light enough to permit quick design trade-offs and statistical analysis. The paper presents the development of the model, a comparison to finite elements, and an application to a common engineering problem. A gull-wing flat pack component was selected as the benchmark test case, although the presented methodology is applicable to a wide range of component packages. Results showed very good agreement between the presented method and finite elements and demonstrated how the method can be used with standard test data in a general application. © 2013 Elsevier Ltd.
Abstract:
The aim of this paper is to discuss the use of knowledge models to formulate general applications. First, the paper presents the recent evolution of the software field, in which increasing attention is paid to conceptual modeling. Then, the current state of knowledge modeling techniques is described, whose reliability has increased through modern knowledge acquisition techniques and supporting tools. The KSM (Knowledge Structure Manager) tool is described next. First, the concept of a knowledge area is introduced as a building block that groups methods to perform a collection of tasks together with the bodies of knowledge providing the basic methods to perform the basic tasks. Then, the CONCEL language to define domain vocabularies and the LINK language for method formulation are introduced. Finally, the object-oriented implementation of a knowledge area is described, and a general methodology for application design and maintenance supported by KSM is proposed. To illustrate the concepts and methods, an example of a system for intelligent traffic management in a road network is described, followed by a proposal for generalizing the resulting architecture for reuse. Finally, some concluding comments are offered on the feasibility of using knowledge modeling tools and methods for general application design.
Abstract:
A lightweight Java application suite has been developed and deployed to enable collaborative learning between students and tutors at remote locations. Students can engage in group activities online and also collaborate with tutors. A generic Java framework has been developed and applied to electronics, computing and mathematics education. The applications are, respectively: (a) a digital circuit simulator, which allows students to collaborate in building simple or complex electronic circuits; (b) a Java programming environment whose paradigm is behaviour-based robotics; and (c) a differential equation solver useful in the modelling of complex and nonlinear dynamic systems. Each student sees a common shared window to which text or graphical objects may be added and which is then shared online. A built-in chat room supports collaborative dialogue. Students can work either in collaborative groups or in teams as directed by the tutor. This paper summarises the technical architecture of the system as well as the pedagogical implications of the suite. A report of student evaluation, distilled from twelve months of use, is also presented. We intend this suite to facilitate learning between groups at one or many institutions and to facilitate international collaboration. We also intend to use the suite as a tool to research the establishment and behaviour of collaborative learning groups. We shall make our software freely available to interested researchers.
Abstract:
Identification of post-translational modifications of proteins in biological samples often requires access to preanalytical purification and concentration methods. In the purification step, high or low molecular weight substances can be removed by size-exclusion filters, and highly abundant proteins can be removed, or low-abundance proteins enriched, by specific capturing tools. This paper describes the experience and results obtained with a recently emerged, easy-to-use affinity purification kit for enrichment of the low amounts of EPO found in urine and plasma specimens. The kit can be used as a pre-step in the EPO doping control procedure, as an alternative to the commonly used ultrafiltration, for detecting aberrantly glycosylated isoforms. The commercially available affinity purification kit contains small disposable anti-EPO monolith columns (6 µL volume, Ø 7 mm, length 0.15 mm) together with all required buffers. A 24-channel vacuum manifold was used for simultaneous processing of samples. The column concentrated EPO from 20 mL of urine down to a 55 µL eluate, a concentration factor of 240, while roughly 99.7% of non-relevant urine proteins were removed. The recoveries of Neorecormon (epoetin beta) and the EPO analogues Aranesp and Mircera applied to buffer were high: 76%, 67% and 57%, respectively. The recovery of endogenous EPO from human urine was 65%. High recoveries were also obtained when purifying human, mouse and equine EPO from serum, and human EPO from cerebrospinal fluid. Evaluation with the accredited EPO doping control method based on isoelectric focusing (IEF) showed that the affinity purification procedure did not change the isoform distribution for rhEPO, Aranesp, Mircera or endogenous EPO. The kit should be particularly useful for applications in which it is essential to avoid carry-over effects, a problem commonly encountered with conventional particle-based affinity columns.
The encouraging results with EPO suggest that similar affinity monoliths, with appropriate antibodies, should constitute useful tools for general applications in sample preparation: not only for doping control of EPO and other hormones such as growth hormone and insulin, but also for the study of post-translational modifications of other low-abundance proteins in biological and clinical research, and for sample preparation prior to in vitro diagnostics.
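As an illustrative cross-check of the reported figures (our own arithmetic, not part of the paper): concentrating 20 mL of urine into a 55 µL eluate at the measured 65% recovery yields an effective enrichment consistent with the stated concentration factor of about 240.

```python
start_ul = 20_000   # 20 mL of urine, in microlitres
eluate_ul = 55      # final eluate volume
recovery = 0.65     # recovery of endogenous EPO from urine

volume_ratio = start_ul / eluate_ul          # enrichment at 100% recovery
effective_factor = volume_ratio * recovery   # enrichment at 65% recovery
print(round(volume_ratio), round(effective_factor))  # 364 236
```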
Abstract:
BACKGROUND: Expression of heterologous genes in mammalian cells or organisms for therapeutic or experimental purposes often requires tight control of transgene expression. Specifically, the following criteria should be met: no background gene activity in the off-state, high gene expression in the on-state, regulated expression over an extended period, and multiple switching between on- and off-states. METHODS: Here, we describe a genetic switch system for controlled transgene transcription using chimeric repressor and activator proteins functioning in a novel regulatory network. In the off-state, the target transgene is actively silenced by a chimeric protein consisting of multimerized eukaryotic transcriptional repression domains fused to the DNA-binding tetracycline repressor. In the on-state, the inducer drug doxycycline affects both the derepression of the target gene promoter and activation by the GAL4-VP16 transactivator, which in turn is under the control of an autoregulatory feedback loop. RESULTS: The hallmark of this new system is the efficient transgene silencing in the off-state, as demonstrated by the tightly controlled expression of the highly cytotoxic diphtheria toxin A gene. Addition of the inducer drug allows robust activation of transgene expression. In stably transfected cells, this control is still observed after months of repeated cycling between the repressed and activated states of the target genes. CONCLUSIONS: This system permits tight long-term regulation when stably introduced into cell lines. The underlying principles of this network system should have general applications in biotechnology and gene therapy.
Abstract:
This thesis is divided into two main topics. The first concerns the development of a microwave-assisted hydrolysis of thiazolidines to obtain α-substituted cysteines. The second focuses on the development of a methodology for the catalytic enantioselective synthesis of 1,1-diacceptor alkylidenecyclopropanes. First, the roles and uses of quaternary amino acids, more specifically α-substituted cysteines, are discussed, followed by a review of the various enantioselective methods for accessing these units. Next, the development of a fast and efficient method of hydrolysis of thiazolines under microwave irradiation is presented. Finally, the studies leading to the application of this method to the large-scale synthesis of α-substituted cysteines using continuous-flow reactors and high-throughput screening are detailed. In the second part, the applications and general syntheses of alkylidenecyclopropanes in organic synthesis are described. In particular, the specific applications of 1,1-diacceptor alkylidenecyclopropanes, as well as their syntheses, are treated exhaustively. Next, the development of a catalytic enantioselective methodology for the synthesis of 1,1-diacceptor alkylidenecyclopropanes is presented. The extension of this methodology to the synthesis of cyclopropane and cyclopropene derivatives, as well as the application of stereospecific reactions to 1,1-diacceptor alkylidenecyclopropanes, are briefly discussed.
Abstract:
ABSTRACT: This thesis points out the importance of applying marketing to cultural institutions, particularly as a means of attracting audiences and building their loyalty. To that end, we studied the Cinema-Teatro Joaquim d'Almeida, in Montijo, carrying out a general analysis of its programming, communication and audiences since its reopening in 2005 as a municipal cultural facility, and a more detailed analysis of its most recent season, the year 2009-2010. A theoretical investigation was then conducted into Portugal's cultural environment and the general applications of marketing to culture and services. The case study continued with an analysis of the documentation supplied by the institution, and marketing research was carried out on the Theater's audiences in order to understand their general opinion of its offers and services.
With this procedure, we aimed to suggest a set of marketing strategies to improve the relationship between the institution and its audiences, satisfying the consumers ever better while simultaneously benefiting the institution.
Abstract:
Much has been written about where the boundaries of the firm are drawn, but little about what occurs at the boundaries themselves. When a firm subcontracts, does it inform its suppliers fully of what it requires, or is it willing to accept what they have available? In practice firms often engage in a dialogue, or conversation, with their suppliers, in which at first they set out their general requirements, and only when the supplier reports back on how these can be met are their more specific requirements set out. This paper models such conversations as a rational response to communication costs. The model is used to examine the impact of new information technology, such as CAD/CAM, on the conduct of subcontracting. It can also be used to examine its impact on the marketing activities of firms. The technique of analysis, which is based on the economic theory of teams, has more general applications too. It can be used to model all the forms of dialogue involved in the processes of coordination both within and between firms.
Abstract:
One of the most important decisions when making a substation automated and unattended concerns the communication medium between the substation and the Operation Center. Energy companies generally use radio or optical fiber, depending on the distances and infrastructure of each situation. This rule applies to common substations. Mobile substations are a particular case, since they are designed for provisional situations: emergencies and preventive or corrective maintenance. Thus, the telecommunication solutions used at common substations do not apply easily to mobile substations, owing to the absence of infrastructure (media) or the difficulty of inserting the mobile substation's data into the existing automation network. The ideal medium must provide coverage over a large geographic area to satisfy these requirements. Deploying such a large infrastructure is expensive; an existing operator, however, may be used. Two services that fulfill these requirements are satellite and cellular telephony. This work presents a solution for the automation of mobile substations via satellite. It was successfully deployed at COSERN, a Brazilian electric energy utility, and its operation became transparent to operators. Other benefits obtained were operational security, quality of the electric energy supply, and cost reduction. The project presented is a new solution, designed for substations and general applications where little data must be transmitted but the communication medium poses difficulties. Although satellite was used, the same results can be obtained using cellular telephony, through Short Messages (SMS) or packet networks such as GPRS or EDGE.
Abstract:
All around the world there are naturally occurring hydrocarbon deposits, consisting of oil and gas contained within rocks called reservoir rocks, generally sandstone or carbonate. These deposits occur under varying conditions of pressure and at depths from a few hundred to several thousand meters. In general, shallow reservoirs have a greater tendency to fracture, since they have a low fracture gradient; that is, fractures form even under relatively low hydrostatic fluid columns. These low-fracture-gradient areas are particularly common onshore, as in the Rio Grande do Norte basin. During well drilling, one of the phases most prone to fracturing is cementing, since the cement slurry used can have a density greater than the maximum allowed by the rock structure. Furthermore, in areas that are already naturally fractured, the use of regular cement slurries causes fluid loss into the formation, which may give rise to cementing failures and formation damage. Commercially, there are alternatives for the development of lightweight cement slurries, but these fail because of their enormous cost, or because the cement properties are not good enough for general applications and are restricted to the particular operation for which the slurry was designed, or both. In this work a statistical design was carried out to determine the influence of three variables, the concentrations of calcium chloride, vermiculite and nanosilica, on the various properties of the cement. The use of vermiculite, a low-density ore present in large amounts in northeastern Brazil, as an extender for cementing slurries enabled the production of stable cements with a high water/cement ratio, excellent rheological properties and low densities, which were set at 12.5 lb/gal, although lower densities could be achieved.
It was also found that calcium chloride is very useful as a gelling and thickening agent, and that its use in combination with nanosilica has a great effect on the gel strength of the cement. Hydrothermal stability studies showed that the pastes were stable under these conditions, and mechanical resistance tests showed values of up to about 10 MPa.
Abstract:
In this work we studied the method for solving systems of linear equations presented in the book "The Nine Chapters on the Mathematical Art", written in the first century of this era. This work aims to show how the history of mathematics can be used to motivate the introduction of some topics in high school. Through observation of the patterns that repeat in the presented method, we were able to introduce, in a very natural way, the concepts of linear equations, systems of linear equations, solutions of linear equations, determinants and matrices, as well as the Laplace expansion for calculating determinants of square matrices of order greater than 3, and then to consider some of their general applications.
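The elimination procedure of the Nine Chapters (the fangcheng rule) is essentially what is now taught as Gaussian elimination. A minimal Python sketch, applied to the classic three-grades-of-grain problem from Chapter 8 of the book:

```python
from fractions import Fraction

def solve_linear_system(a, b):
    """Solve A x = b by elimination and back-substitution.
    Uses exact rational arithmetic; assumes a unique solution exists."""
    n = len(b)
    m = [[Fraction(v) for v in row] + [Fraction(bi)]
         for row, bi in zip(a, b)]
    for col in range(n):
        # Swap a row with a nonzero pivot into position.
        pivot = next(r for r in range(col, n) if m[r][col] != 0)
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate the column entries below the pivot.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            m[r] = [x - factor * y for x, y in zip(m[r], m[col])]
    # Back-substitution.
    x = [Fraction(0)] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

# 3x + 2y + z = 39, 2x + 3y + z = 34, x + 2y + 3z = 26
sol = solve_linear_system([[3, 2, 1], [2, 3, 1], [1, 2, 3]], [39, 34, 26])
print(sol)  # [Fraction(37, 4), Fraction(17, 4), Fraction(11, 4)]
```

Exact fractions mirror the historical computation, which the text carried out with counting rods rather than decimal approximations.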
Abstract:
This work considers the reconstruction of strong gravitational lenses from their observed effects on the light distribution of background sources. After reviewing the formalism of gravitational lensing and the most common and relevant lens models, new analytical results on the elliptical power-law lens are presented, including new expressions for the deflection, potential, shear and magnification, which naturally lead to a fast numerical scheme for practical calculation. The main part of the thesis investigates lens reconstruction with extended sources by means of the forward reconstruction method, in which the lenses and sources are given by parametric models. The numerical realities of the problem make it necessary to find targeted optimisations for the forward method in order to make it feasible for general applications to modern, high-resolution images. The result of these optimisations is presented in the Lensed algorithm. Subsequently, a number of tests for general forward reconstruction methods are created to decouple the influence of the source from that of the lens reconstruction, in order to objectively demonstrate the constraining power of the reconstruction. The final chapters on lens reconstruction contain two sample applications of the forward method. One is the analysis of images from a strong lensing survey. Such surveys today contain ~100 strong lenses, and much larger sample sizes are expected in the future, making it necessary to quickly and reliably analyse catalogues of lenses with a fixed model. The second application deals with the opposite situation of a single observation that is to be confronted with different lens models, where the forward method allows for natural model building. This is demonstrated using an example reconstruction of the "Cosmic Horseshoe".
An appendix presents an independent work on the use of weak gravitational lensing to investigate theories of modified gravity which exhibit screening in the non-linear regime of structure formation.
Abstract:
The paper provides evidence that spatial indexing structures offer faster resolution of Formal Concept Analysis queries than B-Tree/Hash methods. We show that many Formal Concept Analysis operations, such as computing contingent and extent sizes and listing the matching objects, enjoy improved performance with spatial indexing structures such as the RD-Tree. Speed improvements of up to eighty times were observed, depending on the data and query. The motivation for our study is the application of Formal Concept Analysis to Semantic File Systems, where millions of formal objects must be dealt with. Spatial indexing has also been found to provide an effective indexing technique for more general-purpose applications requiring scalability in Formal Concept Analysis systems. The coverage and benchmarking are presented with general applications in mind.
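For context, the two query types benchmarked here, extent and contingent, can be stated in a few lines over a toy formal context (a hypothetical tag-annotated file set, not the paper's data; a spatial index like the RD-Tree accelerates exactly these subset-containment lookups):

```python
def extent(context, attrs):
    """Objects possessing every attribute in attrs (an FCA extent query)."""
    return {obj for obj, has in context.items() if attrs <= has}

def contingent(context, attrs):
    """Objects whose attribute set is exactly attrs (a concept's own objects)."""
    return {obj for obj, has in context.items() if has == attrs}

# Hypothetical Semantic File System context: files mapped to tag sets.
context = {
    "report.pdf": {"doc", "2023"},
    "notes.txt": {"doc"},
    "photo.jpg": {"image", "2023"},
}
print(sorted(extent(context, {"doc"})))   # ['notes.txt', 'report.pdf']
print(len(extent(context, {"2023"})))     # 2
```

The naive scan above is linear in the number of objects; the paper's point is that an RD-Tree over the attribute sets answers the same containment queries without touching every object.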