Abstract:
This article describes the work developed to adapt metadata conforming to the official Colombian metadata standard NTC 4611 to the international standard ISO 19115. CatMDedit, an open-source metadata editor, is used in this task. CatMDedit is able to import variants of CSDGM, such as NTC 4611, and to export to the stable version of ISO 19139 (the XML implementation model of ISO 19115).
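To illustrate the kind of crosswalk such a tool performs, the sketch below maps a single CSDGM/NTC 4611 element (the dataset title) onto its ISO 19139 counterpart. It is a minimal, hypothetical example, not CatMDedit's actual implementation; the input file name and the restriction to one field are assumptions.

```python
# Minimal sketch of a CSDGM/NTC 4611 -> ISO 19139 crosswalk for one field.
# Hypothetical illustration; CatMDedit's real mapping covers the full standards.
import xml.etree.ElementTree as ET

GMD = "http://www.isotc211.org/2005/gmd"   # ISO 19139 metadata namespace
GCO = "http://www.isotc211.org/2005/gco"   # ISO 19139 basic types namespace
ET.register_namespace("gmd", GMD)
ET.register_namespace("gco", GCO)

def csdgm_title_to_iso19139(csdgm_file: str) -> ET.Element:
    # CSDGM (and its NTC 4611 variant) stores the title under
    # metadata/idinfo/citation/citeinfo/title.
    title = ET.parse(csdgm_file).findtext("idinfo/citation/citeinfo/title")

    # Rebuild the equivalent ISO 19139 element chain down to gco:CharacterString.
    root = ET.Element(f"{{{GMD}}}MD_Metadata")
    node = root
    for tag in ("identificationInfo", "MD_DataIdentification",
                "citation", "CI_Citation", "title"):
        node = ET.SubElement(node, f"{{{GMD}}}{tag}")
    ET.SubElement(node, f"{{{GCO}}}CharacterString").text = title
    return root

iso_root = csdgm_title_to_iso19139("ntc4611_record.xml")  # assumed input file
print(ET.tostring(iso_root, encoding="unicode"))
```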
Abstract:
Introduction: Treatment with tumor necrosis factor alpha antagonists (anti-TNF) has positively impacted the prognosis and quality of life of patients with rheumatoid arthritis (RA); however, a possible increase in the risk of developing melanoma has been questioned. Objective: To determine the association between the use of anti-TNF agents and the development of malignant melanoma in patients with RA. Methods: A systematic search was performed in MEDLINE, EMBASE, COCHRANE LIBRARY and LILACS for clinical trials, observational studies, reviews and meta-analyses in adult patients diagnosed with RA and treated with anti-TNF agents (certolizumab pegol, adalimumab, etanercept, infliximab and golimumab). Results: 37 clinical studies met the inclusion criteria for the meta-analysis, with a population of 16,567 patients. The heterogeneity analysis was not significant (p=1); no difference in risk was found between the compared groups, RD -0.00 (95% CI -0.001; -0.001). An additional analysis of the studies reporting at least one case of melanoma (4,222 patients) likewise showed no difference in risk, RD -0.00 (95% CI -0.004; -0.003). Conclusion: In the evidence available to date, we found no significant association between anti-TNF treatment in patients diagnosed with RA and the development of cutaneous melanoma.
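For readers unfamiliar with the effect measure reported above, the sketch below computes a fixed-effect (inverse-variance) pooled risk difference from per-study 2x2 counts. The counts are invented for illustration; the paper's actual data and meta-analytic software are not reproduced here.

```python
# Fixed-effect (inverse-variance) pooling of risk differences (RD).
# The study counts below are invented for illustration only.
studies = [
    # (events in anti-TNF arm, N anti-TNF, events in control arm, N control)
    (1, 1200, 1, 900),
    (0, 800, 1, 750),
    (2, 1500, 1, 1100),
]

num = den = 0.0
for e1, n1, e0, n0 in studies:
    p1, p0 = e1 / n1, e0 / n0
    rd = p1 - p0                                   # per-study risk difference
    var = p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0  # variance of the RD
    w = 1 / var if var > 0 else 0.0                # inverse-variance weight
    num += w * rd
    den += w

pooled_rd = num / den
se = den ** -0.5                                   # standard error of pooled RD
print(f"pooled RD = {pooled_rd:.4f}, "
      f"95% CI ({pooled_rd - 1.96 * se:.4f}; {pooled_rd + 1.96 * se:.4f})")
```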
Abstract:
Abstract based on that of the publication.
Abstract:
When publishing information on the web, one expects it to reach all the people that could be interested in it. This is mainly achieved with general-purpose indexing and search engines like Google, which is the most used today. In the particular case of the geographic information (GI) domain, exposing content to mainstream search engines is a complex task that needs specific actions. On many occasions it is convenient to provide a web site with a specially tailored search engine. Such is the case for on-line dictionaries (Wikipedia, WordReference), stores (Amazon, eBay), and generally all those holding thematic databases. Due to the proliferation of these engines, A9.com proposed a standard interface called OpenSearch, used by modern web browsers to manage custom search engines. Geographic information can also benefit from the use of specific search engines. We can distinguish between two main approaches in GI retrieval efforts: on one hand, classical OGC standardization (CSW, WFS filters), which is very complex for the mainstream user, and on the other hand the neogeographer's approach, usually in the form of specific APIs lacking a common query interface and standard geographic formats. A draft 'geo' extension for OpenSearch has been proposed. It adds geographic filtering for queries and recommends a set of simple standard response geographic formats, such as KML, Atom and GeoRSS. This proposal enables standardization while keeping simplicity, thus covering a wide range of use cases in both the OGC and neogeography paradigms. In this article we analyze the OpenSearch geo extension in detail along with its use cases, demonstrating its applicability to both the SDI and the geoweb. Open source implementations are presented as well.
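To make the extension concrete, the sketch below fills an OpenSearch-style URL template with a search term and the geo extension's bounding-box parameter (geo:box, defined in the draft as west,south,east,north). The endpoint URL is hypothetical; the parameter names follow the draft 'geo' extension.

```python
# Building a query against a hypothetical OpenSearch endpoint that supports
# the draft 'geo' extension. The template mirrors an OpenSearch description
# document's <Url> element: .../search?q={searchTerms}&bbox={geo:box}
from urllib.parse import urlencode

def geo_search_url(terms: str, west: float, south: float,
                   east: float, north: float) -> str:
    params = {
        "q": terms,                                # {searchTerms}
        "bbox": f"{west},{south},{east},{north}",  # {geo:box}: west,south,east,north
    }
    return "https://example.org/search?" + urlencode(params)  # hypothetical endpoint

# All results about 'rivers' inside a box roughly covering the Ebro basin.
print(geo_search_url("rivers", -1.9, 41.0, 1.5, 43.0))
```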
Abstract:
An instrument is described which carries three orthogonal geomagnetic field sensors on a standard meteorological balloon package, to sense rapid motion and position changes during ascent through the atmosphere. Because of the finite data bandwidth available over the UHF radio link, a burst sampling strategy is adopted: bursts of 9 s of measurements at 3.6 Hz are interleaved with periods of slow data telemetry lasting 25 s. Calculation of the variability in each channel is used to determine position changes, a method robust to periods of poor radio signal. During three balloon ascents, variability was found repeatedly at similar altitudes, simultaneously in each of the three orthogonal sensors carried. This variability is attributed to atmospheric motions. It is found that the vertical sensor is least prone to stray motions, and that the use of two horizontal sensors provides no additional information over a single horizontal sensor.
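The variability calculation described above can be illustrated with a short sketch: for each 9 s burst (about 32 samples at 3.6 Hz), compute the standard deviation of each magnetometer channel and flag bursts whose variability exceeds a threshold. The threshold and data layout are assumptions for illustration.

```python
# Per-burst variability in three orthogonal magnetometer channels.
# A 9 s burst at 3.6 Hz gives ~32 samples; the threshold is an assumed value.
import statistics

SAMPLES_PER_BURST = 32          # ~9 s at 3.6 Hz
THRESHOLD = 0.05                # assumed variability threshold (arbitrary units)

def burst_variability(channel: list[float]) -> list[float]:
    """Standard deviation of each complete burst in one channel."""
    bursts = [channel[i:i + SAMPLES_PER_BURST]
              for i in range(0, len(channel) - SAMPLES_PER_BURST + 1,
                             SAMPLES_PER_BURST)]
    return [statistics.stdev(b) for b in bursts]

def motion_flags(x: list[float], y: list[float], z: list[float]) -> list[bool]:
    """Flag bursts where any of the three orthogonal channels is disturbed."""
    return [max(sx, sy, sz) > THRESHOLD
            for sx, sy, sz in zip(*(burst_variability(c) for c in (x, y, z)))]
```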
Abstract:
The Perspex Machine arose from the unification of computation with geometry. We now report significant redevelopment of both a partial C compiler that generates perspex programs and of a Graphical User Interface (GUI). The compiler is constructed with standard compiler-generator tools and produces both an explicit parse tree for C and an Abstract Syntax Tree (AST) that is better suited to code generation. The GUI uses a hash table and a simpler software architecture to achieve an order-of-magnitude speed-up in processing and, consequently, an order-of-magnitude increase in the number of perspexes that can be manipulated in real time (now 6,000). Two perspex-machine simulators are provided, one using trans-floating-point arithmetic and the other using transrational arithmetic. All of the software described here is available on the world wide web. The compiler generates code in the neural model of the perspex. At each branch point it uses a jumper to return control to the main fibre. This has the effect of pruning out an exponentially increasing number of branching fibres, thereby greatly increasing the efficiency of perspex programs as measured by the number of neurons required to implement an algorithm. The jumpers are placed at unit distance from the main fibre and form a geometrical structure analogous to a myelin sheath in a biological neuron. Both the perspex jumper-sheath and the biological myelin sheath share the computational function of preventing cross-over of signals to neurons that lie close to an axon. This is an example of convergence driven by similar geometrical and computational constraints in perspex and biological neurons.
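The transrational arithmetic mentioned above is a total arithmetic in which division never fails. A minimal sketch follows, assuming the usual transreal conventions (n/0 = infinity for n > 0, -n/0 = -infinity, 0/0 = nullity); it is an illustration of the idea, not the simulator's code.

```python
# Sketch of transrational division: a total operation with no division-by-zero
# errors, following the usual transreal conventions. Illustrative only; not the
# Perspex Machine simulator's implementation.
from fractions import Fraction

INF, NEG_INF, NULLITY = "inf", "-inf", "nullity"   # non-finite transrationals

def trans_div(num: int, den: int):
    if den != 0:
        return Fraction(num, den)   # ordinary rational result
    if num > 0:
        return INF                  # n/0 = infinity
    if num < 0:
        return NEG_INF              # -n/0 = -infinity
    return NULLITY                  # 0/0 = nullity (Phi)

print(trans_div(3, 4), trans_div(1, 0), trans_div(-1, 0), trans_div(0, 0))
```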
Abstract:
Passive samplers have predominantly been used to monitor environmental conditions in single volumes. However, measurements using a calibrated passive sampler, a solid-phase microextraction (SPME) fibre, in three houses with cold pitched roofs successfully demonstrated the potential of the SPME fibre as a device for monitoring air movement in two volumes. The roofs monitored were pitched at 15°-30°, with insulation thickness varying between 200 and 300 mm on the ceiling. For effective analysis, two constant sources of volatile organic compounds were diffused steadily in the house. Emission rates and air movement from the house to the roof were predicted using developed algorithms. The airflow rates, which were calibrated against conventional tracer gas techniques, were introduced into a HAM software package to predict the effects of air movement on other varying parameters. On average, the in situ measurements showed that about 20-30% of the air entering the three houses left through gaps and cracks in the ceiling into the roof. Although these field measurements focus on the airflows, they are associated with energy benefits: if these flows are reduced, then energy losses would also be significantly reduced (as modelled), consequently improving the energy efficiency of the house. Other results illustrated that condensation formation risks depended on the airtightness of the building envelopes, including the configurations of their roof constructions.
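The underlying principle resembles the constant-injection tracer gas method: at steady state, the airflow out of a volume equals the tracer emission rate divided by the steady-state concentration rise. The sketch below implements that balance; the numbers are assumed, and the paper's actual SPME calibration algorithms are not reproduced.

```python
# Constant-injection tracer balance: at steady state, Q = E / (C_room - C_outdoor).
# Values are assumed for illustration; not the paper's SPME calibration algorithm.

def airflow_rate(emission_rate_mg_h: float,
                 c_room_mg_m3: float,
                 c_outdoor_mg_m3: float = 0.0) -> float:
    """Volumetric airflow (m^3/h) inferred from a steadily emitted tracer."""
    return emission_rate_mg_h / (c_room_mg_m3 - c_outdoor_mg_m3)

q_house = airflow_rate(emission_rate_mg_h=50.0, c_room_mg_m3=0.4)   # 125 m^3/h
q_to_roof = 0.25 * q_house     # ~20-30% of incoming air leaves via the ceiling
print(f"house airflow: {q_house:.0f} m3/h, ceiling leakage: {q_to_roof:.0f} m3/h")
```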
Abstract:
This paper deals with the integration of radial basis function (RBF) networks into the industrial software control package Connoisseur. The paper shows the improved modelling capabilities offered by RBF networks within the Connoisseur environment compared to linear modelling techniques such as recursive least squares. The paper also goes on to discuss the way this improved modelling capability, obtained through the RBF networks, will be utilised within Connoisseur.
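As background, an RBF network models a response as a weighted sum of radial (here Gaussian) basis functions of the input; once the centres and widths are fixed, finding the output weights is a linear least-squares problem. The sketch below is a generic illustration of that structure, not Connoisseur's implementation.

```python
# Generic Gaussian RBF network: nonlinear in the inputs, but linear in the
# output weights once centres and widths are fixed. Not Connoisseur's code.
import numpy as np

def design_matrix(x: np.ndarray, centres: np.ndarray, width: float) -> np.ndarray:
    # phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
    d2 = (x[:, None] - centres[None, :]) ** 2
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(x.size)   # toy nonlinear plant data

centres = np.linspace(-3, 3, 12)                        # fixed basis centres
phi = design_matrix(x, centres, width=0.5)
w, *_ = np.linalg.lstsq(phi, y, rcond=None)             # linear least squares

rmse = np.sqrt(np.mean((phi @ w - y) ** 2))
print(f"training RMSE: {rmse:.3f}")
```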
Abstract:
Despite the increasing use of groupware technologies in education, there is little evidence of their impact, especially within an enquiry-based learning (EBL) context. In this paper, we examine the use of a commercial, standard Group Intelligence software package called GroupSystems® ThinkTank. To date, ThinkTank has been adopted mainly in the USA; it supports teams in generating ideas, categorising, prioritising, voting and multi-criteria decision-making, and automatically generates a report at the end of each session. The software was used by students carrying out an EBL project, set by employers, for a full academic year. The criteria for assessing the impact of ThinkTank on student learning were creativity, participation, productivity, engagement and understanding. Data were collected throughout the year using a combination of interviews and questionnaires, and written feedback from employers. The overall findings show an increase in levels of productivity and creativity, evidence of a deeper understanding of their work, but some variation in attitudes towards participation in the early stages of the project.
Abstract:
In this study, the performance, yield and characteristics of a 16-year-old photovoltaic (PV) system installation have been investigated. The technology, BP Saturn modules based on steel-blue polycrystalline silicon cells, is no longer in production. A bespoke monitoring system has been designed to monitor the characteristics of 6 refurbished strings of 18 modules connected in series. The total output of the system is configured to 6.5 kWp (series-to-parallel configuration). In addition to experimental results, the performance ratio (PR) for known values was simulated using PVSyst, a simulation software package. From calculations using experimental values, the PV system showed power outputs approximately 10% lower than would be expected under standard test conditions. However, efficiency values in comparison to standard test conditions and the performance ratio (≈75% from PVSyst simulations) have remained practically the same over the past decade. This result, though very relevant to the possible performance and stability of aging cells, requires additional parametric studies to develop a more robust argument. The results presented in this paper are part of an on-going investigation into PV system aging effects.
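The performance ratio compares the energy actually delivered with what the nameplate rating would produce under the measured irradiation; a common formulation (IEC 61724) is PR = (E_AC / P_STC) / (H_POA / G_STC). The sketch below applies it to assumed numbers, not the paper's measurements.

```python
# Performance ratio per IEC 61724: final yield over reference yield.
# Input values are assumed for illustration, not the paper's measurements.

def performance_ratio(e_ac_kwh: float,      # AC energy delivered over the period
                      p_stc_kwp: float,     # array nameplate rating at STC
                      h_poa_kwh_m2: float,  # in-plane irradiation over the period
                      g_stc_kw_m2: float = 1.0) -> float:
    final_yield = e_ac_kwh / p_stc_kwp          # kWh per kWp installed
    reference_yield = h_poa_kwh_m2 / g_stc_kw_m2
    return final_yield / reference_yield

# A 6.5 kWp array delivering 5,400 kWh in a year with 1,100 kWh/m2 in-plane:
pr = performance_ratio(e_ac_kwh=5400, p_stc_kwp=6.5, h_poa_kwh_m2=1100)
print(f"PR = {pr:.2f}")   # ~0.76, of the same order as the ~75% simulated above
```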
Abstract:
Adequate initial configurations for molecular dynamics simulations consist of arrangements of molecules distributed in space in such a way as to approximately represent the system's overall structure. In order that the simulations are not disrupted by large van der Waals repulsive interactions, atoms from different molecules must keep safe pairwise distances. Obtaining such a molecular arrangement can be considered a packing problem: each type of molecule must satisfy spatial constraints related to the geometry of the system, and the distance between atoms of different molecules must be greater than some specified tolerance. We have developed a code able to pack millions of atoms, grouped in arbitrarily complex molecules, inside a variety of three-dimensional regions. The regions may be intersections of spheres, ellipses, cylinders, planes, or boxes. The user must provide only the structure of one molecule of each type and the geometrical constraints that each type of molecule must satisfy. Building complex mixtures, interfaces, and solvating biomolecules in water, other solvents, or mixtures of solvents is straightforward. In addition, different atoms belonging to the same molecule may also be restricted to different spatial regions, in such a way that more ordered molecular arrangements can be built, such as micelles, lipid double layers, etc. The packing time for state-of-the-art molecular dynamics systems varies from a few seconds to a few minutes on a personal computer. The input files are simple and currently compatible with PDB, Tinker, Molden, or Moldy coordinate files. The package is distributed as free software and can be downloaded from http://www.ime.unicamp.br/~martinez/packmol/. (c) 2009 Wiley Periodicals, Inc. J Comput Chem 30: 2157-2164, 2009
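As a usage illustration, the sketch below writes a minimal input file of the kind the package accepts (a global tolerance, an output name, and one "structure ... end structure" block per molecule type), following the documented Packmol input format; the file names and molecule counts are assumed.

```python
# Writes a minimal Packmol-style input file: global tolerance, output name,
# and one "structure ... end structure" block per molecule type.
# File names and molecule counts are assumed for illustration.
packmol_input = """\
tolerance 2.0
filetype pdb
output solvated.pdb

structure protein.pdb
  number 1
  fixed 0. 0. 0. 0. 0. 0.
end structure

structure water.pdb
  number 10000
  inside box -20. -20. -20. 60. 60. 60.
end structure
"""

with open("pack.inp", "w") as f:
    f.write(packmol_input)
# Then run, e.g.:  packmol < pack.inp
```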
Abstract:
An abnormality in neurodevelopment is one of the most robust etiologic hypotheses in schizophrenia (SZ). There is also strong evidence that genetic factors may influence abnormal neurodevelopment in the disease. The present study evaluated, in SZ patients whose brain structural data had been obtained with magnetic resonance imaging (MRI), the possible association between structural brain measures and 32 DNA polymorphisms located in 30 genes related to neurogenesis and brain development. DNA was extracted from peripheral blood cells of 25 patients with schizophrenia, genotyping was performed using diverse procedures, and putative associations were evaluated by standard statistical methods (using the software Statistical Package for the Social Sciences, SPSS) with a modified Bonferroni adjustment. For reelin (RELN), a protease that guides neurons in the developing brain and underlies neurotransmission and synaptic plasticity in adults, an association was found for a non-synonymous polymorphism (Val997Leu) with left and right ventricular enlargement. A putative association was also found between protocadherin 12 (PCDH12), a cell adhesion molecule involved in axonal guidance and synaptic specificity, and cortical folding (asymmetry coefficient of the gyrification index). Although our results are preliminary, due to the small number of individuals analyzed, such an approach could reveal new candidate genes implicated in anomalous neurodevelopment in schizophrenia. (c) 2007 Elsevier Ireland Ltd. All rights reserved.
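The abstract does not specify which modified Bonferroni procedure was applied; one widely used modification is Holm's step-down method, sketched below purely for illustration with invented p-values.

```python
# Holm's step-down procedure, a common "modified Bonferroni" adjustment.
# Illustrative only; the paper does not specify which modification it used.
def holm_adjust(pvals: list[float]) -> list[float]:
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ascending p-values
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])  # enforce monotonicity
        adjusted[i] = min(1.0, running_max)
    return adjusted

# Invented p-values for, say, 8 of the 32 polymorphism/brain-measure tests:
print(holm_adjust([0.001, 0.2, 0.03, 0.004, 0.8, 0.05, 0.01, 0.6]))
```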
Abstract:
Background: qtl.outbred is an extendible interface in the statistical environment R for combining quantitative trait loci (QTL) mapping tools. It is built as an umbrella package that enables outbred genotype probabilities to be calculated and/or imported into the software package R/qtl. Findings: Using qtl.outbred, the genotype probabilities from outbred line cross data can be calculated by interfacing with a new and efficient algorithm developed for analyzing arbitrarily large datasets (included in the package) or imported from other sources such as the web-based tool GridQTL. Conclusion: qtl.outbred will improve the speed of calculating probabilities and the ability to analyse large future datasets. This package enables the user to analyse outbred line cross data accurately, but with effort similar to that required for inbred line cross data.
Abstract:
As Brazil wants to be perceived as a competitor in providing computer applications and system development services in the global market, the concept of the Software Factory gains importance. The 'Factory' metaphor, when applied to the activity of software development, is used to describe organizations which produce software to a minimum quality standard and at competitive costs. However, the term 'Factory' recalls Fordist concepts, which have been challenged for a few decades in the manufacturing industry. This study analyzed university curricula and how students and teachers perceive the concept of the Software Factory, and assessed them in relation to the Fordism/post-Fordism continuum. It was observed that some of the teachers who have influence over curricula define Software Factories according to Fordist concepts. It was also observed that, despite opportunities for improvement, curricula are adequately structured with regard to the skills a professional at these organizations must possess. We conclude that the education provided in the programs analyzed is adequate, but that it must be supplemented by companies or by the professionals themselves so that the knowledge acquired in the programs may be put into practice.
Abstract:
Generalized hypercompetitiveness in the world markets has determined the need to offer better products to potential and actual clients in order to gain an advantage over other competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents) so that they are processed more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every intervenient are taken into account. Their requirements must be defined with precision and extensively analyzed, and the system must consequently be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, this process suggests two methodologies to be applied in the initial phases of the lifecycle of the Software Engineering process: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy in order to identify BPs while constructing the enterprise's information structure, and finalizes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both Business Process Management and Software Engineering activities. MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for multimedia. It proposes the analysis of user tasks as the basis of the design of: (i) the user interface; (ii) the system behaviour, which is modeled by means of patterns that can combine multimedia and standard information; and (iii) the database and media contents. This thesis presents these approaches theoretically, accompanied by examples from a real project, which provide the necessary support for understanding the techniques used.