925 results for Fix and optimize
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, and visualization, and the resource management of such services, are becoming increasingly important for delivering user-desired Quality of Service. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share the analysis results. TerraFly GeoCloud also enables the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly comprises techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving resource usage by 20.83% compared to traditional peak-load-based resource allocation.
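To make the query semantics concrete, here is a minimal illustrative sketch of a top-k spatial Boolean query (this is not sksOpen's actual API or index structure; a real engine would answer it from a spatial-keyword index rather than the linear scan shown here): objects are filtered by a Boolean keyword predicate and the k survivors nearest to the query point are returned.

```python
import heapq
import math

def top_k_spatial_boolean(objects, query_pt, required_keywords, k):
    """Return the k objects nearest to query_pt whose keyword sets satisfy
    a Boolean AND predicate (linear-scan sketch for illustration only)."""
    qx, qy = query_pt
    matches = (o for o in objects if required_keywords <= o["keywords"])
    return heapq.nsmallest(
        k, matches, key=lambda o: math.hypot(o["x"] - qx, o["y"] - qy))

# Illustrative, made-up points of interest
pois = [
    {"name": "A", "x": 1.0, "y": 2.0, "keywords": {"museum", "free"}},
    {"name": "B", "x": 0.5, "y": 0.5, "keywords": {"museum"}},
    {"name": "C", "x": 3.0, "y": 1.0, "keywords": {"museum", "free"}},
]
print(top_k_spatial_boolean(pois, (0.0, 0.0), {"museum", "free"}, k=2))
```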
Abstract:
The goal of FOCUS, which stands for Frailty Management Optimization through EIPAHA Commitments and Utilization of Stakeholders' Input, is to reduce the burden of frailty in Europe. The partners are working on advancing knowledge of frailty detection, assessment, and management, including biological, clinical, cognitive and psychosocial markers, in order to change the paradigm of frailty care from acute intervention to prevention. FOCUS partners are working on ways to integrate the best available evidence from frailty-related screening tools and epidemiological and interventional studies into the care of frail people and their quality of life. Frail citizens in Italy, Poland and the UK and their caregivers are being invited to express their views on, and experiences with, treatments and interventions aimed at improving quality of life. The FOCUS Consortium is developing pathways to leverage the knowledge available and to put it at the service of frail citizens. In order to reach the broadest audience possible, the FOCUS Platform for Knowledge Exchange and the Platform for Scaling Up are being developed with the collaboration of stakeholders. The FOCUS project is a development of the work being done by the European Innovation Partnership on Active and Healthy Ageing (EIPAHA), which aims to increase the average healthy lifespan in Europe by 2020 while fostering the sustainability of health and social care systems and innovation in Europe. The knowledge and tools developed by the FOCUS project, with input from stakeholders, will be deployed to all EIPAHA participants dealing with frail older citizens to support their activities and optimize performance.
Abstract:
Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components – those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver – can be reassembled to generate an authentic copy of a record, or reformulated per a user's request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of requirements generated by InterPARES 1. Because of the need to communicate the work of InterPARES in a meaningful way not only across other disciplines but also across different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records, however, but a range of digital information objects (referred to as "entities" by InterPARES 2, but not to be confused with the term "entities" as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or "potential records." To be determined to be records, the entities had to meet the criteria outlined by archival theory – they had to have a fixed documentary format and stable content. It was not sufficient that they be considered or treated as records by the creator. "Potential records" is a new construct indicating that a digital system has the potential to create records on demand, but does not actually fix and set aside records in the normal course of business.
The work of the Description Cross-Domain Group, therefore, addresses the metadata needs of all three categories of entities. Finally, since "metadata" as a term is used today so ubiquitously, and in so many different ways by different communities, that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. It also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.
Abstract:
In Europe, concerns about the status of marine ecosystems have increased, and the main goal of the Marine Directive is the achievement of Good Environmental Status (GES) of EU marine waters by 2020. Molecular tools are seen as promising and emerging approaches to improve ecosystem monitoring, and have led ecology into a new era, representing perhaps the most significant source of innovation in marine monitoring techniques. Benthic nematodes are considered ideal organisms to be used as biological indicators of natural and anthropogenic disturbances in aquatic ecosystems, underpinning monitoring programmes on the ecological quality of marine ecosystems and being very useful for assessing the GES of the marine environment. dT-RFLP (directed Terminal-Restriction Fragment Length Polymorphism) makes it possible to assess the diversity of nematode communities and also to study the functioning of the ecosystem, and, combined with relative real-time PCR (qPCR), provides a high-throughput semi-quantitative characterization of nematode communities. These characteristics make the two molecular tools good descriptors for assessing good environmental status. The main aim of this study is to develop and optimize dT-RFLP and qPCR for the Mira estuary (SW coast, Portugal). A molecular phylogenetic analysis of marine and estuarine nematodes is being performed, combining morphological and molecular analysis, to evaluate the diversity of free-living marine nematodes in the Mira estuary. After morphological identification, barcodes of the 18S rDNA and COI genes are being determined for each nematode species morphologically identified. So far we have generated 40 new sequences belonging to 32 different genera and 17 families, and the study has shown a good degree of concordance between traditional morphology-based identification and DNA sequences. These results will improve the assessment of marine nematode diversity and contribute to a more robust nematode taxonomy. The DNA sequences are being used to develop the dT-RFLP assay, which can easily process large numbers of samples (hundreds to thousands) rather than the small numbers typical of classical taxonomic or low-throughput molecular analyses. A preliminary study showed that the digest enzymes used in dT-RFLP for terrestrial assemblages separated marine nematodes poorly at the taxonomic level required for functional group analysis. A new digest combination was therefore designed using the software tool DRAT (Directed Terminal Restriction Analysis Tool) to distinguish marine nematode taxa. Several solutions were provided by DRAT and tested empirically to select the one that cuts most efficiently. A combination of three enzymes and a single digest proved to be the best solution to separate the different clusters. In parallel, another tool is being developed to estimate population size (qPCR). An improved qPCR estimation of gene copy number using an artificial reference is being developed for marine nematode communities in order to quantify their abundance. Once developed, it is proposed to validate both methodologies by determining the spatial and temporal variability of benthic nematode assemblages across different environments. The application of these high-throughput molecular approaches to benthic nematodes will improve sample throughput and make their implementation as indicators of the ecological status of marine ecosystems more efficient and faster.
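As a hedged illustration of the arithmetic behind qPCR copy-number estimation (a sketch of common practice, not the authors' specific protocol; all values are placeholders), the copy number of a double-stranded DNA reference can be computed from its mass and amplicon length, and unknowns can then be read off a standard curve of Ct versus log10(copies):

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0   # approximate mass of one double-stranded base pair

def copies_from_mass(mass_ng, amplicon_bp):
    """Copy number of a dsDNA reference from its mass (ng) and length (bp)."""
    return (mass_ng * 1e-9) * AVOGADRO / (amplicon_bp * BP_MASS_G_PER_MOL)

def copies_from_ct(ct, slope, intercept):
    """Copies of an unknown from a standard curve Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Placeholder numbers, for illustration only
print(copies_from_mass(mass_ng=1.0, amplicon_bp=400))       # ~2.3e9 copies
print(copies_from_ct(ct=24.0, slope=-3.32, intercept=38.0)) # ~1.6e4 copies
```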
Abstract:
In recent decades, global food supply chains have had to deal with the increasing awareness of stakeholders and consumers about safety, quality, and sustainability. In order to address these new challenges for food supply chain systems, an integrated approach to design, control, and optimize the product life cycle is required. It is therefore essential to introduce new models, methods, and decision-support platforms tailored to perishable products. This thesis aims to provide novel practice-ready decision-support models and methods to optimize the logistics of food items with an integrated and interdisciplinary approach. It proposes a comprehensive review of the main peculiarities of perishable products and the environmental stresses accelerating their quality decay. It then focuses on top-down strategies to optimize the supply chain system from the strategic to the operational decision level. Based on the criticality of the environmental conditions, the dissertation evaluates the main long-term logistics investment strategies to preserve product quality. Several models and methods are proposed to optimize logistics decisions so as to enhance the sustainability of the supply chain system while guaranteeing adequate food preservation. The models and methods proposed in this dissertation promote a climate-driven approach, integrating climate conditions and their consequences on the quality decay of products into innovative models supporting logistics decisions. Given the uncertain nature of the environmental stresses affecting the product life cycle, an original stochastic model and solution method are proposed to support practitioners in controlling and optimizing supply chain systems when facing uncertain scenarios. The application of the proposed decision-support methods to real case studies proved their effectiveness in increasing the sustainability of the perishable product life cycle. The dissertation also presents an industrial application to a global food supply chain system, further demonstrating how the proposed models and tools can be integrated to provide significant savings and sustainability improvements.
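As a small illustration of what a climate-driven quality-decay model can look like (a sketch under the common assumptions of first-order decay with an Arrhenius temperature dependence; the parameter values are placeholders, not the dissertation's data):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_rate(temp_k, k_ref, ea, temp_ref_k):
    """First-order quality-decay rate at temp_k, given the rate k_ref at
    temp_ref_k and an activation energy ea (J/mol)."""
    return k_ref * math.exp(-ea / R * (1.0 / temp_k - 1.0 / temp_ref_k))

def remaining_quality(temps_k, dt_hours, k_ref, ea, temp_ref_k, q0=1.0):
    """Residual quality after a temperature profile, assuming dQ/dt = -k(T)*Q."""
    q = q0
    for t in temps_k:
        q *= math.exp(-arrhenius_rate(t, k_ref, ea, temp_ref_k) * dt_hours)
    return q

# Placeholder cold-chain profile: 48 one-hour steps alternating between 4 C and 12 C
profile = [277.15, 285.15] * 24
print(remaining_quality(profile, dt_hours=1.0,
                        k_ref=0.002, ea=60_000, temp_ref_k=277.15))
```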
Abstract:
The principal aim of studies of enzyme-mediated reactions has been to provide comparative and quantitative information on enzyme-catalyzed reactions under distinct conditions. The classic Michaelis-Menten model (Biochem Zeit 49:333, 1913) for enzyme kinetics has been widely used to determine important parameters involved in enzyme catalysis, particularly the Michaelis-Menten constant (K_M) and the maximum velocity of reaction (V_max). Subsequently, a detailed treatment of the mechanisms of enzyme catalysis was undertaken by Briggs and Haldane (Biochem J 19:338, 1925), who proposed the steady-state treatment, whose applicability is constrained to that condition. The present work describes an extended solution of the Michaelis-Menten model without the need for such a steady-state restriction. We provide the first analysis in which all of the individual reaction constants are calculated analytically. Using this approach, it is possible to accurately predict the results under new experimental conditions and to characterize and optimize industrial processes in the fields of chemical and food engineering, pharmaceuticals and biotechnology.
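For reference, the rate law discussed here and the steady-state constant of Briggs and Haldane are

\[
v \;=\; \frac{V_{\max}\,[S]}{K_M + [S]},
\qquad
K_M \;=\; \frac{k_{-1} + k_2}{k_1}
\]

for the mechanism E + S ⇌ ES → E + P; under the original rapid-equilibrium assumption of Michaelis and Menten, K_M reduces to k_{-1}/k_1.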
Abstract:
Objective To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting The study was developed in Brazil. Method Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard for the identification of drug-drug interactions. Main outcome Sensitivity, specificity, positive and negative predictive values. Results The programs studied were Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions and optimize the safety and quality of care delivered in intensive care units.
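For clarity, the accuracy measures reported here follow their standard definitions in terms of true/false positives and negatives (an interaction counting as a true positive when a program flags it and the gold standard lists it as clinically important):

\[
\text{sensitivity} = \frac{TP}{TP+FN},\quad
\text{specificity} = \frac{TN}{TN+FP},\quad
\text{PPV} = \frac{TP}{TP+FP},\quad
\text{NPV} = \frac{TN}{TN+FN}
\]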
Abstract:
Conventional threading operations involve two distinct machining processes, drilling and threading, and are therefore time consuming: the tools must be changed and the workpiece has to be moved to another machine. This paper presents an analysis of the combined process (drilling followed by threading) using a single tool for both operations: the tap-milling tool. Before presenting the methodology used to evaluate this hybrid tool, the basics of ODS (operating deflection shapes) are briefly described. ODS and finite element modeling (FEM) were used during this research to optimize the process, aiming to achieve more stable machining conditions and to increase tool life. Both methods allowed the determination of the natural frequencies and displacements of the machining center and the optimization of the workpiece fixture system. The results showed an excellent correlation between the dynamic stability of the machining center-tool holder and tool life, avoiding premature catastrophic tool failure. Nevertheless, evidence showed that the tool is very sensitive to working conditions. Undoubtedly, the use of ODS and FEM eliminates empirical decisions concerning the optimization of machining conditions and drastically increases tool life. After the ODS and FEM studies, it was possible to optimize the process and the work material fixture system and to machine more than 30,000 threaded holes without reaching the tool life limit or catastrophic failure.
Abstract:
In petroleum refineries, water is used in desalting units to remove the salt contained in crude oil. Typically, 7% of the volume of hot crude oil is water, forming a water-and-oil emulsion. The emulsion flows between two electrodes and is subjected to an electric field. The electrical forces promote the coalescence of small droplets of water dispersed in crude oil, and these form bigger droplets. This paper calculates the forces acting on the droplets, highlighting particularly the mechanisms proposed for droplet-droplet coalescence under the influence of an applied electric field. Moreover, a model is developed in order to calculate the displacement speed of the droplets and the time between droplet collisions. Thus, it is possible to simulate and optimize the process by changing the operational variables (temperature, electrical field, and water quantity). The main advantage of this study is to show that it is feasible to increase the volume of water recycled in desalting processes, thus reducing the use of freshwater and the generation of liquid effluents in refineries.
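As an illustrative sketch of the kind of model described (assuming the point-dipole approximation commonly used in the electrocoalescence literature and Stokes drag for droplet motion; the constants and parameter values below are placeholders, not the paper's data):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def dipole_attraction_force(r, d, E, eps_oil=2.2):
    """Attractive force (N) between two identical water droplets of radius r (m),
    centre-to-centre distance d (m), in oil of relative permittivity eps_oil,
    under a field E (V/m): point-dipole result F = 24*pi*eps*r^6*E^2 / d^4."""
    return 24 * math.pi * eps_oil * EPS0 * r**6 * E**2 / d**4

def approach_speed(force, r, mu_oil=5e-3):
    """Droplet speed (m/s) from Stokes drag, v = F / (6*pi*mu*r)."""
    return force / (6 * math.pi * mu_oil * r)

# Placeholder values: 5 um droplets, 50 um apart, 1 kV/cm field, 5 mPa.s oil
r, d, E = 5e-6, 50e-6, 1e5
F = dipole_attraction_force(r, d, E)
v = approach_speed(F, r)
print(f"force = {F:.2e} N, approach speed = {v:.2e} m/s")
```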
Abstract:
One major challenge for the widespread application of direct methanol fuel cells (DMFCs) is to decrease the amount of platinum used in the electrodes, which has motivated a search for novel electrodes containing platinum nanoparticles. In this study, platinum nanoparticles were electrodeposited on layer-by-layer (LbL) films of TiO2 and poly(vinyl sulfonic acid) (PVS) by immersing the films in a H2PtCl6 solution and applying a 100 μA current for different electrodeposition times. Scanning tunneling microscopy (STM) and atomic force microscopy (AFM) images showed increased platinum particle size and electrode roughness for increasing electrodeposition times. The potentiodynamic profile of the electrodes indicated that oxygen-like species in 0.5 mol L-1 H2SO4 were formed at less positive potentials on the smallest platinum particles. Electrochemical impedance spectroscopy measurements confirmed the high reactivity for water dissociation and the large amount of oxygen-like species adsorbed on the smallest platinum nanoparticles. This high oxophilicity of the smallest nanoparticles was responsible for the electrocatalytic activity of Pt-TiO2/PVS systems for methanol electrooxidation, according to the Langmuir-Hinshelwood bifunctional mechanism. Significantly, the approach used here, combining platinum electrodeposition and LbL matrices, allows one both to control the particle size and to optimize methanol electrooxidation, and is therefore promising for producing membrane-electrode assemblies for DMFCs.
Abstract:
Background: This study evaluated mechanical properties of glass ionomer cements (GICs) used for atraumatic restorative treatment. Wear resistance, Knoop hardness (Kh), flexural strength (Fs) and compressive strength (Cs) were evaluated. The GICs used were Riva Self Cure (RVA), Fuji IX (FIX), Hi Dense (HD), Vitro Molar (VM), Maxxion R (MXR) and Ketac Molar Easymix (KME). Methods: Wear was evaluated after 1, 4, 63 and 365 days. Two-way ANOVA and Tukey post hoc tests (P = 0.05) analysed differences in wear of the GICs and the effect of time. Fs, Cs, and Kh were analysed with one-way ANOVA. Results: The type of cement (p < 0.001) and the time (p < 0.001) had a significant effect on wear. In early-term wear and Kh, KME and FIX presented the best performance. In long-term wear, Fs and Cs, KME, FIX and HD had the best performance. Strong explanatory power was observed between Fs and Kh (r² = 0.85), between Cs and Kh (r² = 0.82), and between long-term wear and 24-h Fs (r² = 0.79). Conclusions: The data suggest that KME and FIX presented the best in vitro performance. HD showed good results except for early-term wear.
Abstract:
Large chemical libraries can be synthesized on solid-support beads by the combinatorial split-and-mix method. A major challenge associated with this type of library synthesis is distinguishing between the beads and their attached compounds. A new method of encoding these solid-support beads, 'colloidal bar-coding', involves attaching fluorescent silica colloids ('reporters') to the beads as they pass through the compound synthesis, thereby creating a fluorescent bar code on each bead. In order to obtain sufficient reporter varieties to bar-code extremely large libraries, many of the reporters must contain multiple fluorescent dyes. We describe here the synthesis and spectroscopic analysis of various mono- and multi-fluorescent silica particles for this purpose. It was found that by increasing the amount of a single dye introduced into the particle reaction mixture, mono-fluorescent silica particles of increasing intensities could be prepared. This increase was highly reproducible and was observed for six different fluorescent dyes. Multi-fluorescent silica particles containing up to six fluorescent dyes were also prepared. The resultant emission intensity of each dye in the multi-fluorescent particles was found to depend on a number of factors: the hydrolysis rate of each silane-dye conjugate, the magnitude of the inherent emission intensity of each dye within the silica matrix, and energy transfer effects between dyes. We show that by varying the relative concentration of each silane-dye conjugate in the synthesis of multi-fluorescent particles, it is possible to change and optimize the resultant emission intensity of each dye to enable viewing in a fluorescence detection instrument.
Abstract:
HLA-A*0201 transgenic, H-2D(b)/mouse beta2-microglobulin double-knockout mice were used to compare and optimize the immunogenic potential of 17 HIV-1-derived, HLA-A*0201-restricted epitopic peptides. A tyrosine substitution in position 1 of the epitopic peptides, which increases both their affinity for, and their capacity to stabilize, the HLA-A*0201 molecule, was introduced in a significant proportion of them, after verifying that such modifications enhance their immunogenicity with respect to their natural antigenicity. Based on these results, a 13-polyepitope construct was inserted into the pre-S2 segment of the hepatitis B middle glycoprotein and used for DNA immunization. Long-lasting CTL responses against most of the inserted epitopes could be elicited simultaneously in a single animal, with cross-recognition in several cases of their most common natural variants.
Abstract:
Through a critical discussion of the main concepts, the text explores the contributions that the operationalization of social capital could bring to public policy. There is a network that can be strengthened, or even created, with a view to empowering people so that they can take part in public decisions, improve their quality of life, and optimize the effects of public policies. This potential has been highlighted in areas such as social development, the labour market, immigrant integration, multiculturalism and diversity, youth, crime prevention, health, indigenous communities, and civic participation.
Abstract:
The weapons systems of the Portuguese Air Force (Força Aérea Portuguesa, FAP) have as their mission the military defence of Portugal, through air operations and the defence of national airspace, with the F-16 being the main attack aircraft in use in this organization. In this regard, and given the current global economic context, organizations must make the most of all available resources and associated costs and optimize their work processes. Based on these assumptions, the present study analyses the implementation of lean in the FAP, since this philosophy rests on the elimination of waste with a view to improving quality and reducing times and costs. The analysis therefore focuses on the maintenance area of the F-16, specifically the Phase Inspection (Inspeção de Fase, IF), a type of maintenance that this aircraft undergoes every three hundred flight hours. The case study addresses two moments of the IF. The first concerns the processing of the data collected for the preliminary meeting at which the maintenance actions to be carried out during the aircraft's downtime are defined for the executing work areas; the aim here is to investigate the causes of the delays observed in holding this meeting. The second point under observation covers the information obtained through the SIAGFA computer application, in use at the FAP, for processing the maintenance data of the four aircraft that inaugurated the IF under the lean philosophy. This analysis made it possible to determine the number of work hours spent (on average across the four aircraft) on each of the work cards, showing that the additional cards take more hours; to identify which work areas are considered critical; and to identify the days of work performed and the periods of downtime without any intervention. The number of work hours performed in the IF was also assessed per aircraft, as were the constraints observed for the aircraft that did not complete the IF within the time defined for it.