949 results for XML optimisation
Abstract:
Data traffic generated by mobile advertising client software communicating with the network server can be a pain point for application developers considering advertising-funded application distribution, since the cost of the data transfer might scare users away from their applications. For the thesis project, a simulation environment was built to mimic the real client-server solution and to measure data transfer over varying connection types under different usage scenarios. To optimise data transfer, several general-purpose and XML-specific compressors were tried on the XML data, and a few protocol optimisations were implemented. To optimise cost, cache usage was improved and pre-loading was enhanced to use free connections for loading data. Analysis of the data traffic structure and the various optimisations indicated that cache usage and pre-loading should be enhanced and that the protocol should be changed to aggregate reports and compress them with WBXML or gzip.
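The gzip option mentioned above exploits the repetitiveness of XML markup. A minimal illustrative sketch (the report structure and field names are hypothetical, not the thesis's actual protocol):

```python
import gzip

# Hypothetical ad-impression report, standing in for the client-server XML traffic
xml_report = (
    "<report>"
    + "".join(f"<impression ad='{i}' ts='1700000{i}'/>" for i in range(50))
    + "</report>"
).encode("utf-8")

compressed = gzip.compress(xml_report)

# XML's repeated tag and attribute names compress well, so the payload shrinks
ratio = len(compressed) / len(xml_report)
print(f"{len(xml_report)} B -> {len(compressed)} B (ratio {ratio:.2f})")
```

WBXML takes a different route, replacing known tags with one-byte tokens, which suits small payloads where a gzip header would be relatively costly.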
Abstract:
Originally from Asia, Dovyalis hebecarpa is a dark purple/red exotic berry now also produced in Brazil. However, no reports were found in the literature on phenolic extraction from or characterisation of this berry. In this study, we evaluated the extraction optimisation of anthocyanins and total phenolics in D. hebecarpa berries, aiming at the development of a simple and mild analytical technique. Multivariate analysis was used to optimise the extraction variables (ethanol:water:acetone solvent proportions, times, and acid concentrations) at different levels. Acetone/water (20/80 v/v) gave the highest anthocyanin extraction yield, but pure water and different proportions of acetone/water or acetone/ethanol/water (with >50% water) were also effective. Neither acid concentration nor time had a significant effect on extraction efficiency, allowing the recommended parameters to be fixed at the lowest values tested (0.35% formic acid v/v and 17.6 min). Under optimised conditions, extraction efficiencies increased by 31.5% and 11% for anthocyanins and total phenolics, respectively, compared with traditional methods that use more solvent and time. Thus, the optimised methodology increased yields while being less hazardous and less time-consuming than traditional methods. Finally, freeze-dried D. hebecarpa showed a high content of the target phytochemicals (319 mg/100 g total anthocyanins and 1,421 mg/100 g total phenolics).
Abstract:
Research into advanced technologies for energy generation contemplates a series of alternatives, both in the investigation of new energy sources and in the improvement and/or development of new components and systems. Even though significant reductions in emissions are observed, the proposed alternatives require exhaust-gas cleaning systems. This paper presents the results of environmental analyses based on two configurations proposed for urban waste incineration; the addition of integer (Boolean) variables to the environomic model makes it possible to define the best gas-cleaning routes based on exergetic cost minimisation criteria. In this first part, the results of the analysis of a steam cogeneration system associated with the incineration of municipal solid waste (MSW) are presented. (c) 2007 Elsevier Ltd. All rights reserved.
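The Boolean-variable idea can be sketched in miniature: each candidate cleaning technology gets a 0/1 variable, and the route with the lowest total cost that meets an emissions target is selected. All technology names, costs, and removal fractions below are invented for illustration; the paper's environomic model is far richer than this brute-force enumeration.

```python
from itertools import product

# Hypothetical gas-cleaning technologies: (exergetic cost, fraction of emissions removed)
technologies = {
    "cyclone":    (10.0, 0.40),
    "scrubber":   (25.0, 0.70),
    "filter_bag": (18.0, 0.60),
}
REQUIRED_REMOVAL = 0.85  # fraction of emissions that must be removed

best_route, best_cost = None, float("inf")
# One Boolean variable per technology: include it in the route or not
for flags in product([0, 1], repeat=len(technologies)):
    chosen = [t for t, f in zip(technologies, flags) if f]
    cost = sum(technologies[t][0] for t in chosen)
    # Stages in series: combined removal = 1 - prod(1 - r_i)
    kept = 1.0
    for t in chosen:
        kept *= 1.0 - technologies[t][1]
    if 1.0 - kept >= REQUIRED_REMOVAL and cost < best_cost:
        best_route, best_cost = chosen, cost

print(best_route, best_cost)
```

For realistic numbers of variables, a mixed-integer solver would replace the exhaustive loop, but the formulation is the same.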
Abstract:
In the first part of this paper (Part I), conditions were presented for the gas-cleaning technological route in the environomic optimisation of a cogeneration system based on a thermal cycle with municipal solid waste incineration. In this second part, an environomic analysis is presented of a cogeneration system comprising a combined cycle, composed of a gas cycle burning natural gas with a heat recovery steam generator without supplementary firing, and a steam cycle burning municipal solid waste (MSW), to which a pure back-pressure steam turbine and a pure condensation steam turbine will be added. This analysis aims to select, for several scenarios, the best atmospheric pollutant emission control routes (rc) according to the minimisation of investment, operation, and social damage costs. In this study, a comparison is also performed with the results obtained in the case study presented in Part I. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The objective of this study was to develop a dessert that contains soy protein (SP) (1%, 2%, 3%) and guava juice (GJ) (22%, 27%, 32%) using Response Surface Methodology (RSM) as the optimisation technique. Water activity, physical stability, colour, acidity, pH, iron, and carotenoid contents were analysed. Affective tests were performed to determine the degree of liking of colour, creaminess, and acceptability. The results showed that GJ increased the values of redness, hue angle, chromaticity, acidity, and carotenoid content, while SP reduced water activity. Optimisation suggested a dessert containing 32% GJ and 1.17% SP as the best proportion of these components. This sample was considered a source of fibres, ascorbic acid, copper, and iron and garnered scores above the level of "slightly liked" for sensory attributes. Moreover, RSM was shown to be an adequate approach for modelling the physicochemical parameters and the degree of liking of creaminess of desserts. (C) 2010 Elsevier Ltd. All rights reserved.
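The RSM workflow used above can be sketched generically: fit a second-order polynomial to responses measured at coded factor levels, then locate the optimum of the fitted surface. The two factors and all response values below are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical coded levels (-1, 0, +1) for two factors and a synthetic response
# generated from a known quadratic, so the fit can be checked
X1, X2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
x1, x2 = X1.ravel(), X2.ravel()
y = 7 + 1.2 * x1 - 0.8 * x2 - 0.9 * x1**2 - 0.5 * x2**2 + 0.1 * x1 * x2

# Second-order model matrix: intercept, linear, quadratic, and interaction terms
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the fitted surface on a fine grid and pick the maximiser
g = np.linspace(-1, 1, 201)
G1, G2 = np.meshgrid(g, g)
Z = (coef[0] + coef[1] * G1 + coef[2] * G2
     + coef[3] * G1**2 + coef[4] * G2**2 + coef[5] * G1 * G2)
i = np.unravel_index(np.argmax(Z), Z.shape)
print("optimum (coded units):", G1[i], G2[i])
```

In practice the design points come from a central composite or Box-Behnken design, and desirability functions combine several responses; the quadratic fit and stationary-point search are the common core.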
Abstract:
Numerical optimisation methods are increasingly being applied to agricultural systems models to identify the most profitable management strategies. The available optimisation algorithms are reviewed and compared, with the literature and our studies identifying evolutionary algorithms (including genetic algorithms) as superior in this regard to simulated annealing, tabu search, hill-climbing, and direct-search methods. Results of a complex beef property optimisation, using a real-value genetic algorithm, are presented. The relative contributions of the range of operational options and parameters of this method are discussed, and general recommendations are listed to assist practitioners in applying evolutionary algorithms to agricultural systems problems. (C) 2001 Elsevier Science Ltd. All rights reserved.
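A real-value genetic algorithm of the kind named above encodes each management strategy as a vector of real numbers and evolves it by selection, blend crossover, and Gaussian mutation. The sketch below maximises an invented one-variable "gross margin" curve; the objective, bounds, and operator settings are illustrative only, not the beef property model.

```python
import random

random.seed(1)

def profit(x):
    # Hypothetical single-peak gross margin as a function of one decision variable
    return -(x - 2.5) ** 2 + 10

LOW, HIGH, POP, GENS = 0.0, 5.0, 30, 60

pop = [random.uniform(LOW, HIGH) for _ in range(POP)]
for _ in range(GENS):
    # Tournament selection: keep the better of two randomly drawn candidates
    parents = [max(random.sample(pop, 2), key=profit) for _ in range(POP)]
    children = []
    for a, b in zip(parents[::2], parents[1::2]):
        w = random.random()
        child = w * a + (1 - w) * b            # blend (arithmetic) crossover
        child += random.gauss(0, 0.1)          # Gaussian mutation
        children.append(min(HIGH, max(LOW, child)))
    # Elitism: children plus the best survivors of the previous generation
    pop = children + sorted(pop, key=profit, reverse=True)[: POP - len(children)]

best = max(pop, key=profit)
print(round(best, 2))
```

Real-value encoding avoids the discretisation artefacts of binary genetic algorithms, which is one reason it suits continuous management variables such as stocking rates.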
Abstract:
Activated sludge comprises a complex microbiological community. The structure (what types of microorganisms are present) and function (what the organisms can do, and at what rates) of this community are determined by external physico-chemical features and by the influent to the sewage treatment plant. We can manipulate the external features, but rarely the influent. Conventional control and operational strategies optimise activated sludge processes more as a chemical system than as a biological one. While optimising the process over a short time period, these strategies may deteriorate its long-term performance due to their potentially adverse impact on microbial properties. Through briefly reviewing the evidence available in the literature that plant design and operation affect both the structure and function of the microbial community in activated sludge, we propose adding sludge population optimisation as a new dimension to the control of biological wastewater treatment systems. We stress that optimising the microbial community structure and properties should be an explicit aim in the design and operation of a treatment plant. The major limitations to sludge population optimisation revolve around inadequate microbiological data, specifically community structure, function, and kinetic data. However, molecular microbiological methods that strive to provide these data are developing rapidly. The combination of these methods with conventional approaches to kinetic study is briefly discussed. The most pressing research questions pertaining to sludge population optimisation are outlined. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
One of the most difficult problems faced by researchers experimenting with complex systems in real-world applications is the Facility Layout Design Problem. It concerns the design and location of production lines, machinery and equipment, inventory storage, and shipping facilities. This work addresses the problem through the use of Constraint Logic Programming (CLP) technology. The use of Genetic Algorithms (GA) as an optimisation technique in a CLP environment is also addressed. The approach aims at implementing genetic algorithm operators following the CLP paradigm.
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition, and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a wide range of applications, which requires choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language such as Java for XML processing, it becomes necessary to use effective mechanisms, e.g. APIs, that allow large documents to be read and processed appropriately. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
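The trade-off such a study measures is between tree-building APIs (Java's DOM), which hold the whole document in memory, and event-driven ones (SAX/StAX), which stream it. Purely as an analogue of that comparison (in Python, not the paper's Java benchmark, and with a synthetic document), the two styles look like this:

```python
import io
import time
import xml.etree.ElementTree as ET
from xml.dom import minidom

# Synthetic document with many small repeated elements
xml_doc = "<items>" + "".join(
    f"<item id='{i}'><name>n{i}</name></item>" for i in range(5000)
) + "</items>"

# DOM-style: build the whole tree in memory, then query it
t0 = time.perf_counter()
dom = minidom.parseString(xml_doc)
dom_count = len(dom.getElementsByTagName("item"))
dom_time = time.perf_counter() - t0

# Streaming-style: react to end-element events and discard finished subtrees
t0 = time.perf_counter()
stream_count = 0
for event, elem in ET.iterparse(io.StringIO(xml_doc), events=("end",)):
    if elem.tag == "item":
        stream_count += 1
        elem.clear()  # free the subtree so memory stays bounded
stream_time = time.perf_counter() - t0

print(dom_count, stream_count,
      f"DOM {dom_time:.3f}s, streaming {stream_time:.3f}s")
```

The DOM style is convenient for random access; the streaming style is the usual choice for large files, since peak memory does not grow with document size.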
Abstract:
An accurate and sensitive method for the determination of 18 polycyclic aromatic hydrocarbons (PAHs) (the 16 PAHs considered priority pollutants by the USEPA, plus dibenzo[a,l]pyrene and benzo[j]fluoranthene) in fish samples was validated. Analysis was performed by microwave-assisted extraction and liquid chromatography with photodiode array and fluorescence detection. Response surface methodology was used to find the optimal extraction parameters. Validation of the overall methodology was performed by spiking assays at four levels and using SRM 2977. Quantification limits ranging from 0.15 to 27.16 ng/g wet weight were obtained. The established method was applied to edible tissues of three commonly consumed and commercially valuable fish species (sardine, chub mackerel, and horse mackerel) originating from the Atlantic Ocean. Variable levels of naphthalene (1.03–2.95 ng/g wet weight), fluorene (0.34–1.09 ng/g wet weight), and phenanthrene (0.34–3.54 ng/g wet weight) were detected in the analysed samples. None of the samples contained detectable amounts of benzo[a]pyrene, the marker used for evaluating the occurrence and carcinogenic effects of PAHs in food.
Abstract:
Medical imaging is a powerful diagnostic tool. Consequently, the number of medical images taken has increased vastly over the past few decades. The most common medical imaging techniques use X-radiation as the primary investigative tool. The main limitation of using X-radiation is the associated risk of developing cancers. Alongside this, technology has advanced and more centres now use CT scanners, which can incur significant radiation burdens compared with traditional X-ray imaging systems. The net effect is that the population radiation burden is rising steadily. Risk arising from X-radiation for diagnostic medical purposes needs to be minimised, and one way to achieve this is by reducing radiation dose whilst optimising image quality. All ages are affected by risk from X-radiation; however, the increasing population age highlights the elderly as a new group that may require consideration. Of greatest concern are paediatric patients: firstly, they are more sensitive to radiation; secondly, their younger age means that the potential detriment to this group is greater. Containment of radiation exposure falls to a number of professionals within medical fields, from those who request imaging to those who produce the image. These staff are supported in their radiation protection role by engineers, physicists, and technicians. It is important to realise that radiation protection is currently a major European focus of interest, and minimum competence levels in radiation protection for radiographers have been defined through the integrated activities of the EU consortium called MEDRAPET. The outcomes of this project have been used by the European Federation of Radiographer Societies to describe the European Qualifications Framework levels for radiographers in radiation protection.
Though variations exist between European countries, radiographers and nuclear medicine technologists are normally the professional groups responsible for exposing screening populations and patients to X-radiation. As part of their training they learn the fundamental principles of radiation protection and theoretical and practical approaches to dose minimisation. However, dose minimisation is complex: it is not simply about reducing X-radiation without taking major contextual factors into account. These factors relate to the real world of clinical imaging and include the need to measure clinical image quality and lesion visibility when applying X-radiation dose reduction strategies. This requires the use of validated psychological and physics techniques to measure clinical image quality and lesion perceptibility.