993 results for ink reduction software


Relevance:

100.00%

Publisher:

Abstract:

The printing industry is under economic pressure and is looking for new ways to cut costs. One method is to reduce the consumption of printing ink by using ink reduction software. This report examines the possibilities offered by ink reduction systems by studying how ink reduction is applied and how it affects the printed result. The study aims to answer: • How large an ink reduction can be applied without negative consequences for image quality? • How is such an ink reduction carried out? • Do total colour change and visual assessment of prints agree? To answer these questions, a test form with the necessary images and colour patches was produced and then subjected to a series of ink reductions. The test form was evaluated digitally with respect to TAC (total area coverage) and total colour change. It was then printed, evaluated visually by a test panel, and measured to determine the colour change after printing. The results show that prints can be ink-reduced without significant negative consequences for image quality. A reduction from 300% TAC to a TAC between 240% and 210% is entirely feasible, yielding savings while remaining within the standard for total colour change. This can be done very easily with software such as Alwan CMYK Optimizer ECO, using only the default settings and a Total Ink Limit set between 240% and 210%. The results also showed a strong correlation between the visual assessment and the total colour change, which indicates that both methods are suitable for evaluating prints.
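
The central operation the study relies on, capping the total area coverage (TAC) of a CMYK separation at a chosen Total Ink Limit, can be illustrated with a minimal sketch. The naive proportional scaling below is only an assumption for illustration; dedicated tools such as Alwan CMYK Optimizer ECO reseparate colours (typically via grey component replacement) so that the printed appearance is preserved.

```python
def tac(c, m, y, k):
    """Total area coverage of a CMYK value, in percent (each channel 0-100)."""
    return c + m + y + k

def cap_tac(c, m, y, k, limit=240.0):
    """Naively scale a CMYK value so its TAC does not exceed `limit`.

    Real ink-reduction software reseparates the colour (e.g. replacing a grey
    component of CMY with K) to keep the appearance; proportional scaling is
    only a toy illustration of the TAC constraint itself.
    """
    total = tac(c, m, y, k)
    if total <= limit:
        return (c, m, y, k)
    scale = limit / total
    return tuple(round(v * scale, 1) for v in (c, m, y, k))

# Example: a dark shadow tone at 300% TAC brought down to a 240% Total Ink Limit.
print(cap_tac(80, 70, 70, 80, limit=240))  # -> (64.0, 56.0, 56.0, 64.0)
```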

Relevance:

90.00%

Publisher:

Abstract:

A new version of the TomoRebuild data reduction software package is presented for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a review of the state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into distinct steps and the intermediate results may be checked if necessary. Although no additional graphics library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text-format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it is now about ten times faster. In addition, Maximum Likelihood Expectation Maximization (MLEM) and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example using experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which opens new perspectives for tomography with a low number of projections or a limited angular range.
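
A rough illustration of the MLEM algorithm mentioned above (not TomoRebuild's implementation, and with invented dimensions and variable names) is the multiplicative update sketched below.

```python
import numpy as np

def mlem(system_matrix, sinogram, n_iters=50):
    """Toy Maximum Likelihood Expectation Maximization reconstruction.

    system_matrix : (n_measurements, n_pixels) projection operator A
    sinogram      : (n_measurements,) measured projection data y
    Returns an image estimate x such that y ~= A @ x, using the standard
    multiplicative MLEM update x <- x * A^T(y / Ax) / A^T 1.
    """
    A = np.asarray(system_matrix, dtype=float)
    y = np.asarray(sinogram, dtype=float)
    x = np.ones(A.shape[1])                  # non-negative initial image
    sensitivity = A.sum(axis=0)              # column sums, A^T 1
    for _ in range(n_iters):
        forward = A @ x                       # current forward projection
        ratio = np.divide(y, forward, out=np.zeros_like(y), where=forward > 0)
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x

# Tiny example: a 2-pixel "image" observed through 3 projection rays.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_x = np.array([2.0, 3.0])
print(mlem(A, A @ true_x))                    # converges towards [2, 3]
```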

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

40.00%

Publisher:

Abstract:

"HWRIC TR-012."

Relevance:

40.00%

Publisher:

Abstract:

The initial rate of the photocatalysed oxidation of methylene blue, MB, by dissolved oxygen in solution, ri(MB), is measured for a series of titania-on-glass samples exhibiting a wide range of activities. The samples used include two different types of commercial self-cleaning glass and a lab-made sol-gel titania film. The activities of these samples are also assessed using a resazurin-based photocatalyst activity indicator ink, i.e. Rz paii, for which the initial rates of the photocatalysed reduction of Rz were measured, ri(Rz). A plot of ri(MB) vs. ri(Rz) reveals a good straight line, thereby demonstrating a linear correlation (for TiO2 films on glass at least) between the slow (usually hours) photocatalysed oxidation of organic materials, such as MB, and the fast (typically minutes) photocatalysed irreversible reduction of a dye, like Rz, in a paii. The possible use of paii technology for assessing, in a simple, quick and inexpensive manner, photocatalytic films both in the laboratory and in situ is discussed briefly.
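
The reported straight-line relationship can be checked with an ordinary least-squares fit; the rate values below are invented placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical initial rates for a few TiO2-on-glass samples (arbitrary units);
# in the study, ri(Rz) is measured over minutes and ri(MB) over hours.
ri_rz = np.array([0.05, 0.12, 0.20, 0.33, 0.41])        # fast Rz paii reduction
ri_mb = np.array([0.010, 0.026, 0.041, 0.065, 0.083])   # slow MB oxidation

slope, intercept = np.polyfit(ri_rz, ri_mb, 1)   # least-squares straight line
r = np.corrcoef(ri_rz, ri_mb)[0, 1]              # Pearson correlation coefficient

print(f"ri(MB) ~= {slope:.3f} * ri(Rz) + {intercept:.4f}, r = {r:.3f}")
```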

Relevance:

30.00%

Publisher:

Abstract:

Durability issues in reinforced concrete construction cost millions of dollars in repair or demolition. Identifying the causes of degradation and predicting service life based on experience, judgement and local knowledge has limitations in addressing all the associated issues. The objective of this CRC CI research project is to develop a tool that will assist in interpreting the symptoms of degradation of concrete structures, estimate residual capacity and recommend cost-effective solutions. This report documents the research undertaken in connection with this project. The primary focus of this research is the case studies provided by Queensland Department of Main Roads (QDMR) and Brisbane City Council (BCC). These organisations are responsible for managing a huge volume of bridge infrastructure in the state of Queensland, Australia. The main issue to be addressed in managing these structures is the deterioration of the bridge stock, leading to a reduction in service life. Other issues such as political backlash, public inconvenience and approach land acquisitions are crucial but are not within the scope of this project. It should be noted that deterioration is accentuated by aggressive environments such as salt water and acidic or sodic soils. Carse (2005) noted that road authorities need to invest their first dollars in understanding their local concretes and optimising the durability performance of structures, and then look at potential remedial strategies.

Relevance:

30.00%

Publisher:

Abstract:

With rising environmental concern, the reduction of critical aircraft emissions, including carbon dioxide (CO2) and nitrogen oxides (NOx), is one of the most important aeronautical problems. There are many possible ways to address the problem, such as designing new wing/aircraft shapes or new, more efficient engines. This paper instead provides a set of acceptable flight plans as a first step, short of replacing current aircraft. The paper investigates a green aircraft design optimisation in terms of aircraft range, mission fuel weight (CO2) and NOx using advanced Evolutionary Algorithms coupled to flight optimisation system software. Two multi-objective design optimisations are conducted to find the best set of flight plans for current aircraft, considering discretised altitudes and Mach numbers, without redesigning the aircraft shape or engine type. The objectives of the first optimisation are to maximise aircraft range while minimising NOx with constant mission fuel weight. The second optimisation considers minimisation of mission fuel weight and NOx with a fixed aircraft range. Numerical results show that the method is able to capture a set of useful trade-offs that reduce NOx and CO2 (minimum mission fuel weight).
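
At the heart of such multi-objective searches is the selection of non-dominated trade-offs. The sketch below shows only that Pareto-filtering step over a handful of made-up candidate flight plans; the study itself couples evolutionary algorithms to flight optimisation system software, which is not reproduced here.

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated entries from a list of (plan, objectives) pairs."""
    return [
        (plan, objs) for plan, objs in candidates
        if not any(dominates(other, objs) for _, other in candidates if other != objs)
    ]

# Hypothetical candidates: (flight level, Mach) -> (mission fuel weight kg, NOx kg).
candidates = [
    (("FL350", 0.78), (18200.0, 210.0)),
    (("FL370", 0.80), (17800.0, 230.0)),
    (("FL330", 0.76), (18900.0, 205.0)),
    (("FL370", 0.82), (18400.0, 240.0)),   # dominated by the FL370 / Mach 0.80 plan
]
for plan, objs in pareto_front(candidates):
    print(plan, objs)
```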

Relevance:

30.00%

Publisher:

Abstract:

As the need for concepts such as cancellation and OR-joins occurs naturally in business scenarios, comprehensive support in a workflow language is desirable. However, there is a clear trade-off between the expressive power of a language (i.e., introducing complex constructs such as cancellation and OR-joins) and ease of verification. When a workflow contains a large number of tasks and involves complex control flow dependencies, verification can take too much time or it may even be impossible. There are a number of different approaches to deal with this complexity. Reducing the size of the workflow, while preserving its essential properties with respect to a particular analysis problem, is one such approach. In this paper, we present a set of reduction rules for workflows with cancellation regions and OR-joins and demonstrate how they can be used to improve the efficiency of verification. Our results are presented in the context of the YAWL workflow language.
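
As a flavour of what a reduction rule does (this is a generic series-fusion rule, not one of the YAWL rules for cancellation regions or OR-joins defined in the paper), the sketch below merges strictly sequential tasks in a small successor-map representation of a net, with invented task names.

```python
def fuse_series_tasks(successors):
    """Repeatedly merge task pairs (a, b) where b is a's sole successor and a is
    b's sole predecessor, shrinking the net while preserving reachability.

    successors: dict mapping task name -> set of successor task names.
    Returns a reduced copy. Rules handling cancellation regions and OR-joins
    need additional conditions and are not covered by this toy example.
    """
    g = {t: set(s) for t, s in successors.items()}
    changed = True
    while changed:
        changed = False
        for a in list(g):
            succ = g.get(a, set())
            if len(succ) != 1:
                continue
            (b,) = succ
            if b == a or b not in g:
                continue
            preds_of_b = [t for t, s in g.items() if b in s]
            if preds_of_b == [a]:
                g[a] = set(g[b])      # a inherits b's successors
                del g[b]              # b is removed from the net
                changed = True
                break
    return g

# Toy net: register -> check -> approve, with a separate audit branch into approve.
net = {"register": {"check"}, "check": {"approve"}, "approve": set(), "audit": {"approve"}}
print(fuse_series_tasks(net))   # {'register': {'approve'}, 'approve': set(), 'audit': {'approve'}}
```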

Relevance:

30.00%

Publisher:

Abstract:

In large flexible software systems, bloat occurs in many forms, causing excess resource utilization and resource bottlenecks. This results in lost throughput and wasted joules. However, mitigating bloat is not easy; efforts are best applied where the savings would be substantial. To aid this, we develop an analytical model establishing the relation between resource bottlenecks, bloat, performance and power. Analyses with the model place into perspective the results from the first experimental study of the power-performance implications of bloat. In the experiments we find that while bloat reduction can provide as much as 40% energy savings, the degree of impact depends on hardware and software characteristics. We confirm predictions from our model with selected results from our experimental study. Our findings show that a software-only view is inadequate when assessing the effects of bloat. The impact of bloat on physical resource usage and power should be understood for a full-systems perspective to properly deploy bloat reduction solutions and reap their power-performance benefits.
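
The qualitative relation such a model captures, where bloat inflates the work per request, lowers throughput at the bottleneck resource and therefore raises the energy spent per unit of useful work, can be illustrated with a toy calculation. All numbers and the constant-power assumption below are invented for illustration; this is not the paper's model or its measured results.

```python
def energy_per_request(cpu_work_ms, bloat_factor, server_power_w, cores=16):
    """Toy estimate of joules per request on a CPU-bottlenecked server.

    cpu_work_ms  : essential CPU time per request on one core, in milliseconds
    bloat_factor : multiplier (>= 1.0) for the extra work caused by bloat
    Assumes constant server power, so energy/request = power / throughput.
    """
    effective_ms = cpu_work_ms * bloat_factor
    throughput = cores / (effective_ms / 1000.0)   # requests per second
    return server_power_w / throughput             # joules per request

lean = energy_per_request(cpu_work_ms=5.0, bloat_factor=1.0, server_power_w=300.0)
bloated = energy_per_request(cpu_work_ms=5.0, bloat_factor=1.5, server_power_w=300.0)
print(f"{lean:.3f} J vs {bloated:.3f} J per request "
      f"({(1 - lean / bloated) * 100:.0f}% saving if the bloat is removed)")
```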

Relevance:

30.00%

Publisher:

Abstract:

Most Java programmers would agree that Java is a language that promotes a philosophy of “create and go forth”. By design, temporary objects are meant to be created on the heap, possibly used and then abandoned to be collected by the garbage collector. Excessive generation of temporary objects is termed “object churn” and is a form of software bloat that often leads to performance and memory problems. To mitigate this problem, many compiler optimizations aim at identifying objects that may be allocated on the stack. However, most such optimizations miss large opportunities for memory reuse when dealing with objects inside loops or with container objects. In this paper, we describe a novel algorithm that detects bloat caused by the creation of temporary container and String objects within a loop. Our analysis determines which objects created within a loop can be reused. We then describe a source-to-source transformation that efficiently reuses such objects. Empirical evaluation indicates that our solution can reduce up to 40% of temporary object allocations in large programs, resulting in a performance improvement that can be as high as a 20% reduction in run time, specifically when a program has a high churn rate or when the program is memory-intensive and needs to run the GC often.
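
The paper's transformation targets Java containers and Strings; the underlying pattern of hoisting a temporary container out of a loop and clearing it instead of reallocating it on every iteration is sketched below in Python, purely to illustrate the reuse idea rather than the paper's source-to-source transformation.

```python
def churn_version(matrix):
    """Allocates a fresh temporary buffer on every loop iteration (object churn)."""
    totals = []
    for row in matrix:
        scaled = []                 # new temporary list created each time through
        for x in row:
            scaled.append(x * x)
        totals.append(sum(scaled))
    return totals

def reuse_version(matrix):
    """Hoists one buffer out of the loop and clears it between iterations --
    the same idea as reusing a temporary container object across a Java loop."""
    totals = []
    scaled = []                     # single temporary reused on every iteration
    for row in matrix:
        scaled.clear()
        for x in row:
            scaled.append(x * x)
        totals.append(sum(scaled))
    return totals

data = [[1, 2, 3], [4, 5]]
assert churn_version(data) == reuse_version(data) == [14, 41]
```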

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a hardware-software hybrid technique for modular multiplication over large binary fields. The technique involves the application of the Karatsuba-Ofman algorithm for polynomial multiplication and a novel technique for reduction. The proposed reduction technique is based on the popular repeated multiplication technique and Barrett reduction. We propose a new design for a parallel polynomial multiplier that serves as a hardware accelerator for large field multiplications. We show that the proposed reduction technique, accelerated using the modified polynomial multiplier, achieves significantly higher performance than a purely software technique and other hybrid techniques. We also show that the hybrid accelerated approach to modular field multiplication is significantly faster than the Montgomery-algorithm-based integrated multiplication approach.
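
The two building blocks named above, carry-less polynomial multiplication over GF(2) and reduction modulo a field polynomial, can be sketched on Python integers used as bit vectors. The one-level Karatsuba-Ofman split and the simple shift-and-XOR reduction below are illustrative stand-ins for the paper's hardware multiplier and Barrett-style reduction, and the field polynomial (the GF(2^128) GHASH modulus) is just an example choice.

```python
def clmul(a, b):
    """Schoolbook carry-less multiplication of GF(2) polynomials held in ints."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        b >>= 1
    return result

def karatsuba_clmul(a, b, bits=128):
    """One Karatsuba-Ofman level: three half-width multiplies instead of four."""
    half = bits // 2
    mask = (1 << half) - 1
    a0, a1 = a & mask, a >> half
    b0, b1 = b & mask, b >> half
    low = clmul(a0, b0)
    high = clmul(a1, b1)
    mid = clmul(a0 ^ a1, b0 ^ b1) ^ low ^ high     # cross terms, over GF(2)
    return (high << (2 * half)) ^ (mid << half) ^ low

def reduce_gf2(x, poly):
    """Reduce a GF(2) polynomial modulo `poly` by aligning and XOR-ing the
    modulus until the degree drops below deg(poly) (long-division style)."""
    deg = poly.bit_length() - 1
    while x.bit_length() - 1 >= deg:
        x ^= poly << (x.bit_length() - 1 - deg)
    return x

# Example: multiply two elements of GF(2^128) defined by x^128 + x^7 + x^2 + x + 1.
POLY = (1 << 128) | (1 << 7) | (1 << 2) | (1 << 1) | 1
a = 0x0123456789ABCDEF0123456789ABCDEF
b = 0xFEDCBA9876543210FEDCBA9876543210
print(hex(reduce_gf2(karatsuba_clmul(a, b), POLY)))
```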

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a case base reduction technique which uses a metric defined on the solution space. The technique utilises the Generalised Shepard Nearest Neighbour (GSNN) algorithm to estimate nominal or real-valued solutions in case bases with solution space metrics. An overview of GSNN and a generalised reduction technique, which subsumes some existing decremental methods such as the Shrink algorithm, are presented. The reduction technique is defined for case bases in terms of a measure of the importance of each case to the predictive power of the case base. A trial test is performed on two case bases of different kinds, with several metrics proposed in the solution space. The tests show that GSNN can outperform standard nearest neighbour methods on this set. Further test results show that a case-removal order based on a GSNN error function can produce a sparse case base with good predictive power.
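
A much simplified flavour of such a decremental reduction is sketched below: a case is dropped when the remaining cases still predict its solution within a tolerance, using an inverse-distance-weighted (Shepard-style) nearest-neighbour estimate. This is an illustrative stand-in, not the paper's GSNN error function or its case-removal order, and the data are invented.

```python
def shepard_predict(query, cases, k=2, p=2.0):
    """Inverse-distance-weighted estimate over the k nearest cases, the kind of
    interpolation behind Shepard-style nearest-neighbour methods."""
    nearest = sorted(cases, key=lambda c: abs(c[0] - query))[:k]
    for x, y in nearest:
        if x == query:                       # exact match: return stored solution
            return y
    weights = [1.0 / abs(x - query) ** p for x, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

def shrink_case_base(cases, tolerance=0.5):
    """Decrementally drop cases whose solutions the remaining cases can still
    predict within `tolerance`; cases is a list of (problem, solution) pairs."""
    kept = list(cases)
    for case in sorted(cases, key=lambda c: c[0]):
        remaining = [c for c in kept if c is not case]
        if remaining and abs(shepard_predict(case[0], remaining) - case[1]) <= tolerance:
            kept = remaining                 # the case was redundant
    return kept

# Illustrative 1-D case base lying on the line y = x.
base = [(float(i), float(i)) for i in range(7)]
print(len(base), "->", len(shrink_case_base(base)), "retained cases")   # 7 -> 3
```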