993 results for ink reduction software


Relevance: 20.00%

Publisher:

Abstract:

Non-thermal plasma (NTP) is a promising candidate for controlling engine exhaust emissions. Plasma is known as the fourth state of matter, in which both electrons and positive ions co-exist. Both gaseous and particle emissions of diesel exhaust undergo chemical changes when they are exposed to plasma. In this project, the mitigation of diesel particulate matter (DPM) from actual diesel exhaust using NTP technology has been studied. The effect of plasma not only on PM mass but also on PM size distribution, the physico-chemical structure of PM and PM removal mechanisms has been investigated. It was found that NTP technology can significantly reduce both PM mass and number. However, under some circumstances particles can be formed by nucleation. The energy required to create the plasma with the current technology is higher than the benchmark commonly used by the automotive industry. Further research will enable the particle-formation mechanism to be understood and the energy consumption to be optimised.

Relevance: 20.00%

Publisher:

Abstract:

Multidimensional data have received increasing attention from researchers in recent years for building better recommender systems. Additional metadata provides algorithms with more detail for better understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models tackle this task effectively in various ways, each utilizes only a partial structure of the data. In this paper, we seek to delve into the different types of relations in the data and to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendation. The proposed approach is capable of incorporating not only localized user-user and item-item relations but also the latent interactions between all dimensions of the data. Experimental results show that the proposed approach yields significant improvements in recommendation accuracy.
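As a hedged illustration of the neighbourhood-based CF baseline this abstract refers to (not the paper's fusion algorithm), a minimal user-based top-N recommender could look as follows; the rating matrix, the cosine-similarity choice and the parameter names are assumptions made for the sketch:

```python
import numpy as np

def topn_user_cf(R, user, n=2, k=2):
    """Score a user's unseen items by the cosine-similarity-weighted
    ratings of the k most similar users, then return the top-n items."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    S = (R @ R.T) / (norms @ norms.T + 1e-12)  # user-user cosine similarity
    np.fill_diagonal(S, 0.0)                   # a user is not their own neighbour
    neighbours = np.argsort(S[user])[::-1][:k]
    scores = S[user, neighbours] @ R[neighbours]
    scores[R[user] > 0] = -np.inf              # never re-recommend rated items
    return np.argsort(scores)[::-1][:n].tolist()

# Rows are users, columns are items; 0 means "not yet rated".
R = np.array([[5., 4., 0., 0.],
              [5., 5., 3., 0.],
              [0., 0., 4., 5.]])
print(topn_user_cf(R, user=0, n=1))  # -> [2]: item 2, rated by user 0's nearest neighbour
```

Latent factor models and the paper's fusion approach go further by also exploiting relations this purely localized scheme ignores.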

Relevance: 20.00%

Publisher:

Abstract:

User profiling is the process of constructing user models that represent the personal characteristics and preferences of customers. User profiles play a central role in many recommender systems, which recommend items to users based on their profiles; the items can be any objects the users are interested in, such as documents, web pages, books or movies. In recent years, multidimensional data have attracted growing attention from both academia and industry for building better recommender systems. Additional metadata provides algorithms with more detail for better understanding the interactions between users and items. However, most existing user/item profiling techniques for multidimensional data analyze the data by splitting the multidimensional relations, which causes a loss of the multidimensional information. In this paper, we propose a user profiling approach using a tensor reduction algorithm, which we show is based on a Tucker2 model. The proposed profiling approach incorporates the latent interactions between all dimensions into user profiles, which significantly benefits the quality of neighborhood formation. We further propose to integrate the profiling approach into neighborhood-based collaborative filtering recommender algorithms. Experimental results show significant improvements in terms of recommendation accuracy.
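Since the abstract names a Tucker2 model, a minimal HOSVD-style sketch of that decomposition (factor matrices on two modes, the third mode left intact) may help fix ideas; the shapes, ranks and plain-NumPy implementation below are illustrative assumptions, not the paper's tensor reduction algorithm:

```python
import numpy as np

def tucker2(X, r1, r2):
    """Tucker2 via truncated HOSVD: factor matrices on modes 0 and 1,
    identity on mode 2 (e.g. users x items x context)."""
    I, J, K = X.shape
    U0 = np.linalg.svd(X.reshape(I, J * K), full_matrices=False)[0][:, :r1]
    X1 = np.transpose(X, (1, 0, 2)).reshape(J, I * K)   # mode-1 unfolding
    U1 = np.linalg.svd(X1, full_matrices=False)[0][:, :r2]
    G = np.einsum('ijk,ia,jb->abk', X, U0, U1)          # core tensor
    return G, U0, U1

def reconstruct(G, U0, U1):
    return np.einsum('abk,ia,jb->ijk', G, U0, U1)

# A tensor with exact multilinear rank (2, 3) on the first two modes
# is recovered exactly by the truncated decomposition.
rng = np.random.default_rng(0)
X = np.einsum('abk,ia,jb->ijk', rng.random((2, 3, 6)),
              rng.random((5, 2)), rng.random((4, 3)))
G, U0, U1 = tucker2(X, 2, 3)
```

The reduced core retains the interactions between all three dimensions, which is what makes it usable for neighborhood formation over the compressed profiles.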

Relevance: 20.00%

Publisher:

Abstract:

The development of microfinance in Vietnam since the 1990s has coincided with remarkable progress in poverty reduction. Numerous descriptive studies have illustrated that microfinance is an effective tool for eradicating poverty in Vietnam, but the evidence from quantitative studies is mixed. This study contributes to the literature by providing new evidence on the impact of microfinance on poverty reduction in Vietnam, using repeated cross-sectional data from the Vietnam Living Standards Survey (VLSS) over the period 1992-2010. Our results show that micro-loans contribute significantly to household consumption.

Relevance: 20.00%

Publisher:

Abstract:

Anatomically pre-contoured fracture fixation plates are a treatment option for bone fractures. A well-fitting plate can be used as a tool for anatomical reduction of the fractured bone. However, recent studies showed that some plates fit poorly for many patients due to considerable shape variations between bones of the same anatomical site. The plates therefore have to be manually fitted and deformed by surgeons to fit each patient optimally. This process is time- and labor-intensive, and can lead to adverse clinical implications such as wound infection or plate failure. This paper proposes a new iterative method to simulate the patient-specific deformation of an optimally fitting plate for pre-operative planning purposes, and validates the method through a case study. The proposed method integrates four commercially available software tools, Matlab, Rapidform2006, SolidWorks and ANSYS, each performing specific tasks, to obtain a plate shape that fits an individual tibia optimally and is mechanically safe. A typical challenge when crossing multiple platforms is to ensure correct data transfer. We present an example implementation of the proposed method to demonstrate successful data transfer between the four platforms and the feasibility of the method.
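The cross-platform data-transfer challenge mentioned above is commonly guarded with a round-trip check on a neutral exchange format; the CSV point-cloud format below is a hypothetical stand-in for the actual Matlab/Rapidform2006/SolidWorks/ANSYS interfaces used in the paper:

```python
import csv
import io
import numpy as np

def export_points(points):
    """Write an (N, 3) array of surface points as neutral CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["x", "y", "z"])
    for p in points:
        writer.writerow([f"{c:.9e}" for c in p])
    return buf.getvalue()

def import_points(text):
    rows = list(csv.reader(io.StringIO(text)))[1:]  # skip the header row
    return np.array([[float(c) for c in row] for row in rows])

def round_trip_ok(points, tol=1e-6):
    """Check that export followed by import preserves the geometry."""
    back = import_points(export_points(points))
    return back.shape == points.shape and np.max(np.abs(back - points)) < tol

pts = np.random.default_rng(1).normal(size=(100, 3))
assert round_trip_ok(pts)
```

Running such a check after every hand-off between tools catches unit, precision and ordering errors before they propagate into the mechanical analysis.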

Relevance: 20.00%

Publisher:

Abstract:

The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and an eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can, and that the portfolio solver outperforms other solvers both in terms of the total number of problems solved and the time taken to solve them.
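The portfolio idea, running both solvers and taking whichever answers first, can be sketched with stub solvers; the sleep-based stand-ins and the toy "sat" check below are assumptions for illustration, not the behaviour of LBV or any real bit-vector solver:

```python
import concurrent.futures as cf
import time

def eager_solve(problem):
    """Stand-in for an eager bit-blasting solver: cost grows with size."""
    time.sleep(0.002 * len(problem))
    return ("eager", "sat" in problem)

def lazy_solve(problem):
    """Stand-in for the lazy solver: fixed overhead, scales better."""
    time.sleep(0.05)
    return ("lazy", "sat" in problem)

def portfolio_solve(problem):
    """Run both solvers in parallel and return the first result."""
    with cf.ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(s, problem) for s in (eager_solve, lazy_solve)]
        done, _ = cf.wait(futures, return_when=cf.FIRST_COMPLETED)
        return next(iter(done)).result()

solver, answer = portfolio_solve("sat" * 50)  # large problem: the lazy stub wins
```

Because the two approaches are complementary, the portfolio's worst case on any one problem is roughly the faster of the two solvers plus scheduling overhead.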

Relevance: 20.00%

Publisher:

Abstract:

Utilities worldwide are focused on supplying peak electricity demand reliably and cost-effectively, which requires a thorough understanding of all the factors influencing residential electricity use at peak times. An electricity demand reduction project based on comprehensive residential consumer engagement was established within an Australian community in 2008, and by 2011 peak demand had decreased to below pre-intervention levels. This paper applied field data, gathered through qualitative in-depth interviews with 22 residential households in the community, to a Bayesian Network complex-system model to examine whether the model could explain the successful peak demand reduction at the case study location. The resulting understanding of the major influencing factors, and of the potential impact of changes to these factors on peak demand, would underpin demand reduction intervention strategies for a wider target group.
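A Bayesian Network of the kind used in the study can be illustrated with a toy two-edge chain; the nodes (Engagement -> Behaviour -> LowPeakDemand) and every probability below are invented for the sketch, not values learned from the interview data:

```python
# Toy conditional probability tables for the chain
# Engagement -> Behaviour -> LowPeakDemand (all numbers illustrative).
P_BEHAVIOUR = {True: 0.8, False: 0.3}  # P(behaviour change | engagement)
P_LOW_PEAK = {True: 0.7, False: 0.2}   # P(low peak demand | behaviour change)

def p_low_peak_given_engagement(engaged):
    """Marginalise out the intermediate Behaviour node by enumeration."""
    p_b = P_BEHAVIOUR[engaged]
    return p_b * P_LOW_PEAK[True] + (1 - p_b) * P_LOW_PEAK[False]

# Engaged households: 0.8 * 0.7 + 0.2 * 0.2 = 0.60
# Non-engaged:        0.3 * 0.7 + 0.7 * 0.2 = 0.35
```

Comparing the two conditionals is exactly the kind of what-if query that lets such a model rank candidate intervention strategies by their expected impact on peak demand.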

Relevance: 20.00%

Publisher:

Abstract:

This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing and realistic off-line simulation that allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
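The off-line testing strategy described here typically rests on a hardware abstraction layer, so the same control code drives either the simulator or the machine; the interface, plant model and controller below are a hypothetical sketch, not the project's actual architecture:

```python
from abc import ABC, abstractmethod

class MachineInterface(ABC):
    """Control code talks only to this interface, so it cannot tell
    a laboratory simulation from the real mining machine."""
    @abstractmethod
    def read_position(self): ...

    @abstractmethod
    def command_velocity(self, v): ...

class SimulatedMachine(MachineInterface):
    """Off-line stand-in: applies the commanded velocity to a trivial
    plant model at each control tick."""
    def __init__(self, dt=0.1):
        self.pos = 0.0
        self.dt = dt

    def read_position(self):
        return self.pos

    def command_velocity(self, v):
        self.pos += v * self.dt

def move_to(machine, target, gain=1.0, steps=200):
    """Proportional controller, unaware of whether the machine is real."""
    for _ in range(steps):
        machine.command_velocity(gain * (target - machine.read_position()))
    return machine.read_position()

final = move_to(SimulatedMachine(), target=10.0)
```

Swapping `SimulatedMachine` for a class that talks to real actuators exercises the same controller, which is what lets most of the system be validated in the laboratory.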

Relevance: 20.00%

Publisher:

Abstract:

This study investigates the effect of non-thermal plasma technology on the abatement of particulate matter (PM) from actual diesel exhaust. Ozone (O3) strongly promotes PM oxidation, the main product of which is carbon dioxide (CO2). Oxidising PM into the less harmful product (CO2) is the main objective, while the correlation between PM, O3 and CO2 is also considered. A dielectric barrier discharge reactor driven by pulsed power technology has been designed to produce plasma inside the diesel exhaust. To characterise the system under varied conditions, a range of applied voltages from 11 kVpp to 21 kVpp at repetition rates of 2.5, 5, 7.5 and 10 kHz has been experimentally investigated. The results show that increasing the applied voltage and repetition rate yields higher discharge power and CO2 dissociation. A PM removal efficiency of more than 50% has been achieved during the experiments, and high ozone concentrations of the order of a few hundred ppm have been observed at high discharge powers. Furthermore, the time dependence of O3, CO2 and PM concentrations at different plasma states has been analysed. Based on this analysis, an inverse relationship between ozone concentration and PM removal has been found, highlighting the role of ozone in PM removal in the plasma treatment of diesel exhaust.
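The reported dependence of discharge power on repetition rate follows from the basic relation average power = energy per pulse x repetition rate; the 10 mJ pulse energy below is an illustrative assumption, while the repetition rates are those tested in the study:

```python
def avg_discharge_power(energy_per_pulse_mj, rep_rate_khz):
    """Average discharge power in watts: P = E_pulse [J] * f [Hz]."""
    return (energy_per_pulse_mj * 1e-3) * (rep_rate_khz * 1e3)

# A hypothetical 10 mJ pulse at the repetition rates used in the experiments
# gives average powers from 25 W (2.5 kHz) up to 100 W (10 kHz).
powers = [avg_discharge_power(10.0, f) for f in (2.5, 5.0, 7.5, 10.0)]
```

The per-pulse energy itself rises with applied voltage, which is why voltage and repetition rate both drive the discharge power upward.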

Relevance: 20.00%

Publisher:

Abstract:

In 2005, Ginger Myles and Hongxia Jin proposed a software watermarking scheme based on converting jump instructions, or unconditional branch statements (UBSs), into calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and an integrity check. If the program is tampered with, the fingerprint and integrity checks change and the target address is no longer computed correctly. In this paper, we present an attack based on tracking stack pointer modifications to break the scheme, and provide implementation details. The key element of the attack is to remove the fingerprint and integrity-check generating code from the program after disassociating the target address from the fingerprint and integrity value. Using debugging tools that give the attacker extensive control to track stack pointer operations, we perform both subtractive and watermark replacement attacks. The major steps in the attack are automated, resulting in a fast and low-cost attack.
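The scheme's core idea, that a branch target only resolves correctly for untampered code, can be modelled in a few lines; the byte-sum fingerprint and table lookup below are a deliberately toy model of the FBF, not Myles and Jin's actual construction:

```python
def fingerprint(code):
    """Toy integrity check over the program bytes (a stand-in for the
    scheme's fingerprint and integrity computation)."""
    return sum(code) % 65536

class FingerprintBranch:
    """UBSs are replaced by calls that look up the branch target
    keyed by the current fingerprint of the code."""
    def __init__(self):
        self.table = {}

    def embed(self, code, target):
        self.table[fingerprint(code)] = target

    def branch(self, code):
        # Tampered code yields a different fingerprint and no valid target.
        return self.table.get(fingerprint(code))

code = bytearray(b"\x90\x90\xeb\x10")
fbf = FingerprintBranch()
fbf.embed(code, target=0x401000)
assert fbf.branch(code) == 0x401000
code[0] ^= 0xFF                   # tamper with one byte
assert fbf.branch(code) is None   # the branch target no longer resolves
```

The attack described above works by observing the resolved target once at runtime and then deleting the fingerprint-generating code, severing exactly this dependency.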

Relevance: 20.00%

Publisher:

Abstract:

This work describes the fabrication of nanostructured copper electrodes using a simple potential cycling protocol that involves oxidation and reduction of the surface in an alkaline solution. It was found that the inclusion of additives, such as benzyl alcohol and phenylacetic acid, has a profound effect on the surface oxidation process and the subsequent reduction of these oxides. This not only changes the surface morphology but also affects the electrocatalytic performance of the electrode for the reduction of nitrate ions. In all cases, the electrocatalytic performance of the restructured electrodes was significantly enhanced compared with the unmodified electrode. The most promising material was formed when phenylacetic acid was used as the additive. In addition, the reduction of residual oxides on the surface after the modification procedure, which exposes freshly active reaction sites before nitrate reduction, was found to be a significant factor in dictating the overall electrocatalytic activity. It is envisaged that this approach offers an interesting way to fabricate other nanostructured electrode surfaces.

Relevance: 20.00%

Publisher:

Abstract:

This project developed a quantitative method for determining the quality of the surgical alignment of the bone fragments after an ankle fracture. The research examined the feasibility of utilising MRI-based bone models versus the gold standard CT-based bone models in order to reduce the amount of ionising radiation the patient is exposed to. In doing so, the thesis reports that there is potential for MRI to be used instead of CT depending on the scanning parameters used to obtain the medical images, the distance of the implant relative to the joint surface, and the implant material.

Relevance: 20.00%

Publisher:

Abstract:

Efficient yet inexpensive electrocatalysts for the oxygen reduction reaction (ORR) are an essential component of renewable energy devices such as fuel cells and metal-air batteries. We herein interleaved novel Co3O4 nanosheets with graphene to develop the first sheet-on-sheet heterostructured electrocatalyst for the ORR, whose electrocatalytic activity outperformed the state-of-the-art commercial Pt/C with exceptional durability in alkaline solution. The composite demonstrates the highest activity among nonprecious metal electrocatalysts, including those derived from Co3O4 nanoparticle/nitrogen-doped graphene hybrids and carbon nanotube/nanoparticle composites. Density functional theory (DFT) calculations indicated that the outstanding performance originates from significant charge transfer from graphene to the Co3O4 nanosheets, which promotes electron transport through the whole structure. Theoretical calculations also revealed that the enhanced stability can be ascribed to the strong interaction between the two types of sheets.

Relevance: 20.00%

Publisher:

Abstract:

The concept of big data has already outperformed traditional data management efforts in almost all industries. In other instances, it has succeeded in obtaining promising results that derive value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics has become increasingly important for describing the data sets and analytical techniques in software applications that are very large and complex, owing to its significant advantages, including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. This has resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance sharing information while meeting the security requirements expected by stakeholders. Compared with other sectors, big data analysis in the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, applying Information Accountability measures to healthcare big data might be a practical solution in support of information security, privacy and traceability measures. Transparency is one important measure that can demonstrate integrity, a vital factor in healthcare services. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity and controversy about interpretation and, finally, liability [2].
According to current studies [3], Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values. Common healthcare information originates from, and is used by, different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often serve as an integrated service bundle. Although this is a critical requirement in healthcare services and analytics, it is difficult to find a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements. As a remedy, this research work therefore focuses on a systematic approach containing comprehensive guidelines on the accurate data that must be provided to apply and evaluate big data analysis until the necessary decision-making requirements for improving the quality of healthcare services are fulfilled. Hence, we believe that this approach would subsequently improve quality of life.
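The Information Accountability measures discussed here, traceability and transparency in particular, can be illustrated with a hash-chained audit log; this toy class is an illustrative assumption, not a mechanism proposed in the cited work:

```python
import hashlib
import json

class AuditLog:
    """Append-only, hash-chained access log: each entry commits to the
    previous one, so any tampering breaks verification downstream."""
    def __init__(self):
        self.entries = []

    def record(self, actor, action, record_id):
        prev = self.entries[-1]["digest"] if self.entries else "0" * 64
        body = {"actor": actor, "action": action,
                "record": record_id, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "digest": digest})

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("actor", "action", "record", "prev")}
            good = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["digest"] != good:
                return False
            prev = e["digest"]
        return True

log = AuditLog()
log.record("dr_smith", "read", "ehr-123")
log.record("nurse_j", "update", "ehr-123")
assert log.verify()
```

A log of this shape makes every access to an EHR attributable and tamper-evident, which is the traceability property that accountability measures are meant to provide.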