989 results for ink reduction software
Abstract:
Anatomically pre-contoured fracture fixation plates are a treatment option for bone fractures. A well-fitting plate can be used as a tool for anatomical reduction of the fractured bone. However, recent studies have shown that some plates fit poorly for many patients because of considerable shape variation between bones of the same anatomical site. The plates therefore have to be manually fitted and deformed by surgeons to fit each patient optimally. The process is time- and labor-intensive, and can lead to adverse clinical outcomes such as wound infection or plate failure. This paper proposes a new iterative method to simulate the patient-specific deformation of an optimally fitting plate for pre-operative planning purposes, and we validate the method through a case study. The proposed method integrates four commercially available software tools, Matlab, Rapidform2006, SolidWorks, and ANSYS, each performing specific tasks, to obtain a plate shape that fits an individual tibia optimally and is mechanically safe. A typical challenge when crossing multiple platforms is ensuring correct data transfer. We present an example implementation of the proposed method to demonstrate successful data transfer between the four platforms and the feasibility of the method.
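To make the cross-platform data-transfer idea concrete, here is a minimal sketch (in Python, not the authors' Matlab/Rapidform/SolidWorks/ANSYS pipeline) of writing a deformed plate surface to an ASCII STL file, a neutral exchange format that CAD and FEA packages can import; the geometry and file names are hypothetical.

```python
import numpy as np

def write_ascii_stl(path, vertices, faces, name="plate_surface"):
    """Write a triangulated surface to an ASCII STL file.

    STL is a neutral exchange format that CAD/FEA tools such as
    SolidWorks and ANSYS can import, which makes it a convenient
    hand-off point between platforms.
    """
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for i, j, k in faces:
            a, b, c = vertices[i], vertices[j], vertices[k]
            n = np.cross(b - a, c - a)
            n = n / (np.linalg.norm(n) or 1.0)  # unit facet normal
            f.write(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n")
            f.write("    outer loop\n")
            for v in (a, b, c):
                f.write(f"      vertex {v[0]:e} {v[1]:e} {v[2]:e}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# Hypothetical example: two triangles forming a small quad patch.
verts = np.array([[0, 0, 0], [10, 0, 0], [10, 5, 1], [0, 5, 1]], dtype=float)
tris = [(0, 1, 2), (0, 2, 3)]
write_ascii_stl("deformed_plate.stl", verts, tris)
```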
Abstract:
The standard method for deciding bit-vector constraints is eager reduction to propositional logic, usually after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by performing many satisfiability checks, each of which reasons about only a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and an eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems that none of the eager solvers can, and that the portfolio solver outperforms other solvers both in the total number of problems solved and in the time taken to solve them.
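At its simplest, such a portfolio launches both solvers on the same input and takes the first verdict produced. The sketch below illustrates the idea with hypothetical solver command lines (`lazy_bv_solve`, `eager_bv_solve`); it is not the authors' implementation.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

# Hypothetical command lines for a lazy and an eager bit-vector solver.
SOLVERS = {
    "lazy":  ["lazy_bv_solve", "problem.smt2"],
    "eager": ["eager_bv_solve", "problem.smt2"],
}

def run(name, cmd, timeout=300):
    """Run one solver and return (solver name, its sat/unsat/unknown verdict)."""
    out = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
    return name, out.stdout.strip()

def portfolio():
    """Launch both solvers in parallel and report the first verdict produced.

    A real portfolio would also kill the slower process; here it is simply
    left to finish in the background.
    """
    pool = ThreadPoolExecutor(max_workers=len(SOLVERS))
    futures = [pool.submit(run, name, cmd) for name, cmd in SOLVERS.items()]
    done, _ = wait(futures, return_when=FIRST_COMPLETED)
    pool.shutdown(wait=False)
    return next(iter(done)).result()

if __name__ == "__main__":
    name, verdict = portfolio()
    print(f"{name} answered first: {verdict}")
```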
Abstract:
Utilities worldwide are focused on supplying peak electricity demand reliably and cost-effectively, which requires a thorough understanding of all the factors influencing residential electricity use at peak times. An electricity demand reduction project based on comprehensive residential consumer engagement was established within an Australian community in 2008, and by 2011 peak demand had decreased to below pre-intervention levels. This paper applied field data gathered through qualitative in-depth interviews with 22 residential households in the community to a Bayesian Network complex system model, to examine whether the model could explain the successful peak demand reduction in the case study location. The knowledge gained from insights into the major influential factors, and into the potential impact of changes to these factors on peak demand, would underpin demand reduction intervention strategies for a wider target group.
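For readers unfamiliar with the modelling approach, the following is a minimal sketch of a discrete Bayesian Network in Python using the pgmpy library; the variables, structure and probabilities are hypothetical placeholders for the interview-derived factors, not the authors' model.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical structure: consumer engagement and hot weather influence
# appliance use at peak times, which in turn drives peak demand.
model = BayesianNetwork([
    ("Engagement", "PeakApplianceUse"),
    ("HotWeather", "PeakApplianceUse"),
    ("PeakApplianceUse", "HighPeakDemand"),
])

cpd_eng = TabularCPD("Engagement", 2, [[0.6], [0.4]])   # P(not engaged), P(engaged)
cpd_hot = TabularCPD("HotWeather", 2, [[0.7], [0.3]])   # P(mild day), P(hot day)
cpd_use = TabularCPD(
    "PeakApplianceUse", 2,
    # columns: (Engagement, HotWeather) = (0,0), (0,1), (1,0), (1,1)
    [[0.5, 0.2, 0.8, 0.5],   # low use
     [0.5, 0.8, 0.2, 0.5]],  # high use
    evidence=["Engagement", "HotWeather"], evidence_card=[2, 2])
cpd_dem = TabularCPD(
    "HighPeakDemand", 2,
    [[0.9, 0.3],   # P(demand low | use low), P(demand low | use high)
     [0.1, 0.7]],
    evidence=["PeakApplianceUse"], evidence_card=[2])

model.add_cpds(cpd_eng, cpd_hot, cpd_use, cpd_dem)
assert model.check_model()

# Query: how does engagement change the probability of high peak demand?
infer = VariableElimination(model)
print(infer.query(["HighPeakDemand"], evidence={"Engagement": 1}))
```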
Abstract:
This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing and realistic off-line simulation, which allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
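One common way to obtain this kind of off-line testability is to hide the hardware behind a narrow interface, so the same control logic can run against either the real machine or a simulator. The following is a minimal, hypothetical sketch of that pattern; it is not the project's actual architecture.

```python
from abc import ABC, abstractmethod

class MachineInterface(ABC):
    """Abstraction over the machine's sensors and actuators."""

    @abstractmethod
    def read_boom_angle(self) -> float: ...

    @abstractmethod
    def command_boom(self, target_angle: float) -> None: ...

class SimulatedMachine(MachineInterface):
    """Off-line stand-in used for laboratory testing."""

    def __init__(self):
        self.angle = 0.0

    def read_boom_angle(self) -> float:
        return self.angle

    def command_boom(self, target_angle: float) -> None:
        # Crude first-order response; a realistic simulator would also
        # model dynamics, noise and fault conditions.
        self.angle += 0.5 * (target_angle - self.angle)

def move_boom_to(machine: MachineInterface, target: float, tol: float = 0.1):
    """Control logic that is identical on the simulator and in the field."""
    while abs(machine.read_boom_angle() - target) > tol:
        machine.command_boom(target)

if __name__ == "__main__":
    sim = SimulatedMachine()
    move_boom_to(sim, 30.0)
    print(f"boom settled at {sim.read_boom_angle():.2f} degrees")
```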
Abstract:
This study investigates the effect of non-thermal plasma technology on the abatement of particulate matter (PM) from actual diesel exhaust. Ozone (O3) strongly promotes PM oxidation, the main product of which is carbon dioxide (CO2). Oxidising PM into the less harmful product (CO2) is the main objective, while the correlation between PM, O3 and CO2 is also considered. A dielectric barrier discharge reactor driven by pulsed power technology was designed to produce plasma inside the diesel exhaust. To characterise the system under varied conditions, applied voltages from 11 kVpp to 21 kVpp at repetition rates of 2.5, 5, 7.5 and 10 kHz were experimentally investigated. The results show that increasing the applied voltage and repetition rate yields higher discharge power and CO2 dissociation. A PM removal efficiency of more than 50% was achieved during the experiments, and high ozone concentrations on the order of a few hundred ppm were observed at high discharge powers. Furthermore, the time dependence of O3, CO2 and PM concentrations at different plasma states was analysed. Based on this analysis, an inverse relationship between ozone concentration and PM removal was found, highlighting the role of ozone in PM removal during plasma treatment of diesel exhaust.
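As background on how discharge power is commonly estimated in DBD experiments (not necessarily the method used in this study), the Lissajous approach takes the energy per cycle as the area of the charge-voltage loop and multiplies it by the repetition rate. A minimal sketch with hypothetical waveform data:

```python
import numpy as np

def lissajous_power(voltage, charge, frequency):
    """Discharge power from a Q-V Lissajous figure.

    Energy per cycle = area enclosed by the charge-voltage loop
    (shoelace formula); power = energy per cycle * repetition rate.
    """
    area = 0.5 * abs(np.dot(voltage, np.roll(charge, -1))
                     - np.dot(charge, np.roll(voltage, -1)))
    return area * frequency

# Hypothetical waveforms for one cycle (an elliptical Q-V loop).
t = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
v = 10.5e3 * np.sin(t)          # applied voltage in V (about 21 kVpp)
q = 0.4e-6 * np.sin(t - 0.5)    # measured charge in C

print(f"discharge power ~ {lissajous_power(v, q, 10e3):.1f} W at 10 kHz")
```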
Abstract:
In 2005, Ginger Myles and Hongxia Jin proposed a software watermarking scheme based on converting jump instructions, or unconditional branch statements (UBSs), into calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and an integrity check. If the program is tampered with, the fingerprint and integrity check values change and the target address is no longer computed correctly. In this paper, we present an attack based on tracking stack pointer modifications that breaks the scheme, and we provide implementation details. The key element of the attack is to remove the fingerprint- and integrity-check-generating code from the program after disassociating the target address from the fingerprint and integrity values. Using debugging tools that give the attacker extensive control to track stack pointer operations, we perform both subtractive and watermark replacement attacks. The major steps of the attack are automated, resulting in a fast, low-cost attack.
Abstract:
This work describes the fabrication of nanostructured copper electrodes using a simple potential cycling protocol that involves oxidation and reduction of the surface in an alkaline solution. It was found that the inclusion of additives, such as benzyl alcohol and phenylacetic acid, has a profound effect on the surface oxidation process and the subsequent reduction of these oxides. This not only changes the morphology but also affects the electrocatalytic performance of the electrode for the reduction of nitrate ions. In all cases, the electrocatalytic performance of the restructured electrodes was significantly enhanced compared with the unmodified electrode. The most promising material was formed when phenylacetic acid was used as the additive. In addition, the reduction of residual surface oxides after the modification procedure, which exposes freshly active reaction sites before nitrate reduction, was found to be a significant factor in dictating the overall electrocatalytic activity. It is envisaged that this approach offers an interesting way to fabricate other nanostructured electrode surfaces.
Abstract:
This project developed a quantitative method for determining the quality of the surgical alignment of bone fragments after an ankle fracture. The research examined the feasibility of using MRI-based bone models instead of the gold-standard CT-based bone models in order to reduce the amount of ionising radiation the patient is exposed to. The thesis reports that MRI has the potential to be used instead of CT, depending on the scanning parameters used to obtain the medical images, the distance of the implant from the joint surface, and the implant material.
Abstract:
Efficient yet inexpensive electrocatalysts for the oxygen reduction reaction (ORR) are an essential component of renewable energy devices such as fuel cells and metal-air batteries. We interleaved novel Co3O4 nanosheets with graphene to develop the first sheet-on-sheet heterostructured electrocatalyst for the ORR, whose electrocatalytic activity outperformed the state-of-the-art commercial Pt/C with exceptional durability in alkaline solution. The composite demonstrates the highest activity among nonprecious-metal electrocatalysts, including those derived from Co3O4 nanoparticle/nitrogen-doped graphene hybrids and carbon nanotube/nanoparticle composites. Density functional theory (DFT) calculations indicated that the outstanding performance originates from significant charge transfer from graphene to the Co3O4 nanosheets, which promotes electron transport through the whole structure. The calculations also revealed that the enhanced stability can be ascribed to the strong interaction generated between the two types of sheets.
Abstract:
The concept of big data has already outgrown traditional data management efforts in almost all industries. In other instances it has succeeded in obtaining promising results that derive value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics has become increasingly important for describing data sets and analytical techniques in software applications that are very large and complex, owing to its significant advantages, including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. These have resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance the sharing of information subject to the security requirements expected by stakeholders. Compared with other sectors, big data analysis in the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the face of the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, the implementation of Information Accountability measures for healthcare big data might be a practical solution in support of information security, privacy and traceability measures. Transparency is one important measure that can demonstrate integrity, which is a vital factor in healthcare services. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity and controversy about interpretation and, finally, liability [2]. According to current studies [3], Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values [3]. Common healthcare information originates from and is used by different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often serve as an integrated service bundle. Although a critical requirement in healthcare services and analytics, a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements is difficult to find. As a remedy, this research therefore focuses on a systematic approach containing comprehensive guidelines on the accurate data that must be provided in order to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled, so as to improve the quality of healthcare services. Hence, we believe that this approach would subsequently improve quality of life.
Abstract:
The reduction of meso-formyl derivatives of 5,15-diaryl- and 5,10,15-triphenylporphyrin (and their nickel(II) complexes) to the corresponding meso-methyl porphyrins is achieved in high yield by microwave heating of the substrate in dimethylformamide (DMF) in the presence of acids such as trifluoroacetic acid, or even just with added water. The reactions are complete in less than 30 min at 250 °C. The reaction is strongly suppressed in very dry DMF in the absence of added acid. The meso-hydroxymethyl porphyrins are also reduced to the methyl derivatives, suggesting the primary alcohols may be intermediates in the exhaustive reduction. UV-visible spectra taken at intervals during reaction at 240 °C indicated that at least one other intermediate is present, but it was not identified. In d7-DMF, the methylporphyrin isolated was mainly Por-CD2H, showing that both of the added hydrogens arise from the solvent, and not from the added water or acid.
Abstract:
Although increased particulate air pollution has been consistently associated with elevated mortality, evidence on whether diminished particulate air pollution leads to reduced mortality is limited. The citywide air pollution mitigation program during the 2010 Asian Games in Guangzhou, China, provided such an opportunity. Daily mortality from non-accidental, cardiovascular and respiratory diseases was compared for 51 intervention days (November 1–December 21) in 2010 with the same calendar dates of the baseline years (2006–2009 and 2011). Relative risks (RR) and 95% confidence intervals (95% CI) were estimated using a time series Poisson model, adjusting for day of week, public holidays, daily mean temperature and relative humidity. Daily PM10 (particles with aerodynamic diameter less than 10 μm) decreased from 88.64 μg/m3 during the baseline period to 80.61 μg/m3 during the Asian Games period. Other measured air pollutants and weather variables did not differ substantially. Daily mortality from non-accidental, cardiovascular and respiratory diseases decreased from 32, 11 and 6 deaths during the baseline period to 25, 8 and 5 during the Games period; the corresponding RRs for the Games period compared with the baseline period were 0.79 (95% CI: 0.73–0.86), 0.77 (95% CI: 0.66–0.89) and 0.68 (95% CI: 0.57–0.80), respectively. No significant decreases were observed in other months of 2010 in Guangzhou, or during the intervention period in two control cities. These findings support efforts to reduce air pollution and improve public health through transportation restrictions and industrial emission controls.
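A time series Poisson model of this kind can be sketched with Python's statsmodels; the data frame, column names and covariates below are hypothetical placeholders, not the study's data. The exponentiated coefficient on the intervention indicator gives the relative risk.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily data: non-accidental death counts, an indicator for
# the Games intervention period, and the adjustment covariates.
rng = np.random.default_rng(0)
n = 365
df = pd.DataFrame({
    "deaths":   rng.poisson(30, n),
    "games":    (np.arange(n) >= 305).astype(int),   # intervention indicator
    "dow":      pd.Categorical(np.arange(n) % 7),    # day of week
    "holiday":  rng.integers(0, 2, n),
    "temp":     20 + 8 * np.sin(2 * np.pi * np.arange(n) / 365),
    "humidity": rng.uniform(50, 90, n),
})

# Poisson regression adjusting for day of week, holidays, temperature
# and humidity; exp(coefficient on `games`) is the relative risk.
model = smf.glm("deaths ~ games + dow + holiday + temp + humidity",
                data=df, family=sm.families.Poisson()).fit()

rr = np.exp(model.params["games"])
ci = np.exp(model.conf_int().loc["games"])
print(f"RR = {rr:.2f} (95% CI: {ci[0]:.2f}-{ci[1]:.2f})")
```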
Abstract:
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate current software, however, requires sequence alignment or complex calculations, which are time-consuming for large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for accurate and scalable species identification in DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is used to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform can handle both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
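To illustrate the two-stage idea (this is a toy sketch, not VIP Barcoding's actual code), the example below screens a reference set with alignment-free k-mer composition vectors and then applies the K2P distance to the shortlisted candidates, assuming the shortlisted sequences are already aligned with the query; all sequences and parameters are hypothetical.

```python
import math
from collections import Counter

def composition_vector(seq, k=4):
    """Normalised k-mer frequency vector (alignment-free)."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values()) or 1
    return {kmer: n / total for kmer, n in counts.items()}

def cosine_distance(u, v):
    dot = sum(u.get(x, 0) * v.get(x, 0) for x in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return 1 - dot / (nu * nv or 1)

def k2p_distance(a, b):
    """Kimura 2-parameter distance between two aligned sequences."""
    transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
    n = p = q = 0
    for x, y in zip(a, b):
        if x == "-" or y == "-":
            continue
        n += 1
        if x != y:
            if (x, y) in transitions:
                p += 1   # transition
            else:
                q += 1   # transversion
    P, Q = p / n, q / n
    return -0.5 * math.log((1 - 2 * P - Q) * math.sqrt(1 - 2 * Q))

def identify(query, reference, k=4, shortlist=3):
    """Stage 1: CV screening; Stage 2: K2P nearest neighbour."""
    qv = composition_vector(query, k)
    ranked = sorted(reference, key=lambda name:
                    cosine_distance(qv, composition_vector(reference[name], k)))
    candidates = ranked[:shortlist]
    return min(candidates, key=lambda name: k2p_distance(query, reference[name]))

# Hypothetical toy reference database (aligned, equal-length sequences).
ref = {
    "species_A": "ATGCTAGCTAGGCTAACGTT",
    "species_B": "ATGCTGGCTAGGCTAACGTA",
    "species_C": "TTGCAAGCTTGGCTGACGTT",
}
print(identify("ATGCTAGCTAGGCTAACGTA", ref))
```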
Abstract:
We investigate the concept of terminating BKZ reduction, first introduced by Hanrot et al. [Crypto'11], and carry out extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. We then improve Buchmann and Lindner's result [Indocrypt'09] for finding sub-lattice collisions in SWIFFT. We show that further improvement in running time is possible through special settings of the SWIFFT parameters and through adaptively combining different reduction parameters. Our contributions also include a probabilistic simulation approach that tops up the deterministic simulation described by Chen and Nguyen [Asiacrypt'11] and can predict the Gram-Schmidt norms more accurately for large block sizes.
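In practice, terminating BKZ means fixing the block size and capping the number of tours rather than running to convergence. The following minimal sketch assumes the fpylll library's interface for BKZ with a tour limit on a random q-ary basis; the parameters are illustrative and are not those used in the paper.

```python
from fpylll import IntegerMatrix, LLL, BKZ

# Random q-ary lattice basis (illustrative dimensions, not SWIFFT's).
A = IntegerMatrix.random(80, "qary", k=40, bits=20)
LLL.reduction(A)

# Terminating BKZ: fix the block size and cap the number of tours,
# trading reduction quality against running time.
params = BKZ.Param(block_size=20, max_loops=8,
                   flags=BKZ.MAX_LOOPS | BKZ.VERBOSE)
BKZ.reduction(A, params)
```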