912 results for Simulator of Performance in Error


Relevance: 100.00%

Publisher:

Abstract:

For some new applications of metals in functional devices, metals of high purity are required. In recent years, many high-purity metals have been produced commercially for use in electronics, but the demand for ultra-high-purity metals is increasing rapidly because of more stringent specifications for materials used in high-performance information devices.

Relevance: 100.00%

Publisher:

Abstract:

Infrared spectra of atmospherically and astronomically important dimethylphenanthrenes (DMPs), namely 1,9-DMP, 2,4-DMP, and 3,9-DMP, were recorded in the gas phase from 400 to 4000 cm(-1) at a resolution of 0.5 cm(-1) and 110 degrees C using a 7.2 m gas cell. DFT calculations at the B3LYP/6-311G** level were carried out to obtain the harmonic and anharmonic frequencies and their corresponding intensities for the assignment of the observed bands. However, spectral assignments could not be made unambiguously using anharmonic or selectively scaled harmonic frequencies. Therefore, the scaled quantum mechanical (SQM) force field analysis method was adopted to achieve more accurate assignments. In this method, force constants rather than frequencies are scaled. The Cartesian force constant matrix obtained from the Gaussian calculations was converted to a nonredundant local-coordinate force constant matrix, and the force constants were then scaled to match experimental frequencies in a consistent manner using a modified version of the UMAT program of the QCPE package. Potential energy distributions (PEDs) of the normal modes in terms of nonredundant local coordinates obtained from these calculations helped us determine the nature of the vibration at each frequency. The intensities of the observed bands in the experimental spectra were calculated using estimated vapor pressures of the DMPs. An error analysis of the mean deviation between experimental and calculated intensities reveals that the observed methyl C-H stretching intensity deviates more than the aromatic C-H and non-C-H stretching bands.
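As a toy illustration of the SQM idea of scaling force constants rather than frequencies, the sketch below scales a made-up two-coordinate force field by the geometric mean of per-coordinate scale factors and recomputes the harmonic frequencies. The matrix, masses, and scale factors are invented for illustration, not taken from the DMP calculations.

```python
import numpy as np

def scale_force_field(F, scales):
    # Pulay-style SQM scaling: each force constant F_ij is multiplied by
    # the geometric mean of the scale factors of coordinates i and j
    return F * np.sqrt(np.outer(scales, scales))

def harmonic_frequencies(F, masses):
    # mass-weight the force-constant matrix and diagonalize; harmonic
    # frequencies are proportional to the square roots of the eigenvalues
    Minv = np.diag(1.0 / np.sqrt(masses))
    eigvals = np.linalg.eigvalsh(Minv @ F @ Minv)
    return np.sqrt(np.clip(eigvals, 0.0, None))

# toy diagonal force field (arbitrary units) and unit masses
F = np.array([[4.0, 0.0],
              [0.0, 9.0]])
freqs = harmonic_frequencies(scale_force_field(F, [0.25, 1.0]), [1.0, 1.0])
```

In practice the scale factors are fitted so that the computed frequencies match the experimental ones, which is the role the modified UMAT program plays in the paper.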

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we estimate the solution of the electromigration diffusion equation (EMDE) in isotopically pure and impure metallic single-walled carbon nanotubes (SWCNTs) by considering self-heating. The EMDE for an SWCNT has been solved not only by invoking the dependence of the electromigration flux on the usual applied static electric field across its two ends, but also by considering a temperature-dependent thermal conductivity (κ), which results in a variable temperature distribution (T) along its length due to self-heating. By changing the length and isotopic impurity, we demonstrate that a significant deviation occurs in the SWCNT electromigration performance. If κ is instead assumed to be temperature independent, the solution may lead to serious errors in performance estimation. We further exhibit a tradeoff between the effects of length and impurity on electromigration performance. It is suggested that, to reduce the vacancy concentration in longer interconnects of a few micrometers, one should opt for an isotopically impure SWCNT at the cost of lower κ, whereas for comparatively short interconnects a pure SWCNT should be used. The tradeoff presented here can be treated as a way of obtaining a fairly good estimate of the vacancy concentration and mean time to failure in bundles of CNT-based interconnects. © 2012 IEEE.
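The role of a temperature-dependent κ can be sketched with a one-dimensional steady-state heat-conduction solver: a κ(T) law couples the conductivity to the temperature profile it produces, so the profile must be found iteratively. The geometry, heat-generation rate, and κ(T) law below are placeholder values, not the paper's actual SWCNT parameters.

```python
import numpy as np

def steady_temperature(L=1e-6, n=101, q=1e14, T0=300.0,
                       kappa=lambda T: 3000.0 * (300.0 / T)):
    # Self-heating along the tube: solve d/dx(kappa(T) dT/dx) = -q with
    # fixed ends T(0) = T(L) = T0, by Picard iteration on a uniform grid.
    x = np.linspace(0.0, L, n)
    h = x[1] - x[0]
    T = np.full(n, T0)
    for _ in range(200):
        k = kappa(T)
        kf = 0.5 * (k[:-1] + k[1:])      # conductivities at cell faces
        A = np.zeros((n, n))
        b = np.full(n, -q * h * h)
        A[0, 0] = A[-1, -1] = 1.0        # Dirichlet boundary rows
        b[0] = b[-1] = T0
        for i in range(1, n - 1):
            A[i, i - 1] = kf[i - 1]
            A[i, i] = -(kf[i - 1] + kf[i])
            A[i, i + 1] = kf[i]
        T_new = np.linalg.solve(A, b)
        if np.max(np.abs(T_new - T)) < 1e-9:
            T = T_new
            break
        T = T_new
    return x, T
```

With a constant κ the profile reduces to the analytic parabola T0 + q·x(L-x)/(2κ), which makes the solver easy to sanity-check; with a 1/T-type κ the midpoint runs hotter, which is the deviation the paper warns about.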

Relevance: 100.00%

Publisher:

Abstract:

Data prefetchers identify and exploit any regularity present in the history/training stream to predict future references and prefetch them into the cache. The training information used is typically the primary misses seen at a particular cache level, which is a filtered version of the accesses seen by the cache. In this work we demonstrate that extending the training information to include secondary misses and hits, along with primary misses, helps improve the performance of prefetchers. In addition to empirical evaluation, we use the information-theoretic metric entropy to quantify the regularity present in extended histories. Entropy measurements indicate that extended histories are more regular than the default primary-miss-only training stream, and they also corroborate our empirical findings. With extended histories, further benefits can be achieved by triggering prefetches on secondary misses as well. In this paper we explore the design space of extended prefetch histories and alternative prefetch trigger points for delta-correlation prefetchers. We observe that different prefetch schemes benefit to different extents from extended histories and alternative trigger points, and the best-performing design point varies on a per-benchmark basis. To meet these requirements, we propose a simple adaptive scheme that identifies the best-performing design point for a benchmark-prefetcher combination at runtime. On SPEC2000 benchmarks, using all L2 accesses as history for the prefetcher improves performance, in terms of both IPC and misses reduced, over techniques that use only primary misses as history. The adaptive scheme improves the performance of the CZone prefetcher over the baseline by 4.6% on average. These performance gains are accompanied by a moderate reduction in memory traffic requirements.
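The entropy measurement can be sketched as follows: treat the history as a stream of address deltas and compute its first-order entropy, where a lower value indicates a more regular, more prefetchable stream. The two streams below are synthetic examples, not SPEC2000 traces.

```python
import math
from collections import Counter

def delta_entropy(addresses):
    # first-order entropy (bits/symbol) of the address-delta stream;
    # 0 means a perfectly regular, single-stride history
    deltas = [b - a for a, b in zip(addresses, addresses[1:])]
    counts = Counter(deltas)
    n = len(deltas)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

regular = list(range(0, 4096, 64))            # pure 64-byte-stride stream
noisy = [a + (17 if i % 3 == 0 else 0)        # same stride with perturbations
         for i, a in enumerate(regular)]
```

Feeding an extended history (hits and secondary misses included) through the same function is how one would check the paper's claim that extended histories are more regular, i.e. lower-entropy, than the primary-miss-only stream.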

Relevance: 100.00%

Publisher:

Abstract:

In this paper, an optical code-division multiple-access (O-CDMA) packet network, which offers inherent security in access networks, is considered. Two types of random access protocols are proposed for packet transmission. In protocol 1, all distinct codes are used; in protocol 2, the distinct codes as well as shifted versions of all these codes are used. O-CDMA network performance is analyzed using one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes. The main advantage of using 2-D codes instead of 1-D codes is the reduction of errors due to multiple-access interference among different users. Both the correlation receiver and the chip-level receiver are considered in the analysis. Using an analytical model, we compute and compare the packet-success probability and throughput for OOC and SPR codes in an O-CDMA network; the analysis shows improved performance with SPR codes compared to OOC codes.
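As a rough illustration of the kind of analysis involved, the sketch below evaluates bit- and packet-success probability for a correlation receiver under a standard chip-synchronous OOC interference approximation. The code weight, code length, and packet size are made-up values, and this textbook simplification is not the paper's exact analysis.

```python
from math import comb

def bit_error_prob(K, w, F):
    # chip-synchronous OOC approximation (an assumed textbook model):
    # each of the K-1 interferers hits the desired code with probability
    # q = w^2 / (2F); a threshold-w correlation receiver can err only
    # when at least w interferers hit, and only for a data bit of 0.
    q = w * w / (2.0 * F)
    return 0.5 * sum(comb(K - 1, i) * q**i * (1 - q)**(K - 1 - i)
                     for i in range(w, K))

def packet_success_prob(K, w, F, L):
    # a packet of L bits succeeds only if every bit is received correctly
    return (1.0 - bit_error_prob(K, w, F)) ** L
```

Throughput then follows by weighting the packet-success probability by the offered load of the chosen random access protocol.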

Relevance: 100.00%

Publisher:

Abstract:

Primates exhibit laterality in hand usage either in terms of (a) the hand with which an individual solves a task or, while solving a task that requires both hands, executes the most complex action, that is, hand preference, or (b) the hand with which an individual executes actions most efficiently, that is, hand performance. Observations from previous studies indicate that laterality in hand usage might reflect specialization of the two hands for accomplishing tasks that require maneuvering dexterity or physical strength. However, no existing study has investigated handedness with regard to this possibility. In this study, we examined laterality in hand usage in urban free-ranging bonnet macaques, Macaca radiata, with regard to the above possibility. While solving four distinct food-extraction tasks, which varied in the number of steps involved in the food-extraction process and the dexterity required in executing the individual steps, the macaques consistently used one hand for extracting food (the task requiring maneuvering dexterity), termed the maneuvering hand, and the other hand for supporting the body (the task requiring physical strength), termed the supporting hand. Analogously, the macaques used the maneuvering hand for spontaneous routine activities that involved maneuvering in three-dimensional space, such as grooming and hitting an opponent during an agonistic interaction, and the supporting hand for those that required physical strength, such as pulling the body up while climbing. Moreover, while solving a task that ergonomically forced the usage of a particular hand, the macaques extracted food faster with the maneuvering hand than with the supporting hand, demonstrating its higher maneuvering dexterity.
As opposed to conventional ideas of handedness in non-human primates, these observations demonstrate a division of labor between the two hands, marked by their consistent usage across spontaneous and experimental tasks requiring maneuvering in three-dimensional space or physical strength. Am. J. Primatol. 76:576-585, 2014. (c) 2013 Wiley Periodicals, Inc.

Relevance: 100.00%

Publisher:

Abstract:

The design methodology for flexible pavements needs to address the mechanisms of pavement failure and loading intensities, and also develop suitable approaches for the evaluation of pavement performance. In recent years, the use of geocells to improve pavement performance has been receiving considerable attention. This paper studies the influence of geocells on the required thickness of pavements by placing them below the granular layers (base and sub-base) and above the subgrade. The reduction in thickness here refers to the reduction in the thickness of the GSB (Granular Sub-base) layer, with the possibility of eliminating it altogether. To facilitate the analysis, a simple linear elastic approach is used, considering six of the sections given in the Indian Roads Congress (IRC) code. All analyses were carried out using the pavement analysis package KENPAVE. The results show that the use of geocells enables a reduction in pavement thickness.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents experience with a new design that uses impinging-jet spray columns for scrubbing hydrogen sulfide from biogas, developed and patented by the Indian Institute of Science. The process uses a chelated polyvalent metal ion which oxidizes the hydrogen sulfide to sulfur as a precipitate. The sulfur generated is filtered and the scrubbing liquid recycled after oxidation. The process brings the sour gas into contact with the chelated liquid in the spray columns, where H2S reacts with chelated Fe3+ and precipitates as sulfur, while the Fe3+ is reduced to Fe2+. The Fe2+ is regenerated to Fe3+ by reaction with oxygen from air in a separate packed column. The regenerated liquid is recirculated, and the sulfur is filtered and separated as a byproduct. The paper presents the experience of using the spray towers for hydrogen sulfide removal and the further use of the clean gas for generating power in gas engines. The maximum allowable limit of H2S for the gas engine is 200 ppm (v/v), in order to prevent corrosion of engine parts and fouling of the lubricating oil. With the current ISET process, the hydrogen sulfide in the biogas is reduced to less than 100 ppm (v/v) and the sweet gas is used for power generation. The system is designed for 550 Nm3/hr of biogas and an inlet H2S concentration of 2.5%. The actual inlet concentration of H2S is about 1 - 1.5% and the average measured outlet concentration is about 30 ppm, at an average gas flow of about 300 - 350 Nm3/hr, which is the current gas production rate. The sweet gas is used for power generation in a 1.2 MWe V12 engine. The average power generation is about 650 - 750 kWe, which is the captive load of the industry. The plant is a CHP (combined heat and power) unit, with heat from the cylinder cooling and the flue gas being recovered for hot water and steam generation, respectively. The specific fuel consumption is 2.29 kWh/m(3) of gas.
The system has been in operation for more than 13,000 hours over the last year in the industry. About 8.4 million units of electricity have been generated while scrubbing about 2.1 million m3 of gas. The performance of the scrubber and the engine is discussed both at the daily level and overall, along with the environmental benefit of precipitating over 27 tons of sulfur.
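The reported sulfur recovery can be cross-checked with a back-of-envelope mass balance. The assumed average inlet concentration (1%) and molar volume (24 L/mol at ambient conditions) are rough choices of ours, so this is an order-of-magnitude check rather than a precise figure.

```python
# back-of-envelope: sulfur recoverable from the scrubbed biogas
gas_scrubbed_m3 = 2.1e6    # total biogas scrubbed (from the text)
h2s_fraction = 0.01        # assumed average inlet H2S (text: 1 - 1.5 %)
molar_volume_l = 24.0      # assumed molar volume, L/mol at ambient temperature
s_molar_mass_g = 32.06     # molar mass of sulfur, g/mol

h2s_mol = gas_scrubbed_m3 * h2s_fraction * 1000.0 / molar_volume_l
sulfur_tonnes = h2s_mol * s_molar_mass_g / 1e6
```

This lands in the high twenties of tonnes, consistent with the reported figure of over 27 tons, as expected since nearly all inlet H2S (down to roughly 30 ppm) is captured as precipitate.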

Relevance: 100.00%

Publisher:

Abstract:

A simple yet remarkable electrochemically activated carbon paste electrode (EACPE) was prepared by successive potential cycling of carbon paste in a 0.1 M NaOH solution and was effectively used for the simultaneous determination of catecholamines such as dopamine (DA), epinephrine (E) and norepinephrine (NE) in the presence of uric acid (UA) and ascorbic acid (AA). Taking DA as the model catecholamine, the electrochemical behavior of DA, UA and AA with respect to scan rate and pH variation was studied by cyclic voltammetry (CV) in phosphate buffer solution (PBS, pH 7.1). This electrochemical sensor exhibited strong electrocatalytic activity towards the oxidation of a mixture of catecholamines, UA and AA, with an apparent reduction of overpotentials. Under optimum conditions, the limits of detection (S/N = 3) of DA, E, NE, UA and AA were found to be 0.08, 0.08, 0.07, 0.1 and 6.0 mu M, respectively, by differential pulse voltammetry (DPV). The analytical performance of this modified electrode as a biosensor was also demonstrated for the determination of DA, UA and AA in dopamine injection, human urine and vitamin C tablets, respectively, in the presence of other interfering substances. (C) 2015 The Electrochemical Society. All rights reserved.

Relevance: 100.00%

Publisher:

Abstract:

Tracking systems that continually orient photovoltaic (PV) panels towards the Sun are expected to increase the power output from the PV panels. A tremendous amount of research is being done, and funds are being spent, to increase the efficiency of PV cells so that they generate more power. We report the performance of two almost identical PV systems: one at a fixed latitude tilt and the other on a two-axis tracker. We observed that the fixed-axis PV panels generated 336.3 kWh and the dual-axis Sun-tracked PV panels generated 407.2 kWh during August 2012 to March 2013. The tracked panels generated 21.2% more electricity than the fixed-axis panels at the optimum tilt angle. The cost payback calculations indicate that the additional cost of the tracker can be recovered in 450 days.
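The relative-gain and payback logic can be sketched from the two reported energy totals. The period length is approximate, and the tariff and incremental tracker cost in the payback function are hypothetical placeholders, since the abstract does not restate the paper's cost inputs.

```python
fixed_kwh = 336.3     # fixed latitude-tilt panels (Aug 2012 - Mar 2013)
tracked_kwh = 407.2   # dual-axis tracked panels, same period
period_days = 243     # approximate length of Aug 2012 - Mar 2013

# percentage gain of the tracked system over the fixed system
gain_pct = (tracked_kwh - fixed_kwh) / fixed_kwh * 100.0

def payback_days(extra_kwh, days, tariff_per_kwh, extra_tracker_cost):
    # hypothetical tariff and tracker cost: days until the extra energy
    # revenue pays off the incremental cost of the tracking hardware
    daily_revenue = extra_kwh / days * tariff_per_kwh
    return extra_tracker_cost / daily_revenue
```

With the paper's actual tariff and tracker cost substituted in, this is the calculation behind the reported 450-day recovery period.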

Relevance: 100.00%

Publisher:

Abstract:

A distributed system has many servers in order to attain increased availability of service and fault tolerance. Balancing the load among these servers is an important task in achieving better performance. Various hardware- and software-based load balancing solutions are available. However, there is always an overhead on the servers and the load balancer while they communicate with each other and share their availability and current load status information. The load balancer is always busy listening to clients' requests and redirecting them; it also needs to collect the servers' availability status frequently to keep itself up to date. The servers are busy not only providing service to clients but also sharing their current load information with the load balancing algorithms. In this paper we propose and discuss the concept and system model of a software-based load balancer with Availability-Checker and Load Reporters (LB-ACLRs), which reduces the overhead on the servers and the load balancer. We also describe the architectural components with their roles and responsibilities, and present a detailed analysis showing how the proposed Availability Checker significantly increases the performance of the system.
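A minimal sketch of the idea, assuming a push model in which load reporters on each server send load figures to the balancer and an availability check treats silent servers as dead. All class and method names here are invented for illustration and are not from the paper.

```python
import time

class LoadBalancer:
    def __init__(self, timeout=5.0):
        self.timeout = timeout   # seconds of silence before a server is dead
        self.status = {}         # server -> (load, time of last report)

    def report(self, server, load, now=None):
        # called by a Load Reporter running on each server
        self.status[server] = (load, time.time() if now is None else now)

    def alive(self, now=None):
        # the Availability Checker's view: servers that reported recently
        now = time.time() if now is None else now
        return {s: l for s, (l, t) in self.status.items()
                if now - t <= self.timeout}

    def pick(self, now=None):
        # redirect the next client to the least-loaded live server
        live = self.alive(now)
        return min(live, key=live.get) if live else None
```

The point of the availability check is that `pick` never has to poll the servers on the request path; stale reporters simply age out of the live set.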

Relevance: 100.00%

Publisher:

Abstract:

Imaging flow cytometry is an emerging technology that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy. It allows high-throughput imaging of cells with good spatial resolution while they are in flow. This paper proposes a general framework for the processing/classification of cells imaged using an imaging flow cytometer. Each cell is localized by finding an accurate cell contour. Then, features reflecting cell size, circularity and complexity are extracted for classification using an SVM. Unlike conventional iterative, semi-automatic segmentation algorithms such as active contours, we propose a non-iterative, fully automatic graph-based cell localization. To evaluate the performance of the proposed framework, we successfully classified unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a custom-fabricated, cost-effective microfluidics-based imaging flow cytometer. The proposed system is a significant development towards building a cost-effective cell analysis platform that would facilitate affordable mass screening camps examining cellular morphology for disease diagnosis. Lay description: In this article, we propose a novel framework for processing the raw data generated using microfluidics-based imaging flow cytometers. Microfluidics microscopy, or microfluidics-based imaging flow cytometry (mIFC), is a recent microscopy paradigm that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy, allowing us to image cells while they are in flow. In comparison to conventional slide-based imaging systems, mIFC is a nascent technology enabling high-throughput imaging of cells and is yet to take the form of a clinical diagnostic tool. The proposed framework processes the raw data generated by mIFC systems.
The framework incorporates several steps: pre-processing the raw video frames to enhance the contents of the cell, localizing the cell by a novel, fully automatic, non-iterative graph-based algorithm, extracting different quantitative morphological parameters, and subsequently classifying the cells. To evaluate the performance of the proposed framework, we successfully classified the unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a cost-effective microfluidics-based imaging flow cytometer. The HL60, K562 and MOLT cell lines were obtained from the ATCC (American Type Culture Collection) and separately cultured in the lab; each culture therefore contains cells from its own category alone and thereby provides the ground truth. Each cell is localized by finding a closed cell contour: a directed, weighted graph is defined from the Canny edge images of the cell such that the closed contour lies along the shortest weighted path surrounding the centroid of the cell, from a starting point on a good curve segment to an immediate endpoint. Once the cell is localized, morphological features reflecting the size, shape and complexity of the cells are extracted and used to develop a support vector machine based classification system. We could classify the cell lines with good accuracy, and the results were quite consistent across different cross-validation experiments. We hope that imaging flow cytometers equipped with the proposed image-processing framework would enable cost-effective, automated and reliable disease screening in overloaded facilities that cannot afford to hire skilled personnel in large numbers. Such platforms could facilitate screening camps in low-income countries, thereby transforming current health care paradigms by enabling rapid, automated diagnosis for diseases like cancer.
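Once the closed contour is found, the size and circularity features can be computed directly from its vertices. The sketch below uses the shoelace formula and the standard circularity measure 4πA/P²; the paper's actual feature set may differ in detail.

```python
import math

def shape_features(contour):
    # contour: list of (x, y) vertices of a closed polygonal cell outline
    n = len(contour)
    area = 0.0
    perimeter = 0.0
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        area += x1 * y2 - x2 * y1              # shoelace formula
        perimeter += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    circularity = 4.0 * math.pi * area / perimeter ** 2  # 1.0 for a circle
    return {"area": area, "perimeter": perimeter, "circularity": circularity}
```

A round cell scores near 1.0 while ragged or elongated contours score lower, which is what makes circularity useful as an input to the SVM for separating cell lines.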

Relevance: 100.00%

Publisher:

Abstract:

This report presents an overview of the state of aquatic resources in the Philippines, their performance and importance in the Philippine economy, and explores the situation of poverty in the "aquatic resources sector." The report describes the policy environment that guides the actions of key actors in the sector. It also provides a general analysis of some trends related to the factors that keep the poor from participating in and benefiting from aquatic resource management, based on the perspectives of the authors. (PDF contains 135 pages)

Relevance: 100.00%

Publisher:

Abstract:

In this thesis, we develop an efficient collapse prediction model, the PFA (Peak Filtered Acceleration) model, for buildings subjected to different types of ground motions.

For the structural system, the PFA model covers modern steel and reinforced concrete moment-resisting frame buildings (and potentially reinforced concrete shear-wall buildings). For ground motions, the PFA model covers ramp-pulse-like ground motions, long-period ground motions, and short-period ground motions.

To predict whether a building will collapse in response to a given ground motion, we first extract long-period components from the ground motion using a Butterworth low-pass filter with a suggested order and cutoff frequency. The order depends on the type of ground motion, and the cutoff frequency depends on the building's natural frequency and ductility. We then compare the filtered acceleration time history with the capacity of the building. The capacity of the building is a constant for 2-dimensional buildings and a limit domain for 3-dimensional buildings. If the filtered acceleration exceeds the building's capacity, the building is predicted to collapse; otherwise, it is expected to survive the ground motion.
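A numpy-only sketch of this prediction step follows, using a zero-phase Butterworth-magnitude low-pass applied in the frequency domain as a stand-in for the time-domain filter. The order, cutoff, and capacity values are placeholders, not values from the thesis.

```python
import numpy as np

def peak_filtered_acceleration(acc, dt, fc, order=4):
    # low-pass the record with the Butterworth magnitude response
    # |H(f)| = 1 / sqrt(1 + (f/fc)^(2*order)), applied with zero phase,
    # then take the peak of the filtered time history
    f = np.fft.rfftfreq(len(acc), dt)
    H = 1.0 / np.sqrt(1.0 + (f / fc) ** (2 * order))
    filtered = np.fft.irfft(np.fft.rfft(acc) * H, n=len(acc))
    return np.max(np.abs(filtered))

def predict_collapse(acc, dt, fc, capacity, order=4):
    # collapse is predicted when the peak filtered acceleration
    # exceeds the building's (constant, 2-D) capacity
    return peak_filtered_acceleration(acc, dt, fc, order) > capacity
```

For a record made of a 0.5 Hz component plus a 20 Hz component, a 2 Hz cutoff leaves essentially only the long-period part, so the peak filtered acceleration reduces to the long-period amplitude, which is exactly the quantity compared against capacity.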

The parameters used in the PFA model, which include the fundamental period, global ductility, and lateral capacity, can be obtained either from numerical analysis or by interpolation based on the reference building system proposed in this thesis.

The PFA collapse prediction model greatly reduces computational complexity while achieving good accuracy. It is verified by FEM simulations of 13 frame building models and 150 ground motion records.

Based on the developed collapse prediction model, we propose to use PFA (Peak Filtered Acceleration) as a new ground motion intensity measure for collapse prediction. We compare PFA with traditional intensity measures PGA, PGV, PGD, and Sa in collapse prediction and find that PFA has the best performance among all the intensity measures.

We also provide a closed form of the PFA collapse prediction model in terms of a vector intensity measure (PGV, PGD) for practical collapse risk assessment.