931 results for Processing Graph
Abstract:
The inactivation kinetics of the enzymes polyphenol oxidase (PPO) and peroxidase (POD) were studied for the batch (discontinuous) microwave treatment of green coconut water. Inactivation of commercial PPO and POD added to sterile coconut water was also investigated. The complete time-temperature profiles of the experimental runs were used to determine the kinetic parameters D-value and z-value: PPO (D(92.20 °C) = 52 s and z = 17.6 °C); POD (D(92.92 °C) = 16 s and z = 11.5 °C); PPO in sterile coconut water (D(84.45 °C) = 43 s and z = 39.5 °C); and POD in sterile coconut water (D(86.54 °C) = 20 s and z = 19.3 °C). All data were well fitted by a first-order kinetic model. The enzymes naturally present in coconut water showed higher resistance than those added to the sterilized medium or to other simulated solutions reported in the literature. The thermal inactivation of PPO and POD during microwave processing of green coconut water was significantly faster than in conventional processes reported in the literature. (C) 2008 Elsevier Ltd. All rights reserved.
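For readers unfamiliar with the D-value/z-value parametrisation, the first-order model in this abstract can be sketched numerically. The snippet below is a minimal illustration (function names are my own, not from the paper), using the PPO parameters quoted above: one D-value of holding time reduces activity tenfold, and the z-value rescales D with temperature.

```python
def d_value(temp_c, d_ref_s, temp_ref_c, z_c):
    """D-value (seconds for a 10-fold activity drop) at temp_c,
    extrapolated from a reference D-value via the z-value."""
    return d_ref_s * 10 ** ((temp_ref_c - temp_c) / z_c)

def residual_activity(hold_s, temp_c, d_ref_s, temp_ref_c, z_c):
    """Fraction of enzyme activity left after hold_s seconds at temp_c,
    assuming first-order (log-linear) inactivation."""
    return 10 ** (-hold_s / d_value(temp_c, d_ref_s, temp_ref_c, z_c))

# PPO in green coconut water: D(92.20 °C) = 52 s, z = 17.6 °C
print(residual_activity(52, 92.20, 52, 92.20, 17.6))  # → 0.1 (one D-value)
```

Holding for one D-value leaves 10% of the initial activity; raising the temperature by one z-value shortens the D-value by the same factor of ten.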
Abstract:
The burning of organic residues and wastes in the furnaces of cement industries has been an attractive and lucrative approach to eliminating stocks of these pollutants. There is a potential risk of producing PAH in the workplaces of industries burning organic wastes, so highly sensitive analytical methods are needed for monitoring the air quality of these environments. An official method for the determination of PAH is based on liquid chromatography with fluorescence detection at fixed excitation and emission wavelengths. We demonstrate that a suitable choice of these wavelengths, changed during the chromatographic run, significantly improves the detectability of PAH in the atmosphere and in particulate matter collected in cement industries.
Abstract:
The Shwachman-Bodian-Diamond syndrome protein (SBDS) is a member of a highly conserved protein family of poorly understood function, with putative orthologues found in organisms ranging from Archaea, yeast and plants to vertebrate animals. The yeast orthologue of SBDS, Sdo1p, has previously been identified in association with the 60S ribosomal subunit and is proposed to participate in ribosomal recycling. Here we show that Sdo1p interacts with nucleolar rRNA processing factors and ribosomal proteins, indicating that it might bind the pre-60S complex and remain associated with it during processing and transport to the cytoplasm. Corroborating the protein interaction data, Sdo1p localizes to the nucleus and cytoplasm and co-immunoprecipitates precursors of the 60S and 40S subunits, as well as the mature rRNAs. Sdo1p also binds RNA directly, suggesting that it may associate with the ribosomal subunits through RNA interactions as well. Copyright (C) 2009 John Wiley & Sons, Ltd.
Abstract:
Low-density polyethylene was filled with cellulose fibres from sugar cane bagasse obtained by an organosolv/supercritical carbon dioxide pulping process. The fibres were also used after chemical modification with octadecanoyl and dodecanoyl chlorides. The morphology, thermal properties, mechanical properties in both the linear and nonlinear range, and water absorption behaviour of the ensuing composites were tested. The occurrence of the chemical modification was verified by X-ray photoelectron spectroscopy. The degree of polymerisation of the fibres and their intrinsic properties (zero tensile strength) were determined. It clearly appeared that the surface chemical modification of the cellulose fibres resulted in improved interfacial adhesion with the matrix and a higher dispersion level. However, the composites did not show improved mechanical performance compared to those with unmodified fibres. This surprising result was ascribed to the strong lowering of the degree of polymerisation of the cellulose fibres (as confirmed by the drastic decrease of their zero tensile strength) after chemical treatment, despite the mild conditions used. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
A student from the Data Processing program at the New York Trade School is shown working. Black and white photograph with some edge damage due to writing in black along the top.
Abstract:
Felice Gigante, a graduate of the New York Trade School Electronics program, works on a machine in his job as Data Processing Customer Engineer for the International Business Machines Corp. Original caption reads, "Felice Gigante - Electronices, International Business Machines Corp." Black and white photograph with caption glued to reverse.
Abstract:
This master's thesis describes the development of signal processing and pattern recognition methods for monitoring Parkinson's disease. A signal processing algorithm is developed and its output is passed into a pattern recognition algorithm. These algorithms are used to detect, predict and draw conclusions in the study of Parkinson's disease, giving insight into how the disease manifests in humans.
Abstract:
The problem of scheduling a parallel program, represented by a weighted directed acyclic graph (DAG), onto a set of homogeneous processors so as to minimize the completion time of the program has been extensively studied as an academic optimization problem arising when optimizing the execution time of a parallel algorithm on a parallel computer. In this paper, we propose an application of Ant Colony Optimization (ACO) to the multiprocessor scheduling problem (MPSP). In the MPSP, no preemption is allowed and each operation demands a setup time on the machines. The problem seeks a schedule that minimizes the total completion time. Since exact solution methods are not feasible for most instances of this kind, we rely on heuristics to find solutions. In this novel heuristic search approach to multiprocessor scheduling based on the ACO algorithm, a colony of agents cooperates to effectively explore the search space. A computational experiment is conducted on a suite of benchmark applications. Comparing the results of our algorithm to those of previous heuristic algorithms shows that the ACO algorithm exhibits competitive performance with a small error ratio.
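The objective such a heuristic optimizes can be made concrete. The sketch below is my own illustration, not the paper's implementation, and it ignores the setup times mentioned above: it evaluates the makespan of one candidate task order by greedy earliest-start assignment to homogeneous processors. A metaheuristic like ACO would search over such orders.

```python
def schedule_length(durations, deps, order, num_procs):
    """Makespan of dispatching tasks, in the given precedence-respecting
    order, to whichever processor lets each task start earliest.

    durations: {task: time}; deps: {task: [predecessor tasks]};
    order: a topological order of all tasks.
    """
    proc_free = [0.0] * num_procs      # time at which each processor is idle
    finish = {}                        # completion time per task
    for t in order:
        # a task may start only after all its predecessors have finished
        ready = max((finish[p] for p in deps.get(t, [])), default=0.0)
        i = min(range(num_procs), key=lambda k: max(proc_free[k], ready))
        start = max(proc_free[i], ready)
        finish[t] = start + durations[t]
        proc_free[i] = finish[t]
    return max(finish.values())

durations = {"a": 2, "b": 3, "c": 2, "d": 1}
deps = {"c": ["a"], "d": ["a", "b"]}
print(schedule_length(durations, deps, ["a", "b", "c", "d"], 2))  # → 4.0
```

With two processors the four tasks overlap and finish at time 4; on a single processor the same order takes 8.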
Abstract:
In order to achieve high performance, we need an efficient schedule of a parallel program onto the processors of a multiprocessor system that minimizes the entire execution time. This multiprocessor scheduling problem can be stated as finding a schedule for a general task graph to be executed on a multiprocessor system such that the schedule length is minimized [10]. The problem is known to be NP-hard. In multiprocessor task scheduling, we have a number of CPUs on which a number of tasks are to be scheduled so that the program's execution time is minimized. According to [10], the task scheduling problem is a key factor for a parallel multiprocessor system to achieve better performance. A task can be partitioned into a group of subtasks and represented as a DAG (Directed Acyclic Graph), so the problem can be stated as finding a schedule for a DAG to be executed on a parallel multiprocessor system such that the schedule length is minimized. This helps to reduce processing time and increase processor utilization. The aim of this thesis work is to check and compare the results obtained by the Bee Colony algorithm against the best known results in the multiprocessor task scheduling domain.
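Any list-scheduling or swarm heuristic over such a task DAG needs precedence-feasible orders to start from. A standard way to obtain one is Kahn's topological sort; the sketch below is a generic illustration (names are my own), not part of the thesis.

```python
from collections import deque

def topological_order(deps):
    """Kahn's algorithm: return one precedence-respecting execution
    order for a task DAG given as {task: [predecessor tasks]}."""
    indeg = {t: len(p) for t, p in deps.items()}
    succ = {t: [] for t in deps}
    for t, preds in deps.items():
        for p in preds:
            succ[p].append(t)
    ready = deque(t for t, d in indeg.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for s in succ[t]:               # releasing t may make successors ready
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    if len(order) != len(deps):
        raise ValueError("the task graph contains a cycle")
    return order

print(topological_order({"a": [], "b": [], "c": ["a"], "d": ["a", "b"]}))
```

Every task appears after all of its predecessors, so the order can be fed directly to a list scheduler.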
Abstract:
Although the traveling salesman problem looks very simple, it is an important combinatorial problem. In this thesis I have tried to find the shortest tour in which each city is visited exactly once before returning to the starting city, using a multilevel graph partitioning approach. The traveling salesman problem is itself difficult, as it belongs to the class of NP-complete problems, and multilevel graph partitioning belongs to that class as well. I used the k-means partitioning algorithm to divide the problem into multiple partitions, solved each partition separately, and then used those solutions to improve the overall tour by applying the Lin-Kernighan algorithm. This yielded optimal solutions, which shows that solving the traveling salesman problem through a graph partitioning scheme works well for this NP-complete problem and allows this intractable problem to be solved within a few minutes.
Keywords: Graph Partitioning Scheme, Traveling Salesman Problem.
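The "construct, then improve the tour" step can be illustrated compactly. The sketch below is my own, with 2-opt standing in for the far more elaborate Lin-Kernighan moves and the k-means coarsening omitted: it builds a nearest-neighbour tour and then improves it by segment reversals.

```python
import math

def tour_length(cities, tour):
    """Total length of a closed tour over 2-D city coordinates."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(cities, start=0):
    """Greedy construction: always visit the closest unvisited city."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: math.dist(cities[tour[-1]], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(cities, tour):
    """Repeatedly reverse tour segments while that shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cities, cand) < tour_length(cities, tour) - 1e-9:
                    tour, improved = cand, True
    return tour

cities = [(0, 0), (1, 1), (0, 1), (1, 0)]        # corners of a unit square
best = two_opt(cities, nearest_neighbour(cities))
print(tour_length(cities, best))                 # → 4.0, the optimal square tour
```

In a multilevel scheme, each cluster produced by the partitioner would be toured this way before the partial tours are stitched together and refined globally.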
Abstract:
The problem of finding the best facility locations requires a complete and accurate road network together with the corresponding population data for a specific area. However, the data available in road network databases usually do not fit this usage directly. In this paper we propose a procedure for converting a road network database into a road graph that can be used in localization problems. The road network data come from the National Road Database in Sweden. The derived graph is cleaned and reduced to a level suitable for localization problems, and the population points are processed to match the graph. The reduction of the graph is done while maintaining most of the accuracy of distance measures in the network.
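A common reduction of this kind is to collapse chains of degree-2 shape points into single weighted edges, which shrinks the graph while leaving network distances intact. The sketch below is a generic illustration of that idea, not the paper's actual procedure; the `keep` set marks nodes, such as junctions or population points, that must survive.

```python
def contract_degree_two(adj, keep=frozenset()):
    """Collapse each degree-2 node into a single weighted edge joining its
    two neighbours, so shortest-path distances are preserved.

    adj: undirected graph as {node: {neighbour: weight}};
    keep: nodes that must never be contracted.
    """
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v in keep or len(adj[v]) != 2:
                continue
            a, b = adj[v]
            if v in (a, b):                       # skip self-loops
                continue
            w = adj[v][a] + adj[v][b]
            if b not in adj[a] or w < adj[a][b]:  # keep the shorter link
                adj[a][b] = adj[b][a] = w
            del adj[a][v], adj[b][v], adj[v]
            changed = True
    return adj

# chain a-x-y-b (weights 1, 2, 3) plus spur a-c; x and y are mere shape points
roads = {
    "a": {"x": 1, "c": 5},
    "x": {"a": 1, "y": 2},
    "y": {"x": 2, "b": 3},
    "b": {"y": 3},
    "c": {"a": 5},
}
contract_degree_two(roads, keep={"a", "b", "c"})
print(roads)  # a-b becomes a single edge of weight 6
```

The contracted graph has three nodes instead of five, yet the network distance from a to b is still 6.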
Abstract:
GPS technology is nowadays embedded in portable, low-cost electronic devices to track the movements of mobile objects. This has greatly impacted the transportation field by creating a novel and rich source of traffic data on the road network. Although GPS devices promise to overcome problems such as underreporting, respondent fatigue, inaccuracies and other human errors in data collection, the technology is still relatively new and raises many issues for potential users. These issues tend to revolve around reliability, data processing and the related applications. This thesis studies GPS tracking from the methodological, technical and practical aspects. It first evaluates the reliability of GPS-based traffic data using data from an experiment covering three traffic modes (car, bike and bus) traveling along the road network. It then outlines the general procedure for processing GPS tracking data and discusses related issues uncovered by using real-world GPS tracking data from 316 cars. Thirdly, it investigates the influence of road network density on finding optimal locations for enhancing travel efficiency and decreasing travel cost. The results show that the geographical positioning is reliable: velocity is slightly underestimated, whereas altitude measurements are unreliable. Post-processing techniques with auxiliary information are found to be necessary and important for resolving the inaccuracy of GPS data. The density of the road network influences the finding of optimal locations; the influence stabilizes at a certain level and does not deteriorate when the node density is higher.
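As an example of the kind of post-processing involved, speed between consecutive GPS fixes can be recomputed from positions and timestamps with the haversine great-circle distance, e.g. to cross-check logged velocities. A minimal sketch, with illustrative names and a spherical-Earth assumption:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two fixes (spherical Earth)."""
    r = 6371000.0                       # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def speeds(fixes):
    """Speeds (m/s) between consecutive (timestamp_s, lat, lon) fixes."""
    return [haversine_m(la0, lo0, la1, lo1) / (t1 - t0)
            for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:])]

# about 111 m of longitude at the equator covered in 100 s → roughly 1.11 m/s
print(speeds([(0, 0.0, 0.0), (100, 0.0, 0.001)]))
```

Comparing such position-derived speeds with the receiver's Doppler-based velocity field is one simple consistency check on a track.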
Abstract:
The advancement of GPS technology enables GPS devices to be used not only as orientation and navigation tools, but also to track travelled routes. GPS tracking data provide essential information for a broad range of urban planning applications such as transportation routing and planning, traffic management and environmental control. This paper describes the processing of data collected by tracking the cars of 316 volunteers over a seven-week period. Detailed information is extracted, and the processed data are connected to the underlying road network by means of maps. Geographical maps are applied to check how the car movements match the road network; the maps capture the complexity of the car movements in the urban area. The results show that 90% of the trips on the plane match the road network within a tolerance.
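A simple planar version of such a "within a tolerance" check can be sketched as follows (my own illustration, not the paper's map-matching procedure): a GPS point matches the network if it lies within the tolerance of at least one road segment.

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment a-b (planar coords)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:             # degenerate zero-length segment
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))           # clamp the projection onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def match_fraction(points, segments, tol):
    """Fraction of points lying within tol of at least one road segment."""
    hits = sum(any(point_segment_dist(p, a, b) <= tol for a, b in segments)
               for p in points)
    return hits / len(points)

roads = [((0.0, 0.0), (10.0, 0.0))]     # one straight road segment
pts = [(5.0, 1.0), (5.0, 3.0)]          # one point near it, one far away
print(match_fraction(pts, roads, 2.0))  # → 0.5
```

Real map matching additionally uses heading, speed and trip continuity, but a distance threshold like this is the core of the "matches within a tolerance" statistic.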