861 results for Ecosystem management -- Queensland -- Johnstone (Shire) -- Data processing.
Abstract:
The materials management function is always a major concern to the management of any industrial organization, since high inventory levels and an inefficient procurement process significantly affect profitability. Problems multiply in the current dynamic business environment in many countries. Hence, existing materials planning and procurement processes and inventory management systems require review. This article shows a radical improvement in the materials management function of an Indian petroleum refinery through business process re-engineering (BPR): analyzing the current process, identifying key issues, deriving paradigm shifts and developing re-engineered processes through customer value analysis. BPR has been carried out on the existing processes of "materials planning and procurement" and "warehousing and surplus disposal."
Abstract:
Conventional project management techniques are not always sufficient to ensure that schedule, cost and quality goals are met on large-scale construction projects. Such projects require complex planning, design and implementation processes. The main reasons for project failure in India's hydrocarbon processing industry are changes in scope and design, altered government policies and regulations, unforeseen inflation, and underestimation and/or improper estimation. Projects exposed to such an uncertain environment can be effectively managed by applying risk management throughout the project life cycle.
Abstract:
Experimental data are often realizations of a process that is fully determined by some unknown function and distorted by noise. The treatment and analysis of such data are substantially facilitated if the data are represented as an analytical expression. This article presents an algorithm for processing experimental data in this way, together with an example of its use for the spectrographic analysis of oncologic blood preparations.
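The abstract does not reproduce the algorithm itself; as a minimal illustration of representing noisy measurements by an analytical expression, a least-squares polynomial fit in Python (a hypothetical stand-in for the article's method) might look like this:

```python
import numpy as np

# Noisy samples of an unknown smooth process (synthetic stand-in for
# the article's spectrographic measurements).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y_true = np.sin(2 * np.pi * x)                   # the "unknown" underlying function
y = y_true + rng.normal(scale=0.1, size=x.size)  # distorted by noise

# Represent the data as an analytical expression: a least-squares
# polynomial approximation of modest degree.
coeffs = np.polyfit(x, y, deg=5)
model = np.poly1d(coeffs)

# The analytical form can now be evaluated, differentiated, etc.
residual = np.sqrt(np.mean((model(x) - y) ** 2))
print(f"RMS residual of the analytical approximation: {residual:.3f}")
```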
Abstract:
The authors analyse some of the research outcomes achieved during the implementation of the EC GUIDE research project "Creating a European Identity Management Architecture for eGovernment", as well as their personal experience. The project's goals and achievements are, however, considered in a broader context. The key role of Identity in the Information Society is emphasised, and research and development in this field is still in its initial phase. The scope of research related to Identity, including research on Identity Management and the interoperability of Identity Management Systems, is expected to be further extended. The authors analyse the abovementioned issues in the context established by the EC European Interoperability Framework (EIF), a reference document on interoperability for the Interoperable Delivery of European eGovernment Services to Public Administrations, Business and Citizens (IDABC) Work Programme. This programme aims at supporting the pan-European delivery of electronic government services.
Abstract:
During the MEMORIAL project, an international consortium developed a software solution called the Digital Document Workbench (DDW). It provides a set of tools to support the digitisation of documents, from scanning up to the retrievable presentation of the content, with attention focused on machine-typed archival documents. One of its important features is the evaluation of quality at each step of the process. The workbench consists of automatic parts as well as parts that require human activity. A measurable improvement of 20% shows that the approach is successful.
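The abstract does not describe the per-step quality evaluation in detail; a minimal sketch of a staged digitisation pipeline in which each step reports a quality score, so that low-quality results can be routed to manual review, might look like this (the stage names and thresholds are assumptions, not the DDW design):

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class StageResult:
    name: str
    output: object
    quality: float  # 0.0-1.0 score reported by the stage

def run_pipeline(document, stages: List[Tuple[str, Callable]]) -> List[StageResult]:
    """Run each digitisation stage in order, recording a quality score
    per step so low-quality results can be flagged for human activity."""
    results = []
    data = document
    for name, stage in stages:
        data, quality = stage(data)
        results.append(StageResult(name, data, quality))
    return results

# Hypothetical stages standing in for scanning, OCR and indexing.
stages = [
    ("scan", lambda d: (d + ":scanned", 0.95)),
    ("ocr", lambda d: (d + ":ocr", 0.80)),
    ("index", lambda d: (d + ":indexed", 0.99)),
]
for r in run_pipeline("doc-001", stages):
    flag = "OK" if r.quality >= 0.85 else "needs manual review"
    print(f"{r.name}: quality={r.quality:.2f} ({flag})")
```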
Abstract:
This chapter provides the theoretical foundation and background of the Data Envelopment Analysis (DEA) method, some variants of basic DEA models, and applications to various sectors. Some illustrative examples and helpful resources on DEA, including DEA software packages, are also presented. DEA is useful for measuring the relative efficiency of a variety of institutions and has its own merits and limitations. The chapter concludes that DEA results should be interpreted with much caution, to avoid giving wrong signals and providing inappropriate recommendations.
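The chapter's own examples are not reproduced here; as an illustration of one standard DEA variant, the input-oriented CCR model in multiplier form can be solved as a linear programme, for example with SciPy (a sketch, not the chapter's software):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).

    X: (n, m) inputs, Y: (n, s) outputs for n decision-making units.
    Maximise u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all j.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])          # minimise -u.y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun  # efficiency score in (0, 1]

# Toy data: 4 DMUs, 2 inputs, 1 output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

Units on the efficient frontier score 1.0; inefficient units score below 1.0 relative to the best observed practice.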
Abstract:
This paper highlights the challenges of satellite monitoring system integration, in particular integration based on Grid platforms, and reviews possible solutions to these problems. We describe integration issues at two levels: the data integration level and the task management level (job submission, in Grid terms). We show an example of the described technologies applied to the integration of the monitoring systems of Ukraine (National Space Agency of Ukraine, NASU) and Russia (Space Research Institute RAS, IKI RAN). Another example refers to the development of an InterGrid infrastructure that integrates several regional and national Grid systems: the Ukrainian Academician Grid (with its satellite data processing Grid segment) and the RSGS Grid (Chinese Academy of Sciences).
Abstract:
The purpose of this paper is to investigate the technological development of electronic inventory solutions from the perspective of patent analysis. We first applied the international patent classification to identify the top categories of data processing technologies and their corresponding top patenting countries. We then identified the core technologies by calculating the patent citation strength and a standard deviation criterion for each patent. To eliminate core innovations having no reference relationships with the other core patents, the relevance strengths between core technologies were also evaluated. Our findings provide market intelligence not only for the research and development community, but also for decision making about advanced inventory solutions.
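The paper's exact citation-strength formula is not given in the abstract; a minimal sketch, under the assumption that core patents are those whose citation counts exceed the mean by some multiple of the standard deviation, might look like this:

```python
import numpy as np

def core_patents(citations: dict, k: float = 1.0) -> list:
    """Flag patents whose citation strength exceeds mean + k * std.

    `citations` maps patent IDs to forward-citation counts; this
    threshold rule is an assumption, standing in for the paper's
    citation-strength and standard-deviation criterion.
    """
    counts = np.array(list(citations.values()), dtype=float)
    threshold = counts.mean() + k * counts.std()
    return [pid for pid, c in citations.items() if c > threshold]

# Toy forward-citation counts for hypothetical inventory-related patents.
citations = {"US001": 3, "US002": 45, "US003": 7, "US004": 52, "US005": 5}
print(core_patents(citations))  # patents well above the citation mean
```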
Abstract:
The development of new all-optical technologies for data processing and signal manipulation is a field of growing importance with a strong potential for numerous applications in diverse areas of modern science. Nonlinear phenomena occurring in optical fibres have many attractive features and great, but not yet fully explored, potential in signal processing. Here, we review recent progress on the use of fibre nonlinearities for the generation and shaping of optical pulses and on the applications of advanced pulse shapes in all-optical signal processing. Amongst other topics, we will discuss ultrahigh repetition rate pulse sources, the generation of parabolic shaped pulses in active and passive fibres, the generation of pulses with triangular temporal profiles, and coherent supercontinuum sources. The signal processing applications will span optical regeneration, linear distortion compensation, optical decision at the receiver in optical communication systems, spectral and temporal signal doubling, and frequency conversion. © Copyright 2012 Sonia Boscolo and Christophe Finot.
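For orientation (this is standard background, not material from the paper itself), pulse propagation in an active fibre is commonly modelled by the nonlinear Schrödinger equation with gain, whose normal-dispersion regime underlies the parabolic pulse generation mentioned above:

```latex
% Propagation of the slowly varying envelope A(z,T) in a fibre with
% group-velocity dispersion beta_2, Kerr nonlinearity gamma and gain g:
\[
  \frac{\partial A}{\partial z}
  = -\,\frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial T^2}
  + i\gamma \lvert A\rvert^2 A
  + \frac{g}{2}\,A .
\]
% In the normal-dispersion regime (beta_2 > 0) with constant gain,
% input pulses asymptotically evolve into parabolic self-similar
% pulses ("similaritons"), the basis of the parabolic pulse sources
% reviewed in the paper.
```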
Abstract:
This chapter provides information on the use of the Performance Improvement Management DEA software (PIM-DEA). This advanced DEA software enables users to make the best possible analysis of their data, using the latest theoretical developments in Data Envelopment Analysis (DEA). PIM-DEA gives users full capacity to assess efficiency and productivity, set targets, identify benchmarks, and much more, allowing them to truly manage the performance of organizational units. PIM-DEA is easy to use and powerful, has an extensive range of the most up-to-date DEA models, and can handle large data sets.
Abstract:
The approach of all ophthalmologists, diabetologists and general practitioners seeing patients with diabetic retinopathy should be that good control of blood glucose, blood pressure and plasma lipids are all essential components of modern medical management. The more recent data on the use of fenofibrate in the Fenofibrate Intervention and Event Lowering in Diabetes (FIELD) and the Action to Control Cardiovascular Risk in Diabetes (ACCORD) Eye studies are reviewed. In FIELD, fenofibrate (200 mg/day) reduced the requirement for laser therapy and prevented disease progression in patients with pre-existing diabetic retinopathy. In ACCORD Eye, fenofibrate (160 mg daily) with simvastatin resulted in a 40% reduction in the odds of retinopathy progressing over 4 years, compared with simvastatin alone. This occurred with an increase in HDL-cholesterol and a decrease in the serum triglyceride level in the fenofibrate group, as compared with the placebo group, and was independent of glycaemic control. We believe fenofibrate is effective in preventing progression of established diabetic retinopathy in type 2 diabetes and should be considered for patients with pre-proliferative diabetic retinopathy and/or diabetic maculopathy, particularly those with macular oedema requiring laser therapy. © 2011 Macmillan Publishers Limited. All rights reserved.
Abstract:
The concept of cloud computing entered public awareness a few years ago and now occupies an ever larger place in both the professional literature and IT practice. This new IT technology is leading to the standardization and spread of ERP systems connected to cloud computing services. The cloud has a profound impact on business information systems, especially on ERP systems. In this paper, the authors provide a literature overview of the current situation of cloud computing and of end-user requirements for data processing systems operating in the cloud (ERP systems in particular), as well as initial application experiences, the majority of the investigated cases being drawn from Germany. The new selection objectives and criteria for ERP systems, which have come into existence because of the special possibilities of the cloud environment, are discussed separately.
Abstract:
The microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data. Thus, effective data processing and analysis are critical for making reliable inferences from the data. The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study on different classification tools was performed, and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. The classifiers based on Random Forests, neural network ensembles, and K-nearest neighbor (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior. The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (the Fisher inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated in this dissertation. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species. As a result, improved sets of cell cycle-regulated genes were identified. The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
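Two of the pooling techniques named in the abstract have standard closed forms; a minimal sketch of Fisher's inverse chi-square method and the (optionally Liptak-weighted) Stouffer Z-transform method, applied to per-experiment p-values, might look like this:

```python
import numpy as np
from scipy import stats

def fisher_pool(pvals):
    """Fisher's inverse chi-square method: -2 * sum(ln p) ~ chi2(2k)."""
    p = np.asarray(pvals, dtype=float)
    statistic = -2.0 * np.log(p).sum()
    return stats.chi2.sf(statistic, df=2 * p.size)

def stouffer_pool(pvals, weights=None):
    """Stouffer's Z-transform method (the Liptak-weighted variant when
    `weights` is given): combine z-scores, convert back to a p-value."""
    p = np.asarray(pvals, dtype=float)
    w = np.ones_like(p) if weights is None else np.asarray(weights, float)
    z = stats.norm.isf(p)                      # one-sided z-scores
    combined = (w * z).sum() / np.sqrt((w ** 2).sum())
    return stats.norm.sf(combined)

# Toy per-experiment p-values for one gene across three independent studies.
pvals = [0.04, 0.10, 0.02]
print(f"Fisher:   {fisher_pool(pvals):.4f}")
print(f"Stouffer: {stouffer_pool(pvals):.4f}")
```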
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This innovative, fully operational web application allows users to upload and retrieve information through a unique human-computer graphical interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and Personal Home Page (PHP) scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI) data. A unique aspect of the approach is that all pertinent clinical information about patients is synthesized, with input from clinical experts, into four different forms: Clinical, fMRI scoring, Image information, and Neuropsychological data entry forms. A first contribution of this dissertation was the proposal of an integrated processing platform that is site and scanner independent, in order to uniformly process the varied fMRI datasets and to generate comparative brain activation patterns. The data collection from the consortium complied with the IRB requirements and provides all the safeguards for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. The Lateralization Index (LI) of healthy control (HC) subjects was evaluated in contrast to that of localization-related epilepsy (LRE) subjects. Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC = 0%, LRE = 18%); (b) right lateralization (HC = 2%, LRE = 10%); (c) bilateral (HC = 20%, LRE = 15%); (d) left lateralization (HC = 42%, LRE = 26%); (e) strong left lateralization (HC = 36%, LRE = 31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of these brain activations. The intent was not to seek the highest output measures, given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
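The abstract does not give the LI formula; the conventional definition used in fMRI language studies compares left- and right-hemisphere activation (the cut-off values below are illustrative assumptions, not the dissertation's):

```latex
% Conventional lateralization index from left/right-hemisphere
% activation (e.g. counts of supra-threshold voxels L and R):
\[
  \mathrm{LI} = \frac{L - R}{L + R}, \qquad -1 \le \mathrm{LI} \le 1,
\]
% with LI > 0 indicating left lateralization and LI < 0 right
% lateralization; cut-offs such as |LI| > 0.2 are commonly used to
% separate bilateral from lateralized activation.
```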
Abstract:
Accurately assessing the extent of myocardial tissue injury induced by myocardial infarction (MI) is critical to the planning and optimization of MI patient management. With this in mind, this study investigated the feasibility of using combined fluorescence and diffuse reflectance spectroscopy to characterize a myocardial infarct at different stages of its development. An animal study was conducted using twenty male Sprague-Dawley rats with MI. In vivo fluorescence spectra at 337 nm excitation and diffuse reflectance between 400 nm and 900 nm were measured from the heart using a portable fiber-optic spectroscopic system. Spectral acquisition was performed on (1) the normal heart region; (2) the region immediately surrounding the infarct; and (3) the infarcted region, at one, two, three and four weeks into MI development. The spectral data were divided into six subgroups according to the histopathological features associated with various degrees/severities of myocardial tissue injury as well as various stages of post-infarction myocardial tissue remodeling. Various data processing and analysis techniques were employed to recognize the representative spectral features corresponding to the histopathological features associated with myocardial infarction. The identified spectral features were used in discriminant analysis to further evaluate their effectiveness in classifying tissue injuries induced by MI. In this study, it was observed that MI induced significant alterations (p < 0.05) in the diffuse reflectance spectra, especially between 450 nm and 600 nm, from myocardial tissue within the infarcted and surrounding regions. In addition, MI induced a significant elevation in fluorescence intensities at 400 and 460 nm from myocardial tissue in the same regions. The extent of these spectral alterations was related to the duration of the infarction. Using the spectral features identified, an effective tissue injury classification algorithm was developed that produced a satisfactory overall classification result (87.8%). The findings of this research support the concept that optical spectroscopy is a useful tool to non-invasively determine the in vivo pathophysiological features of a myocardial infarct and its surrounding tissue, thereby providing valuable real-time feedback to surgeons during various surgical interventions for MI.
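The exact spectral features and classifier settings are not given in the abstract; a minimal sketch of discriminant-analysis classification of spectral features, using scikit-learn's linear discriminant analysis as a stand-in and entirely synthetic data, might look like this:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for spectral features (e.g. reflectance in the
# 450-600 nm band, fluorescence at 400 and 460 nm) per measurement site.
rng = np.random.default_rng(0)
n_per_class = 40
normal = rng.normal([1.0, 0.2, 0.2], 0.1, size=(n_per_class, 3))
infarct = rng.normal([0.6, 0.5, 0.6], 0.1, size=(n_per_class, 3))
X = np.vstack([normal, infarct])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0=normal, 1=infarcted

# Discriminant analysis of the spectral features, scored by
# cross-validation (the study reports 87.8% overall accuracy;
# these toy numbers are illustrative only).
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.1%}")
```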