970 results for Data Warehouse, Decision Rule, Quality Assessment


Relevance: 100.00%

Abstract:

The proposal to work on this final project came after several discussions held with Dr. Elzbieta Malinowski Gadja, who in 2008 published the book entitled Advanced Data Warehouse Design: From Conventional to Spatial and Temporal Applications (Data-Centric Systems and Applications). The project was carried out under the technical supervision of Dr. Malinowski, and the direct beneficiary was the University of Costa Rica (UCR), where Dr. Malinowski is a professor at the Department of Computer Science and Informatics. The purpose of this project was twofold: first, to translate chapter III of said book with the intention of generating educational material for the use of the UCR and, second, to venture into the field of technical translation related to data warehousing. For the first component, the goal was to generate a final product that would eventually serve as an educational tool for the post-graduate courses of the UCR. For the second component, this project allowed me to acquire new skills and put into practice techniques that have helped me not only to perform better in my current job as an Assistant Translator at the Inter-American Development Bank (IDB), but also to use them in similar projects. The process was lengthy and required thorough research and constant communication with the author. The investigation focused on the search for terms and definitions to prepare the glossary, which was the basis for starting the translation project. The translation process itself was carried out in phases, so that comments and corrections by the author could be taken into account in subsequent stages. Later, based on the glossary and the translated text, the illustrations that had been created in the Visio software were translated. In addition to the technical revision by the author, Professor Carme Mangiron was in charge of revising the non-technical text. The result was a high-quality document that is currently used as reference and study material by the Department of Computer Science and Informatics of the University of Costa Rica.

Relevance: 100.00%

Abstract:

Quality control (QuaCo) in urology is mandatory to standardize or even increase the level of care. While QuaCo is undertaken at every step in the clinical pathway, it should focus on the patient's comorbidities and on the urologist and his or her complication rate. As a result of political and economic pressures, comparisons of QuaCo and outcomes between urologists and institutions are nowadays often performed. However, careful interpretation of these comparisons is mandatory to avoid potential discrimination. Indeed, the reader has to make sure that patient groups and surgical techniques are comparable, that definitions of complications are similar, that the classification of complications is standardized, and finally that the methodology for collecting data is irreproachable.

Relevance: 100.00%

Abstract:

MRI has evolved into an important diagnostic technique in medical imaging. However, the reliability of the derived diagnosis can be degraded by artifacts, which challenge both radiologists and automatic computer-aided diagnosis. This work proposes a fully automatic method for measuring the image quality of three-dimensional (3D) structural MRI. Quality measures are derived by analyzing the air background of magnitude images and are capable of detecting image degradation from several sources, including bulk motion, residual magnetization from incomplete spoiling, blurring, and ghosting. The method has been validated on 749 3D T1-weighted 1.5T and 3T head scans acquired at 36 Alzheimer's Disease Neuroimaging Initiative (ADNI) study sites operating with various software and hardware combinations. Results are compared against qualitative grades assigned by the ADNI quality control center (taken as the reference standard). The derived quality indices are independent of the MRI system used and agree with the reference standard quality ratings with high sensitivity and specificity (>85%). The proposed procedures for quality assessment could be of great value for both research and routine clinical imaging, as they can greatly improve workflow through their ability to rule out the need for a repeat scan while the patient is still in the magnet bore.
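
As a rough illustration of the air-background idea, the hedged Python sketch below (the function name, border width, and threshold are my own assumptions, not the published algorithm) estimates a noise floor from the border of a magnitude slice under a Rayleigh-noise assumption and reports the fraction of background pixels that exceed it; that fraction rises in the presence of ghosting or motion artifacts.

```python
# A minimal sketch, assuming the outer border of each magnitude slice is air.
# In pure noise the magnitude background is Rayleigh-distributed, so pixels
# far above the Rayleigh noise floor indicate structured artifacts.
import numpy as np

def background_artifact_index(img: np.ndarray, border: int = 8,
                              k: float = 4.0) -> float:
    """Fraction of border (air) pixels exceeding k times the noise floor."""
    mask = np.zeros(img.shape, dtype=bool)
    mask[:border, :] = mask[-border:, :] = True
    mask[:, :border] = mask[:, -border:] = True
    air = img[mask].astype(float)
    sigma = air.mean() / np.sqrt(np.pi / 2.0)  # Rayleigh: mean = sigma*sqrt(pi/2)
    return float(np.mean(air > k * sigma))

# Synthetic magnitude slice: complex Gaussian noise plus a bright central object
rng = np.random.default_rng(0)
noise = rng.normal(0, 5, (128, 128)) + 1j * rng.normal(0, 5, (128, 128))
slice_ = np.abs(noise)
slice_[40:90, 40:90] += 300.0   # the imaged object (not covered by the mask)
print(f"clean scan index:   {background_artifact_index(slice_):.4f}")
slice_[0:4, :] += 40.0          # simulated ghosting band in the air region
print(f"ghosted scan index: {background_artifact_index(slice_):.4f}")
```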

Relevance: 100.00%

Abstract:

The aim of this thesis was to determine whether the quality of the profitability reporting system implemented for the organization that commissioned the thesis is sufficient in the opinion of its users. The profitability reporting is implemented with data warehouse technology. The thesis also aimed to define what software quality means and how it can be assessed. The study used a qualitative research method. The material for the quality assessment was collected by interviewing seventeen active users of the profitability reporting system. In this thesis, software quality means the software's ability to meet or exceed its users' reasonable wishes and expectations. Quality was assessed using the six quality characteristics defined in the ISO/IEC 9126 standard, which describe software quality with minimal overlap. In addition, the assessment drew on an informative annex that is not part of the standard itself but elaborates on the quality characteristics presented in ISO/IEC 9126. As a result of the study, it can be concluded that, according to its users, the profitability reporting is of sufficient quality, since it provides easy-to-use, correctly formatted reports with sufficiently good response times for the users' needs. From this effective utilization it can be inferred that the construction of the data warehouse was successful. The study also brought up a large number of development and improvement ideas, which will serve as one aid for future development work.

Relevance: 100.00%

Abstract:

Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to human biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. The ICI/EQUAS exercises used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS exercise proved to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
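
For readers unfamiliar with the statistic quoted above, the short Python sketch below computes the relative standard deviation of mercury values reported by several laboratories for one sample; the numbers are invented for illustration and are not DEMOCOPHES data.

```python
# A minimal sketch of the interlaboratory comparison statistic (RSD):
# 100 * sample standard deviation / mean of the values reported by the labs.
import numpy as np

def relative_std(values: np.ndarray) -> float:
    """Relative standard deviation in percent."""
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical results reported by seven laboratories for one hair sample
reported = np.array([0.62, 0.58, 0.66, 0.61, 0.59, 0.70, 0.64])
print(f"RSD = {relative_std(reported):.2f}%")
```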

Relevance: 100.00%

Abstract:

INTRODUCTION: This article is part of a research study on the organization of primary health care (PHC) for mental health in two of Quebec's remote regions. It introduces a methodological approach based on information found in health records for assessing the quality of PHC offered to people suffering from depression or anxiety disorders. METHODS: Quality indicators were identified from evidence, and case studies were reconstructed using data collected in health records over a 2-year observation period. Data collection was developed using a three-step iterative process: (1) feasibility analysis, (2) development of a data collection tool, and (3) application of the data collection method. The adaptation of quality-of-care indicators to remote regions was appraised according to their relevance, measurability and construct validity in this context. RESULTS: As a result of this process, 18 quality indicators were shown to be relevant, measurable and valid for establishing a critical quality appraisal of four recommended dimensions of PHC clinical processes: recognition, assessment, treatment and follow-up. CONCLUSIONS: The use of health records to assess the quality of PHC for mental health in remote regions is not only of practical interest; the rigorous and meticulous methodological approach developed in this study also has scientific value. From the perspective of stakeholders in the PHC system of care in remote areas, the quality indicators are credible and offer potential for transferability to other contexts. This study provides information that can help identify gaps in care and implement solutions adapted to the context.

Relevance: 100.00%

Abstract:

Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations that are measured with a traversing scanner following a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine-direction (MD) and cross-direction (CD) variations, because the QCS operates separately in MD and CD. Traditionally this is done simply by taking one scan of the zigzag path as the CD profile and its mean value as one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To get to the frequency domain, the Fourier transform is utilized. The frequency-domain representation, that is, the Fourier components, is then used as the state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model. The observations here refer to the quality measurements and the model to the Fourier frequency components. By implementing the two-dimensional Fourier transform within the Kalman filter, we obtain an advanced tool for the separation of the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and this tool is applied to model the dataset. As a result, it is clear that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results are based on a very short sample of a paper roll, the method shows great potential for later use as part of the quality control system.
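
The following Python sketch illustrates the idea under simplifying assumptions of my own (a static CD profile represented by a handful of Fourier harmonics plus a random-walk MD level, rather than the thesis's full two-dimensional formulation): the Fourier coefficients form the Kalman state, and each zigzag scanner sample updates them through an observation row that evaluates the basis at the current CD position. The names (`h_row`, `K`) and noise settings are assumptions for the sketch.

```python
# A minimal sketch: Fourier coefficients of the CD profile plus an MD level
# as the Kalman state, assimilated from samples taken along a zigzag path.
import numpy as np

K = 4                                   # number of CD Fourier harmonics
n = 1 + 2 * K                           # state: [MD level, cos/sin coeffs]
F = np.eye(n)                           # coefficients evolve as random walks
Q = np.diag([1e-3] + [1e-5] * (2 * K))  # process noise: MD level drifts faster
R = 0.05 ** 2                           # scanner measurement noise variance

def h_row(x: float) -> np.ndarray:
    """Observation row: Fourier basis evaluated at CD position x in [0, 1]."""
    ks = np.arange(1, K + 1)
    return np.concatenate(([1.0],
                           np.cos(2 * np.pi * ks * x),
                           np.sin(2 * np.pi * ks * x)))

# Simulated web: fixed CD profile plus a slow MD drift, sampled on a zigzag
rng = np.random.default_rng(1)
true_cd = lambda x: 0.3 * np.cos(2 * np.pi * 2 * x)     # 2nd cosine harmonic
xs = np.abs(((np.arange(2000) * 0.004) % 2.0) - 1.0)    # zigzag CD positions
md = 0.001 * np.arange(2000)                            # MD trend
ys = md + true_cd(xs) + rng.normal(0, 0.05, 2000)

m, P = np.zeros(n), np.eye(n)           # initial state and covariance
for x, y in zip(xs, ys):
    m, P = F @ m, F @ P @ F.T + Q       # predict
    H = h_row(x)[None, :]
    S = H @ P @ H.T + R                 # innovation variance
    Kg = P @ H.T / S                    # Kalman gain
    m = m + (Kg * (y - H @ m)).ravel()
    P = (np.eye(n) - Kg @ H) @ P        # update

print("estimated MD level:", m[0])       # should track the slow drift
print("estimated 2nd cos harmonic:", m[2])  # should approach 0.3
```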

Relevance: 100.00%

Abstract:

After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems is scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while at the same time maintaining a consistent, unique, and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense efforts are required to maintain data. When it comes to systems integration, ERPs are considered closed and expensive: data structures are complex, and the out-of-the-box integration options offered are not based on industry standards. Expensive and time-consuming projects are therefore undertaken in order to have the required data flowing according to the needs of business processes. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business process and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in achieving the 'single version of the truth' MDM mantra. Adding one central repository of master data might prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims at understanding the current literature on MDM and contrasting it with views from professionals. The data collected from interviews revealed details of the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM. The most difficult piece of master data to manage is the local part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates one MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with one possible tool to evaluate its product from a vendor-agnostic perspective.
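
As a toy illustration of one common MDM building block, the hedged Python sketch below merges duplicate customer records from different systems into a "golden record" using a simple survivorship rule (newest non-empty value wins); the record fields and the rule are my own assumptions, not a description of the evaluated product.

```python
# A minimal sketch of golden-record consolidation under a simple
# survivorship rule: for each attribute, keep the most recently
# updated non-empty value across the contributing source systems.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CustomerRecord:
    source: str                 # e.g. "CRM", "ERP", "DW" (hypothetical)
    name: Optional[str]
    vat_id: Optional[str]
    country: Optional[str]
    updated: date

def golden_record(records: list[CustomerRecord]) -> dict:
    """Merge duplicates: newest non-empty value wins per attribute."""
    newest_first = sorted(records, key=lambda r: r.updated, reverse=True)
    return {field: next((getattr(r, field) for r in newest_first
                         if getattr(r, field)), None)
            for field in ("name", "vat_id", "country")}

records = [
    CustomerRecord("CRM", "ACME Corp.", None, "NL", date(2012, 5, 1)),
    CustomerRecord("ERP", "ACME Corporation", "NL123456789B01", None,
                   date(2011, 9, 30)),
]
print(golden_record(records))
# {'name': 'ACME Corp.', 'vat_id': 'NL123456789B01', 'country': 'NL'}
```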

Relevance: 100.00%

Abstract:

Baking and 2-g mixograph analyses were performed for 55 cultivars (19 spring and 36 winter wheat) from various quality classes from the 2002 harvest in Poland. An instrumented 2-g direct-drive mixograph was used to study the mixing characteristics of the wheat cultivars. A number of parameters were extracted automatically from each mixograph trace and correlated with baking volume and flour quality parameters (protein content and high molecular weight glutenin subunit [HMW-GS] composition by SDS-PAGE) using multiple linear regression. Principal component analysis of the mixograph data discriminated between four flour quality classes, and predictions of baking volume were obtained using several selected mixograph parameters, chosen using a best subsets regression routine, giving R² values of 0.862-0.866. In particular, three new spring wheat strains (CHD 502a-c) recently registered in Poland were highly discriminated and predicted to give high baking volume on the basis of two mixograph parameters: peak bandwidth and 10-min bandwidth.
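
The regression step can be made concrete with the short Python sketch below, which fits an ordinary least squares model of baking volume on the two named mixograph parameters and reports R²; the data points are invented for illustration, not the study's measurements.

```python
# A minimal sketch of multiple linear regression:
# baking volume ~ peak bandwidth + 10-min bandwidth, fitted by least squares.
import numpy as np

peak_bw  = np.array([18.2, 21.5, 16.8, 24.1, 19.9, 22.7])  # invented values
bw_10min = np.array([11.4, 13.8, 10.1, 15.2, 12.5, 14.6])
volume   = np.array([520., 590., 495., 640., 555., 610.])  # invented values

X = np.column_stack([np.ones_like(peak_bw), peak_bw, bw_10min])
beta, *_ = np.linalg.lstsq(X, volume, rcond=None)  # [intercept, b1, b2]
pred = X @ beta

ss_res = np.sum((volume - pred) ** 2)
ss_tot = np.sum((volume - volume.mean()) ** 2)
print("coefficients:", beta)
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")
```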

Relevance: 100.00%

Abstract:

The estimation of prediction quality is important because without quality measures, it is difficult to determine the usefulness of a prediction. Currently, methods for ligand binding site residue prediction are assessed in the function prediction category of the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) experiment, utilizing the Matthews Correlation Coefficient (MCC) and Binding-site Distance Test (BDT) metrics. However, the assessment of ligand binding site predictions using such metrics requires the availability of solved structures with bound ligands. Thus, we have developed a ligand binding site quality assessment tool, FunFOLDQA, which utilizes protein feature analysis to predict ligand binding site quality prior to the experimental solution of the protein structures and their ligand interactions. The FunFOLDQA feature scores were combined using simple linear combinations, multiple linear regression and a neural network. The neural network produced significantly better results for correlations to both the MCC and BDT scores, according to Kendall's τ, Spearman's ρ and Pearson's r correlation coefficients, when tested on both the CASP8 and CASP9 datasets. The neural network also produced the largest Area Under the Curve (AUC) score when Receiver Operating Characteristic (ROC) analysis was undertaken for the CASP8 dataset. Furthermore, the FunFOLDQA algorithm incorporating the neural network is shown to add value to FunFOLD when both methods are employed in combination. This results in a statistically significant improvement over all of the best server methods, the FunFOLD method (6.43%), and one of the top manual groups (FN293) tested on the CASP8 dataset. The FunFOLDQA method was also found to be competitive with the top server methods when tested on the CASP9 dataset. To the best of our knowledge, FunFOLDQA is the first attempt to develop a method that can be used to assess ligand binding site prediction quality in the absence of experimental data.
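
The combination-and-evaluation pipeline can be sketched as follows in Python (synthetic data and a generic scikit-learn network, not the FunFOLDQA implementation): a small neural network maps feature scores to a predicted quality value, and agreement with the observed MCC values is measured with the three correlation coefficients named above.

```python
# A minimal sketch: combine per-prediction feature scores with a small
# neural network, then check rank agreement with observed quality scores.
import numpy as np
from scipy.stats import kendalltau, spearmanr, pearsonr
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
features = rng.uniform(0, 1, (200, 4))        # hypothetical feature scores
mcc = (0.5 * features[:, 0] + 0.3 * features[:, 1]
       + rng.normal(0, 0.05, 200))            # synthetic "observed" quality

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(features[:150], mcc[:150])            # train on the first 150 targets
pred = net.predict(features[150:])            # assess on held-out targets

for name, fn in (("Kendall tau", kendalltau),
                 ("Spearman rho", spearmanr),
                 ("Pearson r", pearsonr)):
    print(f"{name}: {fn(pred, mcc[150:])[0]:.3f}")
```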

Relevance: 100.00%

Abstract:

The region of the Toledo River, Paraná, Brazil, is characterized by intense anthropogenic activity. Hence, metal concentrations and physical-chemical parameters of Toledo River water were determined in order to compile an environmental evaluation catalog. Samples were collected monthly during a one-year period at seven different sites from the source down to the river mouth, physical-chemical variables were analyzed, and major metallic ions were measured. Metal analysis was performed using the synchrotron radiation total reflection X-ray fluorescence technique. A statistical analysis was applied to evaluate the reliability of the experimental data. The analysis of the results showed that a strong correlation between physical-chemical parameters existed between sites 1 and 7, suggesting that organic pollutants were mainly responsible for the decrease in Toledo River water quality.

Relevance: 100.00%

Abstract:

Program directors and department chairs require different means of assessing faculty quality due to the unreliability of student course evaluation data. This report outlines alternative strategies for review committees to assess faculty instructional quality. It also details the incorporation of annual performance reviews for tenure-track faculty into tenure decisions.

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

The goal of this work was to establish comparisons between environmental degradation in different areas of southern Spain (Gulf of Cadiz) and Brazil (Santos and São Vicente estuary) by using principal component analysis (PCA) to integrate sediment toxicity (amphipod mortality) and physical-chemical data (Zn, Cd, Pb, Cu, Ni, Co, V, PCB and PAH concentrations, and OC and fines contents). The PCA of the Spanish data showed that in the Bay of Cadiz, CA-1 did not present contamination or degradation; CA-2 exhibited contamination by PCBs, although it was not related to amphipod mortality. The Ria of Huelva was the most impacted site, showing contamination caused principally by hydrocarbons at HV-1 and HV-2, but heavy metals were also important contaminants at HV-1, HV-2 and HV-3. Algeciras Bay was considered not degraded at GR-3 and GR-4, but high contamination by PAHs was found at GR-3'. In the Brazilian area, the most degraded sediments were found at the stations situated in the inner parts of the estuary (SSV-2, SSV-3, and SSV-4), followed by SSV-6, which is close to the Submarine Sewage Outfall of Santos (SSOS). Sediments from SSV-1 and SSV-5 did not present chemical contamination, organic contamination or significant amphipod mortality. The results of this investigation showed that both countries present environmental degradation related to PAHs: in Spain, in the Ria of Huelva and Guadarranque river estuary areas; and in Brazil, in the internal portion of the Santos and São Vicente estuary. The same holds for heavy metals, since all of the identified metals are related to toxicity in the studied areas, with few exceptions (V for both Brazil and Spain, and Cd and Co for the Brazilian areas). Contamination by PCBs is more serious for the Santos and São Vicente estuary than for the investigated areas in the Gulf of Cadiz, where such compounds were not related to toxicity.
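
The PCA integration step can be illustrated with the hedged Python sketch below, in which the station rows and variable values are invented: chemical, grain-size and toxicity variables are standardized and projected onto principal components, and variables loading with the same sign as amphipod mortality on PC1 point to contaminants associated with toxicity.

```python
# A minimal sketch of integrating sediment chemistry and toxicity with PCA:
# standardize the variables, extract components, inspect the PC1 loadings.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

variables = ["Zn", "Pb", "PAHs", "PCBs", "OC", "fines", "mortality"]
data = np.array([            # rows: hypothetical sampling stations
    [120, 35, 0.8, 0.02, 1.1, 40, 10],
    [310, 90, 4.5, 0.15, 2.9, 75, 65],
    [ 95, 20, 0.4, 0.01, 0.8, 30,  5],
    [250, 70, 3.1, 0.10, 2.2, 60, 45],
    [180, 50, 1.9, 0.05, 1.6, 55, 25],
], dtype=float)

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(data))

print("explained variance ratio:", pca.explained_variance_ratio_)
for var, loading in zip(variables, pca.components_[0]):
    # variables with the same-sign loading as mortality suggest a link
    print(f"PC1 loading for {var}: {loading:+.2f}")
```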