885 results for Methods for Multi-criteria Evaluation


Relevance:

40.00%

Publisher:

Abstract:
Amniotic fluid (AF) has been described as a potential source of mesenchymal stem cells (MSCs) for biomedical purposes. Evaluation of alternative cryoprotectants and freezing protocols capable of maintaining the viability and stemness of these cells after cooling is therefore still needed. AF stem cells (AFSCs) were tested with different freezing methods and cryoprotectants, and cell viability, gene expression, surface markers, and plasticity were evaluated after thawing. AFSCs expressed the undifferentiation genes Oct4 and Nanog, presented typical markers (CD29, CD44, CD90, and CD105), and were able to differentiate into mesenchymal lineages. All tested cryoprotectants preserved these features of AFSCs; however, variations in cell viability were observed, with dimethyl sulfoxide (Me2SO) giving the best results. The freezing protocols tested did not promote significant changes in AFSC viability, and both time-programmed and non-programmed freezing methods allowed successful AFSC cryopreservation for 6 months. Although all tested cryoprotectants maintained the undifferentiated gene expression, typical markers, and plasticity of AFSCs, only Me2SO and glycerol yielded workable viability ratios.

Background: Although the release of cardiac biomarkers after percutaneous (PCI) or surgical (CABG) revascularization is common, its prognostic significance is not known. Questions remain about the mechanisms and degree of correlation between biomarker release, the volume of myocardial tissue loss, and the long-term significance. Delayed-enhancement cardiac magnetic resonance (CMR) consistently quantifies areas of irreversible myocardial injury. To investigate the quantitative relationship between irreversible injury and cardiac biomarkers, we will evaluate the extent of irreversible injury in patients undergoing PCI and CABG and relate it to postprocedural changes in cardiac biomarkers and to long-term prognosis. Methods/Design: The study will include 150 patients with multivessel coronary artery disease (CAD) with preserved left ventricular ejection fraction (LVEF) and a formal indication for CABG; 50 patients will undergo CABG with cardiopulmonary bypass (CPB); 50 patients with the same arterial and ventricular condition indicated for myocardial revascularization will undergo CABG without CPB; and another 50 patients with CAD and preserved ventricular function will undergo PCI using stents. All patients will undergo CMR before and after surgery or PCI, and the release of cardiac necrosis markers will be evaluated immediately before and after each procedure. The primary outcome is all-cause death over a 5-year follow-up. Secondary outcomes are levels of the CK-MB isoenzyme and troponin I in association with the presence of myocardial fibrosis and systolic left ventricular dysfunction assessed by CMR. Discussion: The MASS-V Trial aims to establish reliable reference values for enzyme markers of myocardial necrosis in the absence of manifest myocardial infarction after mechanical interventions. Establishing these indices has diagnostic and prognostic value and may call for distinct therapeutic measures.
In daily practice, the inappropriate use of these necrosis markers has led to misdiagnosis and, therefore, to wrong treatment. A more sensitive tool such as CMR provides unprecedented diagnostic accuracy for myocardial damage when correlated with necrosis enzyme markers. We aim to correlate laboratory data with imaging, thereby establishing more refined evidence on the presence or absence of irreversible myocardial injury after the procedure, whether percutaneous or surgical, and with or without the use of cardiopulmonary bypass.

In this work, the reduction reaction of the herbicide paraquat was used to obtain analytical signals with the electrochemical techniques of differential pulse voltammetry, square wave voltammetry, and multiple square wave voltammetry. Analytes were prepared with laboratory-purified water and natural water samples (from the Mogi-Guaçu River, SP). The techniques were applied to 1.0 mol L^-1 Na2SO4 solutions at pH 5.5 containing paraquat at different concentrations, in the range of 1 to 10 μmol L^-1, using a gold ultramicroelectrode. Five replicate experiments were conducted, and in each the mean peak current obtained at -0.70 V vs. Ag/AgCl yielded excellent linear relationships with pesticide concentration. The slopes of the calibration plots (method sensitivity) were 4.06 × 10^-3, 1.07 × 10^-2, and 2.95 × 10^-2 A mol^-1 L for purified water by differential pulse voltammetry, square wave voltammetry, and multiple square wave voltammetry, respectively. For river water samples, the slopes were 2.60 × 10^-3, 1.06 × 10^-2, and 3.35 × 10^-2 A mol^-1 L, respectively, showing only a small interference from natural matrix components in paraquat determination. The detection limits for paraquat were calculated by two distinct methodologies, namely the one proposed by IUPAC and a statistical method. The values obtained with multiple square wave voltammetry were 0.002 and 0.12 μmol L^-1, respectively, for pure water electrolytes. When the detection limit from the IUPAC recommendation is inserted into the calibration curve equation, the resulting analytical signal (oxidation current) is smaller than the one experimentally observed for the blank solution under the same experimental conditions. This is inconsistent with the definition of a detection limit, so the IUPAC methodology requires further discussion. The same conclusion can be drawn from the detection limits obtained with the other techniques studied.
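The IUPAC convention discussed above defines the detection limit as three times the standard deviation of the blank signal divided by the calibration slope. A minimal sketch of that calculation, reusing the multiple square wave voltammetry slope reported for purified water; the blank currents are hypothetical illustrations, not the paper's measurements:

```python
# IUPAC-style detection limit: 3 * stdev(blank) / calibration slope.
# Slope taken from the text (MSWV, purified water); blanks are made up.
import statistics

def lod_iupac(blank_signals, slope):
    """Detection limit in mol/L for a current-vs-concentration calibration."""
    return 3 * statistics.stdev(blank_signals) / slope

slope = 2.95e-2  # A mol^-1 L, from the calibration reported above
blanks = [1.0e-9, 1.2e-9, 0.9e-9, 1.1e-9, 1.0e-9]  # hypothetical blank currents (A)
lod = lod_iupac(blanks, slope)
print(f"IUPAC LOD ~ {lod * 1e6:.4f} umol/L")
```

The check the abstract describes amounts to substituting this LOD back into the calibration equation and comparing the predicted current with the measured blank current.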

Background: Spotted cDNA microarrays generally employ co-hybridization of fluorescently labeled RNA targets to produce gene expression ratios for subsequent analysis. Direct comparison of two RNA samples on the same microarray provides the highest level of accuracy; however, owing to the number of combinatorial pair-wise comparisons, the direct method is impractical for studies including a large number of individual samples (e.g., tumor classification studies). For such studies, indirect comparisons using a common reference standard have been the preferred method. Here we evaluated the precision and accuracy of reconstructed ratios from three indirect methods relative to ratios obtained from direct hybridizations, herein considered the gold standard. Results: We performed hybridizations using a fixed amount of Cy3-labeled reference oligonucleotide (RefOligo) against distinct Cy5-labeled targets from prostate, breast, and kidney tumor samples. Reconstructed ratios between all tissue pairs were derived from the ratios between each tissue sample and RefOligo. These reconstructed ratios were compared to (i) ratios obtained in parallel from direct pair-wise hybridizations of the tissue samples, and to (ii) reconstructed ratios derived from the hybridization of each tissue against a reference RNA pool (RefPool). To evaluate the effect of the external references, reconstructed ratios were also calculated directly from the intensity values of single-channel (One-Color) measurements derived from the tissue sample data collected in the RefOligo experiments. We show that the average coefficients of variation of ratios between intra- and inter-slide replicates derived from RefOligo, RefPool, and One-Color were similar, and 2- to 4-fold higher than for ratios obtained in direct hybridizations. Correlation coefficients calculated for all three tissue comparisons were also similar.
In addition, all indirect methods performed comparably in their robustness for identifying genes deemed differentially expressed on the basis of direct hybridizations, as well as in their false-positive and false-negative rates. Conclusion: RefOligo produces ratios as precise and accurate as ratios reconstructed from an RNA pool, thus representing a reliable alternative in reference-based hybridization experiments. In addition, One-Color measurements alone can reconstruct expression ratios without loss of precision or accuracy. We conclude that both methods are adequate options in large-scale projects, where the amount of a common reference RNA pool is usually restrictive.
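The ratio reconstruction at the heart of these indirect designs can be sketched in a few lines: the A-versus-B expression ratio is recovered from the two reference hybridizations as (A/Ref)/(B/Ref), usually on a log2 scale. The intensity ratios below are hypothetical illustrations:

```python
# Reconstructing an indirect expression ratio through a common reference:
# log2(A/B) = log2(A/Ref) - log2(B/Ref). Values are invented for illustration.
import math

def reconstruct_log2_ratio(a_vs_ref, b_vs_ref):
    """log2(A/B) recovered from two hybridizations against the same reference."""
    return math.log2(a_vs_ref) - math.log2(b_vs_ref)

prostate_vs_ref = 2.0  # hypothetical Cy5/Cy3 ratio, prostate vs RefOligo
kidney_vs_ref = 0.5    # hypothetical Cy5/Cy3 ratio, kidney vs RefOligo
print(reconstruct_log2_ratio(prostate_vs_ref, kidney_vs_ref))  # 2.0, i.e. 4-fold
```

The same arithmetic applies whether the reference channel comes from RefOligo, RefPool, or single-channel intensities, which is why the three designs can be compared head-to-head.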

Hierarchical multi-label classification is a complex classification task in which the classes are hierarchically structured and each example may simultaneously belong to more than one class at each level of the hierarchy. In this paper, we extend our previous work, in which we investigated a local-based classification method that incrementally trains a multi-layer perceptron for each level of the classification hierarchy. Predictions made by the neural network at a given level are used as inputs to the neural network responsible for prediction at the next level. We compare the proposed method with one state-of-the-art global decision-tree induction method and two local decision-tree induction methods on several hierarchical multi-label classification datasets. A thorough experimental analysis shows that our method obtains results competitive with those of a robust global method in terms of both the precision and recall evaluation measures.
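The level-by-level chaining described above can be sketched as follows; a simple nearest-centroid predictor stands in for the paper's multi-layer perceptrons, and the two-level hierarchy and data are invented for illustration:

```python
# Toy sketch of the local, per-level scheme: one classifier per hierarchy
# level, with level-1 predictions appended to the inputs of level 2.
# Nearest-centroid is a stand-in for the MLPs; data/labels are made up.
def nearest_centroid_fit(X, y):
    """Return one centroid per class label."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(yi, []).append(xi)
    return {c: [sum(col) / len(rows) for col in zip(*rows)]
            for c, rows in groups.items()}

def nearest_centroid_predict(model, x):
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda c: dist(model[c], x))

# Two-level hierarchy: level-1 labels in {0, 1}; level-2 labels refine them.
X = [[0.0, 0.1], [0.1, 0.0], [1.0, 0.9], [0.9, 1.0]]
y_level1 = [0, 0, 1, 1]
y_level2 = [0, 0, 2, 3]

m1 = nearest_centroid_fit(X, y_level1)
preds1 = [nearest_centroid_predict(m1, x) for x in X]
# Augment level-2 inputs with level-1 predictions, as in the chained MLPs.
X_aug = [x + [float(p)] for x, p in zip(X, preds1)]
m2 = nearest_centroid_fit(X_aug, y_level2)
preds2 = [nearest_centroid_predict(m2, x) for x in X_aug]
print(preds1, preds2)
```

The design point illustrated is that each level sees the previous level's output as an extra feature, so errors and information both propagate down the hierarchy.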

In the present study we use multivariate analysis techniques to discriminate signal from background in the fully hadronic decay channel of ttbar events. We give a brief introduction to the role of the top quark in the Standard Model and a general description of the CMS experiment at the LHC. We used the CMS computing and software infrastructure to generate and prepare the data samples used in this analysis. We tested the performance of three different classifiers on our data samples and used the selection obtained with the Multi-Layer Perceptron classifier to estimate the statistical and systematic uncertainties on the cross-section measurement.
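As an illustration of classifier-based signal/background separation of the kind tested above, the following sketch trains a single logistic unit by stochastic gradient descent on two made-up event features; the actual analysis used full multivariate classifiers (including a Multi-Layer Perceptron) on simulated CMS samples, so everything below is a toy stand-in:

```python
# Toy signal/background discriminant: one logistic unit, plain SGD.
# Features and sample sizes are invented; the real analysis used TMVA-style
# classifiers on CMS Monte Carlo samples.
import math
import random

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow on separable data
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
signal = [(random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)) for _ in range(200)]
background = [(random.gauss(0.0, 0.3), random.gauss(0.0, 0.3)) for _ in range(200)]
data = [(x, 1) for x in signal] + [(x, 0) for x in background]

w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(50):                # epochs of stochastic gradient descent
    for (f1, f2), label in data:
        p = sigmoid(w[0] * f1 + w[1] * f2 + b)
        grad = p - label           # gradient of the log-loss w.r.t. the logit
        w[0] -= lr * grad * f1
        w[1] -= lr * grad * f2
        b -= lr * grad

accuracy = sum((sigmoid(w[0] * f1 + w[1] * f2 + b) > 0.5) == (label == 1)
               for (f1, f2), label in data) / len(data)
print(accuracy)
```

The classifier output plays the role of the selection variable: a cut on it fixes the signal efficiency and background rejection that feed the cross-section estimate and its uncertainties.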

Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, while still assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic, and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify optimal solutions in process design. Although the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects, and to prevent risk shift. In the present work, a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools are specifically dedicated to implementing sustainability and inherent safety in process and plant design for chemical and industrial processes in which substances dangerous to humans or the environment are used or stored. They are mainly intended for the conceptual and basic design stages, when the project is still open to changes (owing to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools makes a substantial contribution to filling the present gap in sound supports for implementing safety and sustainability in the early phases of process design.
The proposed decision support system is based on a set of leading key performance indicators (KPIs), which allow the assessment of the economic, societal, and environmental impacts of a process (i.e. its sustainability profile). The KPIs are based on impact models (some of them complex) but are easy and quick to apply in practice, and their full evaluation is possible even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to developing reliable criteria and tools for assessing inherent safety at different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based on both the calculation of the expected consequences of potential accidents and the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indices, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for assessing the hazards related to the formation of undesired substances in chemical systems undergoing out-of-control conditions. For the assessment of layout plans, ad hoc tools were developed to account for the hazard of domino escalation and for safety economics. The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant, and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal).
An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data necessary to demonstrate the method for inherent safety assessment of materials.
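The KPI comparison step described above can be sketched as a weighted aggregation of impacts normalized against site-specific reference burdens; the categories, values, and policy weights below are all hypothetical illustrations, not the thesis's actual indices:

```python
# Sketch of KPI aggregation into a single sustainability score: each impact
# is normalized by a reference burden, then combined with policy weights.
# All names, values, and weights are hypothetical.
def aggregate_kpis(kpis, references, weights):
    """Weighted sum of KPIs normalized by site-specific reference burdens."""
    return sum(weights[k] * kpis[k] / references[k] for k in kpis)

design_a = {"economic": 1.2e6, "societal": 3.0, "environmental": 450.0}
design_b = {"economic": 1.0e6, "societal": 5.0, "environmental": 600.0}
refs     = {"economic": 1.0e6, "societal": 4.0, "environmental": 500.0}
wts      = {"economic": 0.4, "societal": 0.3, "environmental": 0.3}

score_a = aggregate_kpis(design_a, refs, wts)
score_b = aggregate_kpis(design_b, refs, wts)
print(score_a, score_b)  # lower aggregated impact is preferable
```

Normalizing before weighting is what lets incommensurable impacts (costs, fatalities, emissions) be traded off on one scale, which is the role the reference criteria play in the decision support system.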

Which event study methods are best in non-U.S. multi-country samples? Nonparametric tests, especially the rank and generalized sign, are better specified and more powerful than common parametric tests, especially in multi-day windows. The generalized sign test is the best statistic but must be applied to buy-and-hold abnormal returns for correct specification. Market-adjusted and market-model methods with local market indexes, without conversion to a common currency, work well. The results are robust to limiting the samples to situations expected to be problematic for test specification or power. Applying the tests that perform best in simulation to merger announcements produces reasonable results.
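The combination the abstract favours, the generalized sign test applied to buy-and-hold abnormal returns, can be sketched as follows; all returns are made-up illustrations, and p_hat (the fraction of positive abnormal returns expected under the null) would normally be estimated from the estimation period:

```python
# Generalized sign test on buy-and-hold abnormal returns (BHARs).
# Returns are invented; p_hat would come from the estimation window.
import math

def bhar(stock_returns, market_returns):
    """Buy-and-hold abnormal return over the event window."""
    hold = lambda rs: math.prod(1 + r for r in rs) - 1
    return hold(stock_returns) - hold(market_returns)

def generalized_sign_z(bhars, p_hat):
    """Test statistic for the count of positive BHARs vs the null fraction."""
    n = len(bhars)
    w = sum(1 for b in bhars if b > 0)
    return (w - n * p_hat) / math.sqrt(n * p_hat * (1 - p_hat))

bhars = [bhar([0.02, 0.01], [0.005, 0.004]),   # firm 1: positive BHAR
         bhar([0.03, -0.01], [0.01, 0.0]),     # firm 2: positive BHAR
         bhar([-0.01, 0.0], [0.002, 0.001])]   # firm 3: negative BHAR
z = generalized_sign_z(bhars, p_hat=0.5)
print(round(z, 3))  # 0.577
```

Compounding returns before differencing (rather than summing daily abnormal returns) is precisely the buy-and-hold adjustment the abstract says the generalized sign test needs for correct specification in multi-day windows.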

Research in art conservation has developed since the early 1950s, giving a significant contribution to the conservation and restoration of cultural heritage artefacts. Indeed, only through a profound knowledge of the nature and condition of the constituent materials can suitable decisions on conservation and restoration measures be adopted and preservation practices enhanced. The study of ancient artworks is particularly challenging, as they can be considered heterogeneous and multilayered systems in which numerous interactions between the different components, as well as degradation and ageing phenomena, take place. However, the difficulty of physically separating the different layers, owing to their thickness (1-200 μm), can result in the inaccurate attribution of the identified compounds to a specific layer. Such details can therefore only be analysed when the sample preparation method leaves the layer structure intact, as with cross-sections embedded in synthetic resins. Hence, spatially resolved analytical techniques are required not only to characterize the nature of the compounds exactly but also to obtain precise chemical and physical information about ongoing changes. This thesis focuses on the application of FTIR microspectroscopic techniques to cultural heritage materials. The first section introduces the use of FTIR microscopy in conservation science, with particular attention to sampling criteria and sample preparation methods. The second section evaluates and validates different FTIR microscopic analytical methods applied to art conservation issues that may be encountered with cultural heritage artefacts: the characterisation of the artistic execution technique (chapter II-1), studies of degradation phenomena (chapter II-2), and the evaluation of protective treatments (chapter II-3).
The third and last section is divided into three chapters which present recent developments in FTIR spectroscopy for the characterisation of paint cross-sections, and in particular of thin organic layers: a newly developed preparation method with embedding systems in infrared-transparent salts (chapter III-1), the new opportunities offered by macro-ATR imaging spectroscopy (chapter III-2), and the possibilities offered by the different FTIR microspectroscopic techniques available today (chapter III-3). In chapter II-1, FTIR microspectroscopy, as a molecular analysis technique, is presented in an integrated approach with other analytical techniques. The proposed sequence is optimized for the limited quantity of sample available, and this methodology makes it possible to identify the painting materials and to characterise the execution technique adopted and the state of conservation. Chapter II-2 describes the characterisation of degradation products with FTIR microscopy, since the investigation of the ageing processes encountered in old artefacts represents one of the most important issues in conservation research. Metal carboxylates resulting from the interaction between pigments and binding media are characterized using synthesised metal palmitates, and their production is detected on copper-, zinc-, manganese- and lead-based pigments (the latter associated with lead carbonate) dispersed either in oil or in egg tempera. Moreover, significant effects are obtained with iron and cobalt (acceleration of triglyceride hydrolysis), and, for the first time on sienna and umber paints, manganese carboxylates are also observed. Finally, in chapter II-3, FTIR microscopy is combined with further elemental analyses to characterise and assess the performance and stability of newly developed treatments intended to better address conservation-restoration problems.
In the second part, in chapter III-1, an innovative embedding system in potassium bromide is reported, focusing on the characterisation and localisation of organic substances in cross-sections. Not only the identification but also the distribution of proteinaceous, lipidic, or resinaceous materials is evidenced directly on different paint cross-sections, especially in thin layers of the order of 10 μm. Chapter III-2 describes the use of a conventional diamond ATR accessory coupled with a focal plane array to obtain chemical images of multi-layered paint cross-sections; a rapid and simple identification of the different compounds is achieved without the use of any infrared microscope objectives. Finally, the latest available FTIR techniques are highlighted in chapter III-3 in a comparative study of the characterisation of paint cross-sections. Results in terms of spatial resolution, data quality, and chemical information obtained are presented; in particular, a new FTIR microscope equipped with a linear array detector, which reduces the spatial resolution limit to approximately 5 μm, provides very promising results and may represent a good alternative to either mapping or imaging systems.

This thesis evaluated in vivo and in vitro enamel permeability under different physiological and clinical conditions by means of SEM inspection of replicas of the enamel surface, obtained from polyvinyl siloxane impressions subsequently cast in polyether impression material. This non-invasive and risk-free technique allows the evaluation of fluid outflow from the enamel surface and can detect the presence of small quantities of fluid, visualized as droplets. Fluid outflow at the enamel surface represents enamel permeability. This property is of paramount importance in enamel physiology and pathology, although its effective role in adhesion, caries pathogenesis, and prevention is still not fully understood. The aim of the studies proposed was to evaluate changes in enamel permeability under different conditions and to correlate the findings with current knowledge about enamel physiology, caries pathogenesis, fluoride, and etching treatments. To corroborate the data, the replica technique was supported by other specific techniques such as Raman and IR spectroscopy and EDX analysis. The first study visualized fluid movement through dental enamel in vivo, confirmed that enamel is a permeable substrate, and demonstrated that age and enamel permeability are closely related. Samples from subjects of different ages showed a decreasing number and size of droplets with increasing age: freshly erupted permanent teeth showed many droplets covering the entire enamel surface, prominent along the enamel perikymata. These results, obtained through SEM inspection of replicas, allowed innovative observations on enamel physiology. An analogous test was then developed to evaluate the permeability of primary enamel.
The results of this second study showed that primary enamel has a substantial permeability, with droplets covering the entire enamel surface without any specific localization, in accordance with its histological features, and without the changes during aging that mark post-eruptive maturation. These results are consistent with clinical data showing a higher caries susceptibility for primary enamel and suggest a strong relationship between caries susceptibility and enamel permeability. Topical fluoride application represents the gold standard for caries prevention, although the mechanism of the cariostatic effect of fluoride still needs to be clarified. The effects of topical fluoride application on enamel permeability were therefore evaluated; in particular, two different treatments (NaF and APF), with different pH, were examined. The major product of topical fluoride application was the deposition of CaF2-like globules. Inspection of replicas before and after both treatments, at different time intervals and after specific additional clinical interventions, showed that the globules formed in vivo could be removed by professional toothbrushing, sonically, or chemically by KOH. With regard to enamel permeability, fluoride treatments temporarily reduced enamel water permeability even when the CaF2-like globules were removed: the in vivo persistence of decreased enamel permeability after globule removal was demonstrated for 1 h for NaF-treated teeth and for at least 7 days for APF-treated teeth. Important clinical considerations follow from these results: the caries-preventing action of fluoride application may be due, in part, to its ability to decrease enamel water permeability, and CaF2-like globules seem to be indirectly involved in enamel protection over time by maintaining low permeability.
Further results, obtained by metallographic microscopy and SEM/EDX analyses of fluoride-releasing and non-fluoride-releasing orthodontic resins, demonstrated the relevance of topical fluoride application in reducing demineralization marks and in modifying the chemical composition of the enamel in the treated area. The data obtained in both experiments confirmed the efficacy of fluoride in caries prevention and contribute to clarifying its mechanism of action. Adhesive dentistry is the gold standard for caries treatment and tooth rehabilitation and is founded on important chemical and physical principles involving both the enamel and dentine substrates. In particular, acid etching of dental enamel is usually employed in bonding procedures to increase microscopic roughness, and different acids and etching procedures have been tested in the literature. The acid-induced structural transformations in enamel after different etching treatments were evaluated by Raman and IR spectroscopy, and the findings were correlated with enamel permeability. Conventional etching with 37% phosphoric acid gel (H3PO4) for 30 s and etching with 15% HCl for 120 s were investigated. Raman and IR spectroscopy showed that treatment with both hydrochloric and phosphoric acid induced a decrease in the carbonate content of the enamel apatite; at the same time, both acids induced the formation of HPO4^2- ions. After H3PO4 treatment the bands due to the organic component of enamel decreased in intensity, while after HCl treatment they increased. Replicas of H3PO4-treated enamel showed a strongly reduced permeability, while replicas of 15% HCl-treated samples showed a maintained permeability. A decrease in the enamel organic component, as occurs after H3PO4 treatment, thus entails a decrease in enamel permeability, while an increase in the organic matter (achieved by HCl treatment) maintains enamel permeability.
These results suggest a correlation between the amount of organic matter, enamel permeability, and caries. Overall, the studies carried out in this thesis contribute to clarifying and improving knowledge of enamel properties, with important implications for both theoretical and clinical aspects of dentistry.

Alzheimer's disease (AD) and cancer represent two of the main causes of death worldwide. They are complex multifactorial diseases, and several biochemical targets have been recognized to play a fundamental role in their development. Given their complex nature, a promising therapeutic approach is the so-called "Multi-Target-Directed Ligand" (MTDL) approach. This strategy is based on the assumption that a single molecule can hit several targets responsible for the onset and/or progression of the pathology. In AD in particular, most currently prescribed drugs aim to increase the level of acetylcholine in the brain by inhibiting the enzyme acetylcholinesterase (AChE). However, clinical experience shows that AChE inhibition is a palliative treatment, and the simple modulation of a single target does not address AD aetiology. Research into newer and more potent anti-AD agents is thus focused on compounds whose properties go beyond AChE inhibition (such as inhibition of the enzyme beta-secretase and inhibition of beta-amyloid aggregation). The MTDL strategy therefore seems a more appropriate approach for addressing the complexity of AD and may provide new drugs for tackling its multifactorial nature. This thesis describes the design of new MTDLs able to tackle the multifactorial nature of AD. The new MTDLs designed are less flexible analogues of caproctamine, one of the first MTDLs possessing biological properties useful for AD treatment. These new compounds are able to inhibit the enzymes AChE and beta-secretase and to inhibit both AChE-induced and self-induced beta-amyloid aggregation. In particular, the most potent compound of the series inhibits AChE in the subnanomolar range, inhibits beta-secretase at micromolar concentration, and inhibits both AChE-induced and self-induced beta-amyloid aggregation at micromolar concentrations.
Cancer, like AD, is a very complex pathology, and many different therapeutic approaches are currently in use for its treatment. Owing to its multifactorial nature, the MTDL approach could, in principle, also be applied to this pathology. A further aim of this thesis was therefore the development of new molecules bearing different structural motifs able to interact simultaneously with some of the multitude of targets responsible for the pathology. The designed compounds displayed cytotoxic activity in different cancer cell lines. The most potent compounds of the series were evaluated further: they were able to bind DNA, proving 100-fold more potent than the reference compound mitonafide, to trigger apoptosis through caspase activation, and to inhibit PIN1 (preliminary result). This last protein is a very promising target because it is overexpressed in many human cancers, functions as a critical catalyst for multiple oncogenic pathways, and, in several cancer cell lines, its depletion determines arrest of mitosis followed by apoptosis induction. In conclusion, this study may represent a promising starting point for the development of new MTDLs, hopefully useful for cancer and AD treatment.

Different tools have been used to set up and apply the model for the fulfilment of the objective of this research. 1. The model. The base model used is the Analytic Hierarchy Process (AHP), adapted with the aim of performing a benefit-cost analysis. The AHP, developed by Thomas Saaty, is a multi-criteria decision-making technique which decomposes a complex problem into a hierarchy. It is used to derive ratio scales from both discrete and continuous paired comparisons in multilevel hierarchic structures. These comparisons may be taken from actual measurements or from a fundamental scale that reflects the relative strength of preferences and feelings. 2. Tools and methods. 2.1. The Expert Choice software. The software Expert Choice is a tool that allows each operator to easily implement the AHP model at every stage of the problem. 2.2. Personal interviews with the farms. For this research, the EMAS-certified farms of the Emilia-Romagna region were identified; the information was provided by the EMAS centre in Vienna. Personal interviews were carried out at each farm in order to obtain a complete and realistic judgement for each criterion of the hierarchy. 2.3. Questionnaire. A supporting questionnaire was also delivered and used for the interviews. 3. Elaboration of the data. After data collection, the data were elaborated with the support of the Expert Choice software. 4. Results of the analysis. The figures above (see other document) give a series of numbers which are fractions of unity; each is to be interpreted as the relative contribution of that element to the fulfilment of the relative objective.
Calculating the benefit/cost ratio for each alternative, the following is obtained. Alternative one (implement EMAS): benefits priority 0.877, costs priority 0.815, benefit/cost ratio 0.877/0.815 = 1.08. Alternative two (do not implement EMAS): benefits priority 0.123, costs priority 0.185, benefit/cost ratio 0.123/0.185 = 0.66. As stated above, the alternative with the highest ratio is the best solution for the organization; the research carried out and the model implemented therefore suggest that EMAS adoption is the best alternative for the agricultural sector. It must be noted, however, that the ratio of 1.08 is a relatively low positive value. This shows the fragility of the conclusion and suggests a careful examination of the benefits and costs of each farm before adopting the scheme. On the other hand, the result should be taken into consideration by policy makers in order to strengthen their interventions regarding adoption of the scheme in the agricultural sector. According to the AHP elaboration of the judgements, the main considerations on benefits are the following: legal compliance seems to be the most important benefit for the agricultural sector, with a priority of 0.471; the next two most important benefits are improved internal organization (0.230) followed by competitive advantage (0.221), the latter mostly due to the sub-element improved image (0.743). Finally, even though incentives are not ranked among the most important elements, the financial ones seem to have been decisive in the decision-making process. On the costs side: external costs seem to be far more important than internal ones (0.857 versus 0.143), suggesting that EMAS costs for consultancy and verification remain the biggest obstacle, and the implementation of the EMS is the most demanding element among the internal costs (0.750).
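The AHP mechanics behind these numbers can be sketched as follows: priorities are approximated from a pairwise comparison matrix by row geometric means (a common approximation to Saaty's principal-eigenvector method), and alternatives are then compared by their benefit/cost ratios. The 3x3 matrix is a hypothetical illustration; the final ratios reuse the values reported in the text:

```python
# AHP sketch: priority vector via row geometric means, then benefit/cost
# ratios. The pairwise matrix is invented; the B/C values are from the text.
import math

def priorities(matrix):
    """Row geometric means, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise judgements on Saaty's 1-9 scale for three benefits.
pairwise = [[1.0, 3.0, 2.0],
            [1 / 3.0, 1.0, 1.0],
            [1 / 2.0, 1.0, 1.0]]
print([round(p, 3) for p in priorities(pairwise)])

# Benefit/cost ratios with the aggregate priorities reported in the text.
print(round(0.877 / 0.815, 2), round(0.123 / 0.185, 2))  # 1.08 0.66
```

The geometric-mean shortcut reproduces the eigenvector priorities exactly for consistent matrices and closely for mildly inconsistent ones, which is why tools like Expert Choice can report both priorities and a consistency check from the same judgements.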

Relevância:

40.00%

Publicador:

Resumo:

The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD Thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care must be taken to test the capabilities of each telescope/instrument combination (through the instrument familiarization plan) and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline that allows for the pre-reduction of imaging SPSS data and the production of aperture-photometry catalogues ready to be used for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
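The core measurement in such a pipeline, aperture photometry with a local sky annulus, can be illustrated with a minimal pure-numpy sketch. This is an assumption-laden toy (fixed circular aperture, median-sky annulus, one crude edge-based quality flag), not the pipeline developed in the thesis:

```python
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    """Fixed-aperture photometry on a 2-D image: sum the flux within
    radius r_ap of (x0, y0) and subtract the median sky level measured
    in the annulus r_in <= r < r_out."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(r >= r_in) & (r < r_out)])  # per-pixel background
    flux = (image[r < r_ap] - sky).sum()
    # toy quality-control flag: reject apertures touching the frame edge
    ok = (r_ap <= x0 <= image.shape[1] - 1 - r_ap and
          r_ap <= y0 <= image.shape[0] - 1 - r_ap)
    return flux, sky, ok

# Synthetic frame: flat sky of 10 counts plus a point source of 500 counts.
img = np.full((64, 64), 10.0)
img[32, 32] += 500.0
flux, sky, ok = aperture_photometry(img, 32, 32, r_ap=5, r_in=8, r_out=12)
print(round(flux), round(sky), ok)  # 500 10 True
```

Running the same measurement on each frame of a time series, with the source flux divided by that of nearby comparison stars, yields the relative short-term light curves mentioned above.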

Relevância:

40.00%

Publicador:

Resumo:

Although several clinical tests have been developed to qualitatively describe complex motor tasks through functional testing, these methods often depend on the clinician's interpretation, experience and training, which makes the assessment results inconsistent and deprives them of the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the lack of objectivity inherent in individual judgment and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and the trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this Thesis aims: i) to propose a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for upper-limb joint kinematics estimation, considering a 3-link kinematic chain during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could benefit clinical practice in several ways.
The use of objective biomechanical measurements, provided by inertial sensor-based techniques, may help clinicians to: i) objectively track changes in motor ability, ii) provide timely feedback about the effectiveness of administered rehabilitation interventions, iii) enable intervention strategies to be modified or changed if found to be ineffective, and iv) speed up the experimental sessions when several subjects are asked to perform different functional tests.
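The single-axis-accelerometer approach mentioned above typically relies on the quasi-static assumption: during slow tasks such as squat or sit-to-stand, the sensor mainly measures the gravity component along its axis, so segment inclination can be recovered by inverting that projection. The sketch below is an illustrative simplification, not the thesis method, and assumes the sensing axis lies horizontal when the segment is upright, so that a = g·sin(theta):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def segment_inclination(acc, g=G):
    """Quasi-static inclination estimate from a single-axis accelerometer
    mounted on a body segment. Under slow movement the reading is the
    gravity projection a = g * sin(theta), so theta = arcsin(a / g).
    Readings are clipped to [-g, g] so arcsin stays defined when sensor
    noise pushes |a| slightly above g."""
    a = np.clip(np.asarray(acc, dtype=float), -g, g)
    return np.degrees(np.arcsin(a / g))

# Hypothetical samples from a slow trunk lean: upright, half-leaned,
# horizontal (values chosen to give round angles).
angles = segment_inclination([0.0, 4.905, 9.81])
print(angles)
```

With one such angle per instrumented segment, differences between adjacent segment angles give a first-order picture of the inter-segmental coordination the text refers to.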