836 results for Methodological standards


Relevance:

20.00%

Publisher:

Abstract:

3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Because it is based on direct radiographic imaging of the joint, and thus avoids the soft-tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of a millimetre/degree or better. It can provide fundamental information for clinical and methodological applications but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is still required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, has slowed their translation into clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated in preliminary in-silico studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolution, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, and (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified as the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process.
To solve this problem, two different approaches were followed. To enlarge the convergence basin around the optimal pose, the local approach used sequential alignment of the 6 degrees of freedom in order of sensitivity, or a geometric feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated in a series of in-silico studies and validated in-vitro through a phantom-based comparison with a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for the out-of-plane translation, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as in methodological research studies; the mono-planar analysis may suffice for clinical applications where analysis time and cost are an issue. A further reduction of user interaction was obtained for prosthetic joint kinematics: a mixed region-growing and level-set segmentation method was proposed that halved the analysis time, delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semi-automatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied in a first in-vivo methodological study of foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
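The sequential, sensitivity-ordered alignment idea can be illustrated with a deliberately simplified sketch (not the thesis's actual fluoroscopic pipeline): a 2D rigid pose (two translations and one rotation) is recovered by refining one parameter at a time on a shrinking 1-D search grid. The point sets, parameter order and grid widths below are all hypothetical.

```python
import numpy as np

def rigid_transform(pts, tx, ty, theta):
    """Apply a 2D rigid transform (rotation about the origin, then translation)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return pts @ R.T + np.array([tx, ty])

def cost(params, model, target):
    """Mean squared point-to-point distance after transforming the model."""
    return np.mean(np.sum((rigid_transform(model, *params) - target) ** 2, axis=1))

def sequential_align(model, target, order=(0, 1, 2), n_passes=5):
    """Refine one pose parameter at a time, in a fixed (sensitivity) order,
    shrinking the 1-D search grid around the current estimate each pass."""
    params = np.zeros(3)
    spans = np.array([5.0, 5.0, np.pi / 4])  # initial search half-widths
    for _ in range(n_passes):
        for i in order:
            grid = params[i] + np.linspace(-spans[i], spans[i], 41)
            trials = []
            for g in grid:
                p = params.copy()
                p[i] = g
                trials.append(cost(p, model, target))
            params[i] = grid[int(np.argmin(trials))]
        spans *= 0.5  # halve the convergence basin we search in
    return params

rng = np.random.default_rng(0)
model = rng.normal(size=(30, 2))          # synthetic "bone model" points
true_pose = (1.2, -0.7, 0.3)              # tx, ty, theta to recover
target = rigid_transform(model, *true_pose)
est = sequential_align(model, target)
```

The sequential scheme trades optimality for a much larger effective convergence basin: each 1-D grid can span a wide range cheaply, which is exactly why it reduces the need for a user-supplied starting pose.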


Many physiological and pathological processes are mediated by the activity of proteins assembled in homo- and/or hetero-oligomers. The correct recognition and association of these proteins into a functional complex is a key step determining the fate of the whole pathway. This has led to an increasing interest in selecting molecules able to modulate or inhibit these protein-protein interactions. In particular, our research focused on Heat Shock Protein 90 (Hsp90), responsible for the activation, maturation, and disposition of many client proteins [1], [2], [3]. Circular Dichroism (CD) spectroscopy, Surface Plasmon Resonance (SPR) and Affinity Capillary Electrophoresis (ACE) were used to characterize the Hsp90 target and its inhibition via the C-terminal domain, driven by the small molecule Coumermycin A1. Circular Dichroism was used as a powerful technique to characterize Hsp90 and its co-chaperone Hop in solution, in terms of secondary structure content and stability at different pH values, temperatures and solvents. CD was also used to characterize ATP but, unfortunately, we were not able to monitor an interaction between ATP and Hsp90. The utility of SPR technology, on the other hand, arises from the possibility of immobilizing the protein on a chip through its N-terminal domain in order to study the interaction with small molecules able to disrupt Hsp90 dimerization at the C-terminal domain. The protein was attached to the SPR chip using "amine coupling" chemistry, so that the C-terminal domain remained free to interact with Coumermycin A1. The goal of the experiment was achieved by testing a range of concentrations of Coumermycin A1. Despite the large difference in molecular weight between the protein (90 kDa) and the drug (1110.08 Da), we were able to calculate the affinity constant of the interaction, which was found to be 11.2 µM.
To confirm the binding constant calculated for Hsp90 on the chip, we used Capillary Electrophoresis to test Coumermycin binding to Hsp90. First, this technique was conveniently used to characterize the Hsp90 sample in terms of composition and purity. The experimental conditions were established for two different systems, a bare fused-silica and a PVA-coated capillary, and we were able to characterize the Hsp90 sample in both. Furthermore, we employed Affinity Capillary Electrophoresis (ACE) to measure and confirm the binding constant calculated for Coumermycin on the optical biosensor, finding a KD = 19.45 µM. This result compares favorably with the KD previously obtained on the biosensor, and is promising for the use of our novel approach to screen new potential inhibitors of the Hsp90 C-terminal domain.
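As an aside, affinity constants like those above come from fitting a 1:1 binding model to a titration series. A minimal sketch of such a fit, with made-up response data rather than the actual SPR or ACE measurements, might look like:

```python
import numpy as np

def fraction_bound(conc, kd):
    # 1:1 binding isotherm: theta = [L] / (KD + [L])
    return conc / (kd + conc)

kd_true = 11.2  # µM, the value reported for Coumermycin A1 on the biosensor
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])  # ligand titration, µM
rng = np.random.default_rng(1)
# simulated normalized steady-state responses with a little noise
response = fraction_bound(conc, kd_true) + rng.normal(0.0, 0.01, conc.size)

# least-squares fit of KD over a log-spaced grid of candidate values
grid = np.logspace(-1, 3, 2000)
sse = np.array([np.sum((response - fraction_bound(conc, k)) ** 2) for k in grid])
kd_fit = float(grid[int(np.argmin(sse))])
```

The grid search stands in for the nonlinear least-squares fitting that biosensor software performs; the point is only that a single KD parameter fully determines the saturation curve in a 1:1 model.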


Agri-food supply chains extend beyond national boundaries, partially facilitated by a policy environment that encourages more liberal international trade. Rising concentration within the downstream sector has driven a shift towards "buyer-driven" global value chains (GVCs) that extend internationally, with global sourcing and the emergence of multinational key economic players competing with increased emphasis on product quality attributes. Agri-food systems are thus increasingly governed by a range of inter-related public and private standards, both of which are becoming de facto mandatory, especially in supply chains for high-value and quality-differentiated agri-food products; these standards strongly affect upstream agricultural practices and firms' internal organization and strategic behaviour, and shape the organization of the food chain. Notably, increasing attention has been given to the impact of sanitary and phytosanitary (SPS) measures on agri-food trade, and particularly on developing countries' export performance. Food and agricultural trade is the vital link in the mutual dependency of the global trade system and developing countries, which derive a substantial portion of their income from it. In Morocco, fruits and vegetables (especially fresh produce) are the primary agricultural exports. Because of its labor intensity, this sector (especially citrus and tomatoes) is particularly important in terms of income and employment generation, notably for the female laborers hired in farms and packing houses. The emergence of agricultural and agri-food product safety issues and the subsequent tightening of market requirements have, however, challenged these mutual gains, owing to the lack of technical and financial capacity in most developing countries.


This research was designed to answer the question of which direction the restructuring of financial regulators should take: consolidation or fragmentation. It began by examining the need for financial regulation and its related costs, then described the types of regulatory structures that exist in the world, surveying the regulatory structures in 15 jurisdictions, comparing them, and discussing their strengths and weaknesses. The possible regulatory structures were analyzed using three methodological tools: game theory, institutional design, and network effects. The incentives for regulatory action were examined in Chapter Four using game-theoretic concepts. That chapter predicted how two regulators with overlapping supervisory mandates would behave in two different states of the world (one in which they stand to benefit from regulating and one in which they stand to lose). The insights derived from these games were then used to analyze the different supervisory models that exist in the world. The problem of information flow was discussed in Chapter Five using tools from institutional design; the core idea is that the right kind of information must reach the decision maker in the shortest time possible in order to predict, mitigate or stop a financial crisis. Network effects and congestion in the context of financial regulation were discussed in Chapter Six, which applied the general literature on network effects to assess whether consolidating financial regulatory standards at the global level might also yield other positive network effects. Returning to the main research question, this research concluded that the fragmented model should generally be preferable to the consolidated model, as it allows for greater diversity and information flow. However, in cases in which close cooperation between two authorities is essential, the consolidated model should be used.
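The kind of two-regulator interaction analyzed in Chapter Four can be sketched as a bimatrix game. The payoffs below are purely hypothetical, chosen to illustrate a state of the world in which regulation is costly duplication, so each regulator prefers that the other one act.

```python
import numpy as np

def pure_nash(A, B):
    """Pure-strategy Nash equilibria of a bimatrix game.
    A[i, j]: row player's payoff; B[i, j]: column player's payoff."""
    eqs = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            # (i, j) is an equilibrium if neither player can gain by deviating
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eqs.append((i, j))
    return eqs

# Strategies: 0 = Regulate, 1 = Abstain.  Regulating alone is costly but
# valuable; duplicated regulation wastes resources; no regulation is worst.
A = np.array([[1, 3],
              [4, 0]])  # row regulator's payoffs
B = A.T                 # symmetric game: column regulator's payoffs
equilibria = pure_nash(A, B)
```

With these numbers the equilibria are the two asymmetric outcomes in which exactly one regulator acts, a free-riding pattern consistent with the chapter's concern about overlapping mandates.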


Public awareness that chemical substances are ubiquitous in the environment, can be ingested through the diet, and can have various health effects is very high in Europe and in Italy. National and international institutions are called upon to provide figures on the magnitude, frequency, and duration of the population's exposure to chemicals, including both natural and anthropogenic substances, whether voluntarily added to consumer goods or accidentally entering production chains. This thesis focuses broadly on how human population exposure to chemicals can be estimated, with particular attention to methodological approaches and a specific focus on dietary exposure assessment and biomonitoring. The results of the different studies collected in this thesis point out that, when selecting the approach for estimating exposure to chemicals, several aspects must be taken into account: the nature of the chemical substance; the population of interest; whether the objective is to assess chronic or acute exposure; and, finally, the quality and quantity of the available data, in order to specify and quantify the uncertainty of the estimate.
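As a minimal illustration of the deterministic dietary-exposure arithmetic that underlies such assessments (all concentrations, consumption figures and the TDI below are hypothetical, not taken from the studies in the thesis):

```python
import numpy as np

# chronic exposure (µg/kg bw/day) = sum over foods of
# concentration × consumption, divided by body weight,
# then compared against a tolerable daily intake (TDI)
concentration = np.array([12.0, 3.5, 0.8])      # µg contaminant per kg food
consumption = np.array([0.150, 0.300, 0.050])   # kg of each food eaten per day
body_weight = 70.0                              # kg, default adult body weight

exposure = float((concentration * consumption).sum() / body_weight)

tdi = 0.1  # hypothetical tolerable daily intake, µg/kg bw/day
within_limit = exposure < tdi
```

A probabilistic assessment would replace the point values with distributions (e.g. Monte Carlo sampling of consumption and concentration), which is one way the uncertainty mentioned above gets quantified.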


In the first chapter, I develop a panel no-cointegration test which extends the bounds test of Pesaran, Shin and Smith (2001) to the panel framework by considering the individual regressions in a Seemingly Unrelated Regression (SUR) system. This makes it possible to account for unobserved common factors that contemporaneously affect all units of the panel while providing, at the same time, unit-specific test statistics. Moreover, the approach is particularly suited when the number of individuals in the panel is small relative to the number of time-series observations. I develop the algorithm to implement the test and use Monte Carlo simulation to analyze its properties. The small-sample properties of the test are remarkable compared to its single-equation counterpart. I illustrate the use of the test with a test of Purchasing Power Parity in a panel of EU15 countries. In the second chapter of my PhD thesis, I verify the Expectations Hypothesis of the Term Structure (EHTS) in the repurchase agreements (repo) market with a new testing approach. I consider an "inexact" formulation of the EHTS, which models a time-varying component in the risk premia, and I treat the interest rates as a non-stationary cointegrated system. The effect of heteroskedasticity is controlled by means of testing procedures (bootstrap and heteroskedasticity correction) which are robust to variance and covariance shifts over time. I find that the long-run implications of the EHTS are verified. A rolling-window analysis clarifies that the EHTS is only rejected in periods of turbulence in financial markets. The third chapter introduces the Stata command "bootrank", which implements the bootstrap likelihood ratio rank test algorithm developed by Cavaliere et al. (2012). The command is illustrated through an empirical application on the term structure of interest rates in the US.
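As a toy analogue of the heteroskedasticity-robust bootstrap procedures mentioned above (not the actual panel cointegration or rank tests), the following sketch computes a wild-bootstrap p-value for a simple location hypothesis on simulated data:

```python
import numpy as np

def wild_bootstrap_pvalue(y, reps=999, seed=0):
    """Wild (Rademacher) bootstrap p-value for H0: E[y] = 0.
    Robust to heteroskedasticity: each residual keeps its own scale,
    and only its sign is randomised when imposing the null."""
    rng = np.random.default_rng(seed)
    n = y.size
    t_obs = abs(y.mean() / (y.std(ddof=1) / np.sqrt(n)))
    e = y - y.mean()  # residuals under the unrestricted model
    exceed = 0
    for _ in range(reps):
        yb = e * rng.choice([-1.0, 1.0], size=n)  # resample under H0
        tb = abs(yb.mean() / (yb.std(ddof=1) / np.sqrt(n)))
        exceed += tb >= t_obs
    return (1 + exceed) / (1 + reps)

rng = np.random.default_rng(42)
p_null = wild_bootstrap_pvalue(rng.normal(0.0, 1.0, 200))  # H0 true
p_alt = wild_bootstrap_pvalue(rng.normal(0.5, 1.0, 200))   # H0 false
```

The sign-flipping scheme preserves each observation's variance, which is the same robustness-to-variance-shifts idea that motivates the wild bootstrap in the cointegration setting.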


Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" applied to computations is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of allowing scientists to submit, together with the article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and above all to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to conclude that, in addition to the data sets and the software, one further element must be provided: the workflow that ties all of them together.
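The point about providing the workflow alongside data and software can be sketched with a deliberately minimal, hypothetical example: a chain of named steps whose inputs and outputs are hashed, so that anyone re-running the chain can check that they reproduce the same intermediate and final results.

```python
import hashlib

def run_workflow(steps, data):
    """Run named steps in order, logging a short hash of the data before
    and after each one.  The log is the machine-checkable trace of the
    workflow: matching hashes on a re-run mean the result was reproduced."""
    log = []
    for name, fn in steps:
        before = hashlib.sha256(repr(data).encode()).hexdigest()[:12]
        data = fn(data)
        after = hashlib.sha256(repr(data).encode()).hexdigest()[:12]
        log.append({"step": name, "in": before, "out": after})
    return data, log

# a toy two-step analysis: square every value, then total them
steps = [
    ("square", lambda xs: [x * x for x in xs]),
    ("total", lambda xs: sum(xs)),
]
result, log = run_workflow(steps, [1, 2, 3])
```

Real workflow systems add dependency tracking and environment capture, but the core idea is the same: the ordered steps, not just the raw inputs and code, are part of the published artifact.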


Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanation and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised.
It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanation and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials.



This paper addresses methodological issues in the field of tooth wear and erosion research, including the epidemiological indices used, and identifies future work needed to improve knowledge about tooth wear and erosion.


Systematic reviews are not an assembly of anecdotes but a distillation of the current best available evidence on a particular topic, and as such they have an important role to play in evidence-based healthcare. A substantial proportion of systematic reviews focus on interventions, giving clinicians the opportunity to understand the best available evidence on the effects of healthcare interventions and to translate it into clinical practice. The importance of systematic reviews in summarising evidence and identifying the gaps that might inform new research initiatives is also widely acknowledged. Their potential impact on practice and research makes their methodological quality especially important, as it may directly influence their utility for clinicians, patients and policy makers. The objectives of this study were to identify systematic reviews of oral healthcare interventions published in the Journal of Applied Oral Science (JAOS) and to evaluate their methodological quality using the AMSTAR evaluation tool.


The widespread use of artificial nestboxes has led to significant advances in our knowledge of the ecology, behaviour and physiology of cavity-nesting birds, especially small passerines. Nestboxes have made it easier to perform routine monitoring and experimental manipulation of eggs or nestlings, and also repeatedly to capture, identify and manipulate the parents. However, when comparing results across study sites, the use of nestboxes may also introduce a potentially significant confounding variable in the form of differences in nestbox design amongst studies, such as their physical dimensions, placement height, and the way in which they are constructed and maintained. Here we review to what extent the characteristics of artificial nestboxes (e.g. size, shape, construction material, colour) are documented in the 'methods' sections of publications involving hole-nesting passerine birds using natural or excavated cavities or artificial nestboxes for reproduction and roosting. Despite explicit previous recommendations that authors describe in detail the characteristics of the nestboxes used, we found that the description of nestbox characteristics in most recent publications remains poor and insufficient. We therefore list the types of descriptive data that should be included in the methods sections of relevant manuscripts, and justify this by discussing how variation in nestbox characteristics can affect or confound conclusions from nestbox studies. We also propose several recommendations to improve the reliability and usefulness of research based on long-term studies of any secondary hole-nesting species using artificial nestboxes for breeding or roosting.