835 results for sampling methodology


Relevance: 20.00%

Abstract:

BACKGROUND: The psychiatric arm of the population-based CoLaus study (PsyCoLaus) is designed to: 1) establish the prevalence of threshold and subthreshold psychiatric syndromes in the 35 to 66 year-old population of the city of Lausanne (Switzerland); 2) test the validity of postulated definitions for subthreshold mood and anxiety syndromes; 3) determine the associations between psychiatric disorders, personality traits and cardiovascular diseases (CVD); and 4) identify genetic variants that can modify the risk for psychiatric disorders and determine whether genetic risk factors are shared between psychiatric disorders and CVD. This paper presents the method as well as the somatic and sociodemographic characteristics of the sample. METHODS: All 35 to 66 year-old persons previously selected for the population-based CoLaus survey on risk factors for CVD were asked to participate in a substudy assessing psychiatric conditions. This investigation included the Diagnostic Interview for Genetic Studies to elicit diagnostic criteria for threshold disorders according to DSM-IV and for algorithmically defined subthreshold syndromes. Complementary information was gathered on potential risk and protective factors for psychiatric disorders and migraine, and on the morbidity of first-degree family members, whereas the collection of DNA and plasma samples was part of the original somatic study (CoLaus). RESULTS: A total of 3,691 individuals completed the psychiatric evaluation (67% participation). The gender distribution of the sample did not differ significantly from that of the general population in the same age range. Although the youngest 5-year band of the cohort was underrepresented and the oldest 5-year band overrepresented, participants of PsyCoLaus and individuals who refused to participate had comparable scores on the General Health Questionnaire, a self-rating instrument completed at the somatic exam. CONCLUSIONS: Despite limitations resulting from the relatively low participation in the context of a comprehensive and time-consuming investigation, the PsyCoLaus study should significantly contribute to the current understanding of psychiatric disorders and comorbid somatic conditions by: 1) establishing the clinical relevance of specific psychiatric syndromes below the DSM-IV threshold; 2) determining comorbidity between risk factors for CVD and psychiatric disorders; 3) assessing genetic variants associated with common psychiatric disorders; and 4) identifying DNA markers shared between CVD and psychiatric disorders.

Relevance: 20.00%

Abstract:

BACKGROUND: The proportion of surgery performed as a day case varies greatly between countries, and low rates suggest a large growth potential in many countries. Measuring the development potential of one-day surgery should be grounded in a comprehensive list of eligible procedures based on a priori criteria, independent of local practices. We propose an algorithmic method, using only routinely available hospital data, to identify surgical hospitalizations that could have been performed as one-day treatments. METHODS: Moving an inpatient surgery to one-day surgery was considered feasible if at least one surgical intervention was eligible for one-day surgery and if none of the following criteria were present: an intervention or condition requiring an inpatient stay, patient transferred or died, or length of stay greater than four days. The eligibility of a procedure to be treated as a day case was mainly established on three a priori criteria: the surgical access (endoscopic or not), the invasiveness of the procedure and the size of the operated organ. A few overrides of these criteria occurred when procedures were associated with a risk of immediate complications, slow physiological recovery or pain treatment requiring hospital infrastructure. The algorithm was applied to a random sample of one million US inpatient stays and more than 600 thousand Swiss inpatient stays from the year 2002. RESULTS: The validity of our method was demonstrated by the few discrepancies between the list of eligible procedures based on the a priori criteria and a state list used for reimbursement purposes, by the low proportion of hospitalizations eligible for one-day care found in the US sample (4.9% versus 19.4% in the Swiss sample), and by the distribution of the elective procedures found eligible in Swiss hospitals, which is well supported by the literature. There were large variations in the proportion of candidates for one-day surgery among elective surgical hospitalizations between Swiss hospitals (3% to 45.3%). CONCLUSION: The proposed approach allows the proportion of inpatient stays that are candidates for one-day surgery to be monitored. It could be used for infrastructure planning, resource negotiation and the surveillance of appropriate resource utilization.
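A minimal sketch of the screening logic described in the abstract, in Python; the field names, procedure codes and exclusion codes below are illustrative placeholders, not the paper's actual lists:

# Hedged sketch of the one-day surgery eligibility screen described above.
# Procedure and diagnosis codes are invented for illustration; the real lists
# are built from the a priori criteria (surgical access, invasiveness, organ size).

ELIGIBLE_PROCEDURES = {"12345", "23456"}   # placeholder a priori eligible codes
EXCLUDING_CONDITIONS = {"I21", "J96"}      # placeholder conditions requiring an inpatient stay

def candidate_for_one_day_surgery(stay):
    """Return True if an inpatient stay could have been treated as a day case."""
    has_eligible_procedure = any(p in ELIGIBLE_PROCEDURES for p in stay["procedures"])
    requires_inpatient_care = any(d in EXCLUDING_CONDITIONS for d in stay["diagnoses"])
    return (has_eligible_procedure
            and not requires_inpatient_care
            and not stay["transferred"]
            and not stay["died"]
            and stay["length_of_stay_days"] <= 4)

stay = {"procedures": ["12345"], "diagnoses": [], "transferred": False,
        "died": False, "length_of_stay_days": 2}
print(candidate_for_one_day_surgery(stay))  # True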

Relevance: 20.00%

Abstract:

We study the impact of sampling theorems on the fidelity of sparse image reconstruction on the sphere. We discuss how a reduction in the number of samples required to represent all information content of a band-limited signal acts to improve the fidelity of sparse image reconstruction, through both the dimensionality and sparsity of signals. To demonstrate this result, we consider a simple inpainting problem on the sphere and consider images sparse in the magnitude of their gradient. We develop a framework for total variation inpainting on the sphere, including fast methods to render the inpainting problem computationally feasible at high resolution. Recently a new sampling theorem on the sphere was developed, reducing the required number of samples by a factor of two for equiangular sampling schemes. Through numerical simulations, we verify the enhanced fidelity of sparse image reconstruction due to the more efficient sampling of the sphere provided by the new sampling theorem.
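A generic way to write the kind of inpainting problem referred to above (the exact operators, domain and fidelity constraint used in the paper may differ) is as a total variation minimization constrained by the observed samples:

\min_{x} \; \| x \|_{\mathrm{TV}} \quad \text{subject to} \quad \| y - \Phi x \|_{2} \leq \epsilon ,

where x is the band-limited signal on the sphere, \Phi the masking operator selecting the observed samples, y the measurements and \epsilon a bound on the noise level; \| x \|_{\mathrm{TV}} is the sparsity-promoting magnitude-of-gradient prior mentioned in the abstract. Reducing the number of samples needed to capture the full information content of the signal lowers the dimensionality of x, which is the mechanism through which the more efficient sampling theorem improves reconstruction fidelity.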

Relevance: 20.00%

Abstract:

Nicotine in smoky indoor air can be determined using graphitized carbon black as a solid sorbent in quartz tubes. The temperature stability, high purity, and heat absorption characteristics of the sorbent, as well as the permeability of the quartz tubes to microwaves, enable thermal desorption by means of microwaves after active sampling. Permeation and dynamic dilution procedures for the generation of nicotine in the vapor phase at low and high concentrations are used to evaluate the performance of the sampler. Tube preparation is described and the microwave desorption temperature is measured. The breakthrough volume is determined to allow sampling at 0.1-1 L/min for defined periods of time. The procedure is tested for the determination of gas- and particulate-phase nicotine in sidestream smoke produced in an experimental chamber.
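For active sampling of this kind, the airborne concentration follows from the mass recovered by thermal desorption and the volume of air drawn through the tube. A minimal sketch of that back-calculation, with illustrative numbers (the paper's calibration and recovery corrections are not included):

# Hedged sketch: back-calculating an air concentration from an active sample.
# Values are invented for illustration only.

def air_concentration_ug_per_m3(mass_desorbed_ug, flow_l_per_min, duration_min):
    """Concentration = recovered mass / sampled air volume."""
    sampled_volume_m3 = flow_l_per_min * duration_min / 1000.0  # L -> m3
    return mass_desorbed_ug / sampled_volume_m3

# e.g. 0.5 L/min for 60 min with 1.2 micrograms of nicotine recovered
print(air_concentration_ug_per_m3(1.2, 0.5, 60))  # 40.0 ug/m3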

Relevance: 20.00%

Abstract:

In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid-organ recipients at the national level. Its features include a flexible patient-case system that allows all transplant scenarios to be captured, and the collection of patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and biobanking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations, with a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. By the end of 2011 we had observed 4,385 infection episodes in our patient population. The STCS has demonstrated the operational capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. The STCS represents a promising novel project for comparative effectiveness research in transplantation medicine.

Relevance: 20.00%

Abstract:

Water is often considered an ordinary substance since it is transparent, odourless, tasteless and very common in nature. In fact, it can be argued that it is the most remarkable of all substances. Without water, life on Earth would not exist. Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms to live in, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with its unique physical and chemical properties. Among these, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in life science, water is systematically removed from biological specimens investigated by electron microscopy, because the high vacuum of the electron microscope requires that the biological specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques that are now in routine use. Typically, these techniques consist in fixing the sample (chemically or by freezing) and replacing its water with a plastic that is transformed into a rigid block by polymerisation. The block is then cut into thin sections (about 50 nm) with an ultramicrotome at room temperature. These techniques usually introduce several artefacts, most of them due to water removal. To avoid these artefacts, the specimen can be frozen, cut and observed at low temperature. However, liquid water crystallizes into ice upon freezing, causing severe damage. Ideally, liquid water is solidified into a vitreous state. Vitrification consists in solidifying water so rapidly that ice crystals have no time to form. A breakthrough took place when the vitrification of pure water was discovered experimentally. Since this discovery, the thin-film vitrification method has been used with success for the observation of biological suspensions of small particles. Our work was to extend the method to bulk biological samples, which have to be vitrified, cryosectioned into vitreous sections and observed in a cryo-electron microscope. This technique, called cryo-electron microscopy of vitreous sections (CEMOVIS), is now believed to be the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. CEMOVIS has recently become a practical method achieving excellent results. It has, however, some severe limitations, the most important of them being due to cutting artefacts. These artefacts are the consequence of the nature of the vitreous material and of the fact that vitreous sections cannot be floated on a liquid, as is the case for plastic sections cut at room temperature. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, and thus to find optimal conditions to minimise or prevent these artefacts. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.

Relevance: 20.00%

Abstract:

Firms operating in a changing environment need structures and practices that provide flexibility and enable rapid response to change. Given the challenges they face in keeping up with market needs, they have to continuously improve their processes and products and develop new products to match market requirements. Success in changing markets depends on the firm's ability to convert knowledge into innovations, and consequently their internal structures and capabilities play an important role in innovation activities. According to the dynamic capability view of the firm, firms thus need dynamic capabilities in the form of assets, processes and structures that enable strategic flexibility and support entrepreneurial opportunity sensing and exploitation. Dynamic capabilities are also needed in conditions of rapid change in the operating environment, and in activities such as new product development and expansion to new markets. Despite the growing interest in these issues and the theoretical developments in the field of strategy research, there are still very few empirical studies, and large-scale empirical studies in particular, that provide evidence that firms' dynamic capabilities are reflected in performance differences. This thesis represents an attempt to advance the research by providing empirical evidence of the linkages between a firm's dynamic capabilities and its performance in internationalization and innovation activities. The aim is thus to increase knowledge and enhance understanding of the organizational factors that explain interfirm performance differences. The study is in two parts. The first part is the introduction and the second part comprises five research publications covering the theoretical foundations of the dynamic capability view and subsequent empirical analyses. Quantitative research methodology is used throughout. The thesis contributes to the literature in several ways. While much prior research on dynamic capabilities is conceptual in nature or conducted through case studies, this thesis introduces empirical measures for assessing the different aspects of dynamic capabilities and uses large-scale sampling to investigate the relationships between them and performance indicators. The dynamic capability view is further developed by integrating theoretical frameworks and research traditions from several disciplines. The results of the study provide support for the basic tenets of the dynamic capability view. The empirical findings demonstrate that a firm's ability to renew its knowledge base and other intangible assets, its proactive, entrepreneurial behavior, and the structures and practices that support operational flexibility are positively related to performance indicators.

Relevance: 20.00%

Abstract:

There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries, at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in research offers a challenging target for a methodological endeavor. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches, neoclassical and evolutionary, to be analyzed. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. The methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria. The aim is to examine how different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters of the study utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantages through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.

Relevance: 20.00%

Abstract:

Coraebus undatus is the main insect pest of cork oak worldwide. The larvae tunnel in the cortical cambium, filling the bark with galleries and causing the cork to break at harvest. The first objective of this study was to test the effect of purple traps on the attraction of C. undatus, because this colour is attractive to other buprestid beetles. The second objective was to develop a diet in which field-collected larvae could be reared to adulthood. Pairs of purple and clear (control) sticky traps were placed in a cork oak forest in Girona, Spain, in the summer of 2008.

Relevance: 20.00%

Abstract:

The objective of this work was to combine the advantages of the dried blood spot (DBS) sampling process with highly sensitive and selective negative-ion chemical ionization tandem mass spectrometry (NICI-MS-MS) to analyze recent antidepressants, including fluoxetine, norfluoxetine, reboxetine, and paroxetine, from micro whole-blood samples (i.e., 10 microL). Before analysis, DBS samples were punched out, and the antidepressants were simultaneously extracted and derivatized in a single step by use of pentafluoropropionic acid anhydride and 0.02% triethylamine in butyl chloride for 30 min at 60 °C under ultrasonication. The derivatives were then separated on a gas chromatograph coupled with a triple-quadrupole mass spectrometer operating in negative selected reaction monitoring mode, with a total run time of 5 min. To establish the validity of the method, trueness, precision, and selectivity were determined on the basis of the guidelines of the "Société Française des Sciences et des Techniques Pharmaceutiques" (SFSTP). The assay was found to be linear in the concentration ranges 1 to 500 ng/mL for fluoxetine and norfluoxetine and 20 to 500 ng/mL for reboxetine and paroxetine. Despite the small sampling volume, the limit of detection was estimated at 20 pg/mL for all analytes. The stability of the DBS was also evaluated at -20 °C, 4 °C, 25 °C, and 40 °C for up to 30 days. Furthermore, the method was successfully applied to a pharmacokinetic investigation performed on a healthy volunteer after oral administration of a single 40-mg dose of fluoxetine. This validated DBS method thus combines a single extraction-derivatization step with a fast and sensitive GC-NICI-MS-MS technique. Using microliter blood samples, the procedure offers a patient-friendly tool in many biomedical fields, such as checking treatment adherence, therapeutic drug monitoring, toxicological analyses, and pharmacokinetic studies.
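Quantitation over the stated linear ranges rests on a least-squares calibration of detector response against spiked concentrations. A minimal sketch with invented response values (the actual calibrators and any internal-standard correction follow the SFSTP validation protocol used in the paper):

# Hedged sketch of a linear calibration curve and back-calculation of an unknown.
# Concentrations in ng/mL; the response values are invented for illustration.
import numpy as np

cal_conc = np.array([1, 5, 20, 100, 250, 500], dtype=float)   # fluoxetine range
cal_resp = np.array([0.9, 5.2, 19.5, 101.0, 248.0, 503.0])    # instrument response

slope, intercept = np.polyfit(cal_conc, cal_resp, 1)          # least-squares fit

def back_calculate(response):
    """Convert a measured response into a concentration (ng/mL)."""
    return (response - intercept) / slope

print(round(back_calculate(42.0), 1))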

Relevance: 20.00%

Abstract:

BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of a computer reminded blood sampling versus a risk guided on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6 months periods in 2006 and 2009 to the ICU of a University hospital. INCLUSION CRITERIA: to receive intravenous micronutrient supplements and/or to have a TE sampling during ICU stay. The TE samplings were triggered by computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods 636 patients met the inclusion criteria out of 2406 consecutive admissions, representing 29.7% and 24.9% respectively of the periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in à 55% cost reduction. CONCLUSIONS: The monitoring of TE concentrations guided by a nutritionist resulted in a reduction of the sampling frequency, and targeting on the sickest high risk patients, requiring a nutritional prescription adaptation. This control leads to cost reduction compared to an automated sampling prescription.

Relevance: 20.00%

Abstract:

Geophysical tomography captures the spatial distribution of the underlying geophysical property at a relatively high resolution, but the tomographic images tend to be blurred representations of reality and generally fail to reproduce sharp interfaces. Such models may cause significant bias when taken as a basis for predictive flow and transport modeling and are unsuitable for uncertainty assessment. We present a methodology in which tomograms are used to condition multiple-point statistics (MPS) simulations. A large set of geologically reasonable facies realizations and their corresponding synthetically calculated cross-hole radar tomograms are used as a training image. The training image is scanned with a direct sampling algorithm for patterns in the conditioning tomogram, while accounting for the spatially varying resolution of the tomograms. In a post-processing step, only those conditional simulations that predicted the radar traveltimes within the expected data error levels are accepted. The methodology is demonstrated on a two-facies example featuring channels and an aquifer analog of alluvial sedimentary structures with five facies. For both cases, MPS simulations exhibit the sharp interfaces and the geological patterns found in the training image. Compared to unconditioned MPS simulations, the uncertainty in transport predictions is markedly decreased for simulations conditioned to tomograms. As an improvement to other approaches relying on classical smoothness-constrained geophysical tomography, the proposed method allows for: (1) reproduction of sharp interfaces, (2) incorporation of realistic geological constraints and (3) generation of multiple realizations that enables uncertainty assessment.
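The post-processing step described above amounts to an accept/reject filter on the conditional realizations. A minimal sketch, assuming a placeholder forward solver predict_traveltimes and a single data-error level sigma (the paper accounts for the expected error level of the radar traveltime data):

# Hedged sketch of the post-processing acceptance step: keep only those MPS
# realizations whose simulated radar traveltimes fit the observations within
# the expected data errors. predict_traveltimes() is a placeholder forward model.
import numpy as np

def accept_realization(facies_model, observed_times, sigma, predict_traveltimes):
    """Accept a realization if its traveltime misfit is consistent with the noise level."""
    simulated = predict_traveltimes(facies_model)
    rms_misfit = np.sqrt(np.mean((simulated - observed_times) ** 2))
    return rms_misfit <= sigma  # e.g. one standard deviation of the data error

def filter_realizations(realizations, observed_times, sigma, predict_traveltimes):
    """Return the subset of conditional simulations consistent with the tomographic data."""
    return [m for m in realizations
            if accept_realization(m, observed_times, sigma, predict_traveltimes)]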

Relevance: 20.00%

Abstract:

Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
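A minimal sketch of the kind of sampler involved: a random-walk Metropolis algorithm with a Gaussian (l2) data likelihood and a first-difference smoothness penalty standing in for the model structure constraint. The forward model is a placeholder, all tuning values are illustrative, and the paper's treatment is richer (choice of norms, joint EM-ERT data, and hierarchical sampling of the data-error and regularization-weight parameters):

# Hedged sketch of a random-walk Metropolis sampler with a data likelihood and
# a model-structure (smoothness) constraint. forward() is a placeholder for the
# plane-wave EM forward model; step size and iteration count are illustrative.
import numpy as np

def log_posterior(m, d_obs, forward, sigma, lam):
    residual = d_obs - forward(m)
    log_like = -0.5 * np.sum((residual / sigma) ** 2)   # l2 data norm
    roughness = np.sum(np.diff(m) ** 2)                 # first-difference penalty
    return log_like - lam * roughness

def metropolis(m0, d_obs, forward, sigma, lam, step=0.05, n_iter=10000, seed=0):
    rng = np.random.default_rng(seed)
    m, logp = m0.copy(), log_posterior(m0, d_obs, forward, sigma, lam)
    chain = []
    for _ in range(n_iter):
        proposal = m + step * rng.standard_normal(m.size)
        logp_new = log_posterior(proposal, d_obs, forward, sigma, lam)
        if np.log(rng.random()) < logp_new - logp:      # Metropolis acceptance rule
            m, logp = proposal, logp_new
        chain.append(m.copy())
    return np.array(chain)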