992 results for multi-frequency
Abstract:
Introduction: We launched an investigator-initiated study (ISRCTN31181395) to evaluate the potential benefit of pharmacokinetic-guided dosage individualization of imatinib for leukaemia patients followed in the public and private sectors. Following approval by the research ethics committee (REC) of the coordinating centre, recruitment throughout Switzerland necessitated submitting the protocol to 11 cantonal RECs. Materials and Methods: We analysed the requirements and evaluation procedures of the 12 RECs, with the associated costs. Results: 1-18 copies of the dossier, in total 4300 printed pages, were required (printing/posting costs: ~300 CHF) to meet initial requirements. Meeting frequencies of the RECs ranged between 2 weeks and 2 months, and the time from submission to first feedback took 2-75 days. Study approval was obtained from a chairman, a sub-committee or the full committee, the evaluation work being invoiced at 0-1000 CHF (median: 750 CHF, total: 9200 CHF). While 5 RECs gave immediate approval, the other 6 raised in total 38 queries before study release, mainly related to wording in the patient information, leading to 7 different final versions approved. Submission tasks employed an investigator half-time over about 6 months. Conclusion: While the necessity of clinical research evaluation by independent RECs is undisputed, there is a need for further harmonization and cooperation in evaluation procedures. The current administrative burden is complex, time-consuming and costly. A harmonized electronic application form, preferably compatible with other regulatory bodies and European countries, could increase transparency, improve communication, and encourage academic multi-centre clinical research in Switzerland.
Abstract:
Background: Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings: We built two prediction rules ("Snap-shot rule" for a single sample and "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18 061 CD4 counts as either justifiable or superfluous, according to their prior >= 5% or < 5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200x10(6)/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness for CD4 counts approaching the treatment threshold. Conclusions/Significance: Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count > 650 for a threshold of 200, > 900 for 350, or > 1150 for 500x10(6)/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
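The Snap-shot rule above reduces to a simple threshold lookup. The sketch below is a hypothetical illustration using only the cut-offs quoted in the abstract (650, 900, and 1150x10(6)/L for treatment thresholds of 200, 350, and 500); the function name and interface are assumptions, not part of the study.

```python
# Hypothetical sketch of the "Snap-shot rule": given a single CD4 count,
# decide whether the next measurement can safely wait one year, i.e.
# whether the chance of crossing the treatment threshold stays below 5%.
# Cut-offs are taken from the abstract (units: x10^6 cells/L).
SNAPSHOT_CUTOFFS = {200: 650, 350: 900, 500: 1150}  # threshold -> safe level

def next_cd4_in_one_year(cd4_count, treatment_threshold):
    """Return True if, per the Snap-shot rule, the next CD4 measurement
    can wait one year for the given treatment threshold."""
    return cd4_count > SNAPSHOT_CUTOFFS[treatment_threshold]

# A count of 700 is safely above the 650 cut-off for a treatment
# threshold of 200, but not above the 900 cut-off for 350.
print(next_cd4_in_one_year(700, 200))  # True
print(next_cd4_in_one_year(700, 350))  # False
```

When a count falls below the relevant cut-off, the abstract recommends increasing the monitoring frequency rather than waiting a full year.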
Abstract:
The human brain displays heterogeneous organization in both structure and function. Here we develop a method to characterize brain regions and networks in terms of information-theoretic measures. We look at how these measures scale when larger spatial regions as well as larger connectome sub-networks are considered. This framework is applied to human brain fMRI recordings of resting-state activity and DSI-inferred structural connectivity. We find that strong functional coupling across large spatial distances distinguishes functional hubs from unimodal low-level areas, and that this long-range functional coupling correlates with structural long-range efficiency on the connectome. We also find a set of connectome regions that are both internally integrated and coupled to the rest of the brain, and which resemble previously reported resting-state networks. Finally, we argue that information-theoretic measures are useful for characterizing the functional organization of the brain at multiple scales.
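The abstract does not specify its information-theoretic measures; as an illustrative stand-in, the sketch below computes the mutual information between two regions' time series under a Gaussian assumption, MI = -(1/2) ln(1 - r^2), where r is the Pearson correlation. Long-range functional coupling of the kind described could then be summarized by averaging such pairwise values over region pairs beyond a distance cut-off. All names here are hypothetical.

```python
# Pairwise mutual information between two signals under a Gaussian
# assumption: MI = -0.5 * ln(1 - r^2), with r the Pearson correlation.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def gaussian_mutual_information(x, y):
    """Mutual information (in nats) between two jointly Gaussian signals."""
    r = pearson_r(x, y)
    return -0.5 * math.log(1.0 - r ** 2)

# Orthogonal signals carry zero mutual information under this estimator.
x = [0.0, 1.0, 0.0, -1.0] * 5
y = [1.0, 0.0, -1.0, 0.0] * 5
print(round(gaussian_mutual_information(x, y), 6))  # 0.0
```

This Gaussian estimator is only a convenient proxy; nonparametric estimators would be needed for strongly non-Gaussian BOLD signals.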
Abstract:
Rockfall hazard zoning is usually achieved using a qualitative estimate of hazard, not an absolute scale. In Switzerland, danger maps, which correspond to a hazard zoning that depends on the intensity of the considered phenomenon (e.g. kinetic energy for rockfalls), are replacing hazard maps. Basically, the danger grows with the mean frequency and with the intensity of the rockfall. This principle, based on intensity thresholds, may also be applied to intensity threshold values other than those used in the Swiss rockfall hazard zoning method, i.e. danger mapping. In this paper, we explore the effect of slope geometry and rockfall frequency on rockfall hazard zoning. First, the transition from 2D zoning to 3D zoning based on rockfall trajectory simulation is examined; then, its dependency on slope geometry is emphasized. The spatial extent of hazard zones is examined, showing that their limits may vary widely depending on the rockfall frequency. This approach is especially dedicated to highly populated regions, where the hazard zoning has to be very fine in order to delineate the greatest possible territory containing acceptable risks.
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviates from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require the computation of sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.
Abstract:
In moment structure analysis with nonnormal data, asymptotically valid inferences require the computation of a consistent (under general distributional assumptions) estimate of the matrix $\Gamma$ of asymptotic variances of sample second-order moments. Such a consistent estimate involves the fourth-order sample moments of the data. In practice, the use of fourth-order moments leads to computational burden and lack of robustness in small samples. In this paper we show that, under certain assumptions, correct asymptotic inferences can be attained when $\Gamma$ is replaced by a matrix $\Omega$ that involves only the second-order moments of the data. The present paper extends to the context of multi-sample analysis of second-order moment structures results derived in the context of (single-sample) covariance structure analysis (Satorra and Bentler, 1990). The results apply to a variety of estimation methods and a general type of statistics. An example involving a test of equality of means under covariance restrictions illustrates theoretical aspects of the paper.
Abstract:
Working Paper no longer available. Please contact the author.
Abstract:
We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multisample analysis of moment structures, under general conditions on the distribution of observable variables. Computational issues, as well as the relation of the scaled and corrected statistics to the asymptotically robust ones, are discussed. A Monte Carlo study illustrates the comparative performance in finite samples of corrected score test statistics.
Abstract:
The HACEK organisms (Haemophilus species, Aggregatibacter species, Cardiobacterium hominis, Eikenella corrodens, and Kingella species) are rare causes of infective endocarditis (IE). The objective of this study is to describe the clinical characteristics and outcomes of patients with HACEK endocarditis (HE) in a large multi-national cohort. Patients hospitalized with definite or possible infective endocarditis by the International Collaboration on Endocarditis Prospective Cohort Study in 64 hospitals from 28 countries were included, and the characteristics of HE patients were compared with IE due to other pathogens. Of 5591 patients enrolled, 77 (1.4%) had HE. HE was associated with a younger age (47 vs. 61 years; p<0.001), a higher prevalence of immunologic/vascular manifestations (32% vs. 20%; p<0.008) and stroke (25% vs. 17%; p = 0.05) but a lower prevalence of congestive heart failure (15% vs. 30%; p = 0.004), death in-hospital (4% vs. 18%; p = 0.001) or after 1 year follow-up (6% vs. 20%; p = 0.01) than IE due to other pathogens (n = 5514). On multivariable analysis, stroke was associated with mitral valve vegetations (OR 3.60; CI 1.34-9.65; p<0.01) and younger age (OR 0.62; CI 0.49-0.90; p<0.01). The overall outcome of HE was excellent, with the in-hospital mortality (4%) significantly better than for non-HE (18%; p<0.001). Prosthetic valve endocarditis was more common in HE (35%) than non-HE (24%). The outcome of prosthetic valve and native valve HE was excellent whether treated medically or with surgery. Current treatment is very successful for the management of both native valve and prosthetic valve HE, but further studies are needed to determine why HE has a predilection for younger people and a tendency to cause stroke. The small number of patients and observational design limit inferences on treatment strategies. Self-selection of study sites limits epidemiological inferences.
Abstract:
BACKGROUND AND PURPOSE: Multi-phase postmortem CT angiography (MPMCTA) is increasingly being recognized as a valuable adjunct medicolegal tool to explore the vascular system. Adequate interpretation, however, requires knowledge about the most common technique-related artefacts. The purpose of this study was to identify and index the possible artefacts related to MPMCTA. MATERIAL AND METHODS: An experienced radiologist blinded to all clinical and forensic data retrospectively reviewed 49 MPMCTAs. Each angiographic phase, i.e. arterial, venous and dynamic, was analysed separately to identify phase-specific artefacts based on location and aspect. RESULTS: Incomplete contrast filling of the cerebral venous system was the most commonly encountered artefact, followed by contrast agent layering in the lumen of the thoracic aorta. Enhancement or so-called oedematization of the digestive system mucosa was also frequently observed. CONCLUSION: All MPMCTA artefacts observed and described here are reproducible and easily identifiable. Knowledge about these artefacts is important to avoid misinterpreting them as pathological findings.
Abstract:
A total of 357 house mice (Mus domesticus) from 83 localities uniformly distributed throughout Switzerland were screened for the presence of a homogeneously staining region (HSR) on chromosome 1. Altogether, 47 mice from 11 localities were HSR/+ or HSR/HSR. In one sample, all 11 individuals had an HSR/HSR karyotype. Almost all mice with the variant were collected from the Rhone valley (HSR frequency: 61%) and Val Bregaglia (HSR frequency: 81%); in samples from most of the rest of Switzerland, the HSR was absent. There was no strong association between the geographic distribution of the HSR and the areas of occurrence of metacentrics. However, at Chiggiogna the HSR was found on Rb(1.3). Possible explanations for the HSR polymorphism are discussed.
Abstract:
PURPOSE: To report the feasibility and potential benefits of high-frequency jet ventilation (HFJV) in tumor ablation techniques for liver, kidney, and lung lesions. METHODS: This prospective study included 51 patients (14 women, mean age 66 years) bearing 66 tumors (56 hepatic, 5 pulmonary, 5 renal) with a median size of 16 ± 8.7 mm, referred for tumor ablation in an intention-to-treat fashion before the preoperative anesthesiology visit. Cancellations and complications of HFJV were prospectively recorded, as were anesthesia and procedure duration and mean CO2 capnea. When computed tomography guidance was used, the 3D spatial coordinates of an anatomical target <2 mm in diameter were registered on 8 slabs of 4 slices of 3.75-mm slice thickness. RESULTS: HFJV was used in 41 of 51 patients. Of the ten patients who were not candidates for HFJV, two had a contraindication to HFJV (severe COPD), three had lesions invisible under HFJV requiring deep-inspiration apnea for tumor targeting, and five could not have HFJV because a trained anesthetic team was unavailable. No specific complication or hypercapnia related to HFJV was observed despite a mean anesthetic duration of 2 h and ventilation performed in procubitus (n = 4) or lateral decubitus (n = 6). Measured internal target movement was 0.3 mm in the x- and y-axes and below the slice thickness of 3.75 mm in the z-axis in 11 patients. CONCLUSIONS: HFJV is feasible in 80% of patients, allowing for near immobility of internal organs during liver, kidney, and lung tumor ablation.
Abstract:
Several studies have reported high performance of simple decision heuristics in multi-attribute decision making. In this paper, we focus on situations where attributes are binary and analyze the performance of Deterministic-Elimination-By-Aspects (DEBA) and similar decision heuristics. We consider non-increasing weights and two probabilistic models for the attribute values: one where attribute values are independent Bernoulli random variables, and one where they are binary random variables with inter-attribute positive correlations. Using these models, we show that the good performance of DEBA is explained by the presence of cumulative as opposed to simple dominance. We therefore introduce the concepts of cumulative dominance compliance and fully cumulative dominance compliance and show that DEBA satisfies these properties. We derive a lower bound on the probability with which cumulative dominance compliant heuristics will choose a best alternative and show that, even with many attributes, this bound is not small. We also derive an upper bound for the expected loss of fully cumulative dominance compliant heuristics and show that this is moderate even when the number of attributes is large. Both bounds are independent of the values of the weights.
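The elimination-by-aspects idea above can be sketched in a few lines for the binary-attribute case: attributes are examined in order of non-increasing weight, and alternatives scoring 0 on the current attribute are eliminated unless that would empty the choice set. The function name and the exact tie-handling convention here are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch of a DEBA-style heuristic for binary attributes.
def deba(alternatives):
    """alternatives: list of tuples of 0/1 attribute values, with columns
    already ordered so that attribute 0 carries the largest weight.
    Returns the indices of the surviving alternatives."""
    survivors = list(range(len(alternatives)))
    n_attrs = len(alternatives[0])
    for a in range(n_attrs):
        passing = [i for i in survivors if alternatives[i][a] == 1]
        if passing:                 # eliminate only if someone survives
            survivors = passing
        if len(survivors) == 1:     # a unique alternative remains
            break
    return survivors

# The second and third alternatives pass the highest-weight attribute;
# the third then wins on the second attribute.
print(deba([(0, 1, 1), (1, 0, 1), (1, 1, 0)]))  # [2]
```

Note that the heuristic never revisits an attribute, which is why its performance hinges on cumulative rather than simple dominance among the alternatives.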
Abstract:
This paper describes a methodology to estimate the coefficients, to test specification hypotheses and to conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained with the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.