Abstract:
Modeling the concentration-response function has become extremely popular in ecotoxicology during the last decade. Indeed, modeling allows determining the full response pattern of a given substance. However, reliable modeling is data-consuming, which conflicts with the current trend in ecotoxicology of reducing, for cost and ethical reasons, the amount of data produced during an experiment. It is therefore crucial to design experiments in a cost-effective manner. In this paper, we propose to use the theory of locally D-optimal designs to determine the set of concentrations to be tested so that the parameters of the concentration-response function can be estimated with high precision. We illustrate this approach by determining the locally D-optimal designs for estimating the toxicity of the herbicide dinoseb on daphnids and algae. The results show that the number of concentrations to be tested is often equal to the number of parameters and often related to their meaning, i.e. the concentrations are located close to the parameter values. Furthermore, the results show that the locally D-optimal design often has the minimal number of support points and is not very sensitive to small changes in the nominal values of the parameters. In order to reduce the experimental cost and the use of test organisms, especially in the case of long-term studies, reliable nominal values may therefore be fixed based on prior knowledge and literature research instead of on preliminary experiments.
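The D-optimality criterion described above can be sketched numerically: for a hypothetical three-parameter log-logistic concentration-response curve, a locally D-optimal design maximizes the determinant of the Fisher information matrix built from the parameter sensitivities at the chosen concentrations. The model, nominal values and starting concentrations below are illustrative assumptions, not the dinoseb data used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-parameter log-logistic curve f(x) = d / (1 + (x/e)**b);
# theta = (d, e, b) are assumed nominal values, not the paper's estimates.
def response(x, d, e, b):
    return d / (1.0 + (x / e) ** b)

def sensitivity(x, theta, h=1e-6):
    """Central-difference gradient of the response w.r.t. the parameters."""
    g = np.empty(len(theta))
    for k in range(len(theta)):
        tp, tm = np.array(theta, float), np.array(theta, float)
        tp[k] += h
        tm[k] -= h
        g[k] = (response(x, *tp) - response(x, *tm)) / (2 * h)
    return g

def neg_log_det_info(xs, theta):
    """-log det of the Fisher information for an equally weighted design."""
    if np.any(xs <= 0):            # concentrations must stay positive
        return np.inf
    M = sum(np.outer(g, g) for g in (sensitivity(x, theta) for x in xs)) / len(xs)
    sign, logdet = np.linalg.slogdet(M)
    return np.inf if sign <= 0 else -logdet

theta0 = (1.0, 0.5, 2.0)           # nominal (d, e, b), illustrative only
x0 = np.array([0.1, 0.5, 2.0])     # starting concentrations; as the abstract
res = minimize(neg_log_det_info, x0, args=(theta0,),  # notes, the optimal
               method="Nelder-Mead")                  # design often has as
print(np.sort(res.x))              # many support points as parameters
```

A usage note: with equal weights and as many support points as parameters, D-optimality reduces to placing the points where the sensitivity vectors are most linearly independent, which is why the optimal concentrations cluster near parameter-related features of the curve.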
Abstract:
In this paper we present a new method to track bone movements in stereoscopic X-ray image series of the knee joint. The method is based on two different X-ray image sets: a rotational series of acquisitions of the still subject knee that allows the tomographic reconstruction of the three-dimensional volume (model), and a stereoscopic image series of orthogonal projections as the subject performs movements. Tracking the movements of bones throughout the stereoscopic image series means determining, for each frame, the best pose of every moving element (bone) previously identified in the 3D reconstructed model. The quality of a pose is reflected in the similarity between its simulated projections and the actual radiographs. We use direct Fourier reconstruction to approximate the three-dimensional volume of the knee joint. Then, to avoid the expensive computation of digitally rendered radiographs (DRR) for pose recovery, we reformulate the tracking problem in the Fourier domain. Under the hypothesis of parallel X-ray beams, we use the central-slice-projection theorem to replace the heavy 2D-to-3D registration of projections in the signal domain by efficient slice-to-volume registration in the Fourier domain. Focusing on rotational movements, the translation-relevant phase information can be discarded and we only consider scalar Fourier amplitudes. The core of our motion tracking algorithm can be implemented as a classical frame-wise slice-to-volume registration task. Preliminary results on both synthetic and real images confirm the validity of our approach.
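The central-slice theorem that the method relies on can be verified numerically in a few lines: under parallel-beam geometry, the 1D Fourier transform of a projection equals a central slice of the object's Fourier transform. A minimal NumPy sketch on a synthetic 2D image (not the knee data):

```python
import numpy as np

# Numerical check of the central-slice (Fourier-slice) theorem in 2D.
rng = np.random.default_rng(0)
image = rng.random((64, 64))          # synthetic "object"

# Parallel projection along the vertical axis (sum over rows).
projection = image.sum(axis=0)

# 1D FFT of the projection ...
proj_ft = np.fft.fft(projection)

# ... equals the central slice (zero vertical frequency row)
# of the 2D FFT of the object.
central_slice = np.fft.fft2(image)[0, :]

print(np.allclose(proj_ft, central_slice))  # True
```

This identity is what lets the authors replace 2D-to-3D registration of rendered radiographs by slice-to-volume registration in the Fourier domain; the 3D case works the same way with central planes of the volume's 3D FFT.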
Abstract:
The slow-phase velocity of nystagmus is one of the most sensitive parameters of vestibular function and is currently the standard for evaluating the caloric test. However, the assessment of this parameter requires recording the response by using nystagmography. The aim of this study was to evaluate whether frequency and duration of the caloric nystagmus, as measured by using a clinical test with Frenzel glasses, could predict the result of the recorded test. The retrospective analysis of 222 caloric test results recorded by means of electronystagmography has shown a good association between the 3 parameters for unilateral weakness. The asymmetry observed in the velocity can be predicted by a combination of frequency and duration. On the other hand, no relationship was observed between the parameters for directional preponderance. These results indicate that a clinical caloric test with frequency and duration as parameters can be used to predict the unilateral weakness, which would be obtained by use of nystagmography. We propose an evaluation of the caloric test on the basis of diagrams combining the 3 response parameters.
Abstract:
The pace of ongoing climate change calls for reliable plant biodiversity scenarios. Traditional dynamic vegetation models use plant functional types that are summarized to such an extent that they become meaningless for biodiversity scenarios. Hybrid dynamic vegetation models of intermediate complexity (hybrid-DVMs) have recently been developed to address this issue. These models, at the crossroads between phenomenological and process-based models, are able to involve an intermediate number of well-chosen plant functional groups (PFGs). The challenge is to build meaningful PFGs that are representative of plant biodiversity and consistent with the parameters and processes of hybrid-DVMs. Here, we propose and test a framework based on a few selected traits to define a limited number of PFGs which are both representative of the (functional and taxonomic) diversity of the flora in the Ecrins National Park and adapted to hybrid-DVMs. This new classification scheme, together with recent advances in vegetation modeling, constitutes a step forward for mechanistic biodiversity modeling.
Abstract:
Debris flows and related landslide processes occur in many regions all over Norway and pose a significant hazard to inhabited areas. Within the framework of the development of a national debris-flow susceptibility map, we are working on a modeling approach suitable for Norway with nationwide coverage. The discrimination of source areas is based on an index approach, which includes topographic parameters and hydrological settings. For the runout modeling, we use the Flow-R model (IGAR, University of Lausanne), which is based on combined probabilistic and energetic algorithms for the assessment of the spreading of the flow and maximum runout distances. First results for different test areas have shown that runout distances can be modeled reliably. For the selection of source areas, however, additional factors have to be considered, such as the lithological and Quaternary geological setting, in order to accommodate the strong variation in debris-flow activity across the different geological, geomorphological and climatic regions of Norway.
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K-Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-to-top method-complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision making.
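The simplest of the exploratory estimators named above, K-Nearest Neighbors, can be sketched in a few lines: each unsampled location receives the average of its k nearest measured values. The coordinates and "radon-like" values below are synthetic and all names are hypothetical stand-ins for the Swiss dataset.

```python
import numpy as np

def knn_predict(xy_train, z_train, xy_query, k=5):
    """Average of the k nearest measured values at each query location."""
    z_hat = np.empty(len(xy_query))
    for i, q in enumerate(xy_query):
        d = np.linalg.norm(xy_train - q, axis=1)   # Euclidean distances
        z_hat[i] = z_train[np.argsort(d)[:k]].mean()
    return z_hat

rng = np.random.default_rng(42)
xy = rng.uniform(0, 10, size=(200, 2))                  # measurement locations
z = np.sin(xy[:, 0]) + 0.1 * rng.standard_normal(200)   # noisy synthetic field
grid = np.array([[2.0, 5.0], [7.5, 1.0]])               # query points
print(knn_predict(xy, z, grid, k=5))
```

The choice of k plays the role of the neighborhood parameter discussed in the text, and leave-one-out cross-validation over k is the natural way to tune it, which is also the principle behind the cross-validation filter (CVMF) mentioned above.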
Abstract:
Background. Molecular tests for breast cancer (BC) risk assessment have been reimbursed by health insurances in Switzerland since the beginning of 2015. The main current role of these tests is to help oncologists decide about the usefulness of adjuvant chemotherapy in patients with early-stage endocrine-sensitive and human epidermal growth factor receptor 2 (HER2)-negative BC. These gene expression signatures aim at predicting the risk of recurrence in this subgroup. One of them (OncotypeDx/OT) also predicts the distant metastases rate with or without the addition of cytotoxic chemotherapy to endocrine therapy. The clinical utility of these tests, in addition to existing so-called "clinico-pathological" prognostic and predictive criteria (e.g. stage, grade, biomarker status), is still debated. We report a single-center one-year experience of the use of one molecular test (OT) in clinical decision making. Methods. We extracted from the CHUV Breast Cancer Center database the total number of BC cases with estrogen-receptor-positive (ER+), HER2-negative early breast cancer (node-negative (pN0) disease or micrometastases in up to 3 lymph nodes) operated between September 2014 and August 2015. For the cases from this group in which a molecular test had been decided by the tumor board, we collected the clinicopathologic parameters, the initial tumor board decision, and the final adjuvant systemic therapy decision. Results. A molecular test (OT) was done in 12.2% of patients with ER+ HER2-negative early BC. The median age was 57.4 years and the median invasive tumor size was 1.7 cm. These patients were classified by ODX testing (Recurrence Score) into low-, intermediate-, and high-risk groups in 27.2%, 63.6% and 9% of cases, respectively. Treatment recommendations changed in 18.2% of cases, predominantly from chemotherapy plus endocrine therapy to endocrine treatment alone.
Of 8 patients originally recommended chemotherapy, 25% were recommended endocrine treatment alone after receiving the Recurrence Score result. Conclusions. Though reimbursed by health insurances since January 2015, molecular tests are used moderately in our institution, as per the decision of the multidisciplinary tumor board. They are mainly used to obtain complementary confirmation supporting a decision of no chemotherapy. The OncotypeDx Recurrence Score results were in the intermediate group in 66% of the 9 tested cases but contributed to avoiding chemotherapy in 2 patients during the last 12 months.
Abstract:
Since the inception of cardiopulmonary bypass (CPB), little progress has been made concerning the design of cardiotomy suction (CS). Because CS is a major source of hemolysis, we decided to test a novel device (Smartsuction [SS]), specifically aimed at minimizing hemolysis during CPB, in a clinical setting. Block randomization was carried out on a treated group (SS, n=28) and a control group (CTRL, n=26). Biochemical parameters were taken pre-, peri-, and post-CPB and were compared between the two groups using Student's t-test, with statistical significance when P<0.05. No significant differences in patient demographics were observed between the two groups. Lactate dehydrogenase (LDH) and plasma free hemoglobin (PFH) pre-CPB were comparable for the CTRL and SS groups. LDH peri-CPB was 275+/-100 U/L versus 207+/-83 U/L for the CTRL and SS groups, respectively (P<0.05). PFH was 486+/-204 mg/L versus 351+/-176 mg/L for the CTRL and SS groups, respectively (P<0.05). LDH post-CPB was 354+/-116 U/L versus 275+/-89 U/L for the CTRL and SS groups, respectively (P<0.05). PFH was 549+/-271 mg/L versus 460+/-254 mg/L for the CTRL and SS groups, respectively (P<0.05). Preoperative hematocrit (Hct), 43+/-5% (CTRL) versus 37+/-5% (SS), and hemoglobin (Hb), 141+/-16 g/L (CTRL) versus 122+/-17 g/L (SS), were significantly lower in the SS group. However, when normalized (N), the SS was capable of conserving Hct, Hb, and erythrocyte count perioperatively. Erythrocytes (N) were 59+/-5% (CTRL) versus 67+/-9% (SS); Hct (N) was 59+/-6% (CTRL) versus 68+/-9% (SS); and Hb (N) was 61+/-6% (CTRL) versus 70+/-10% (SS) (all P<0.05). This novel SS device yields significantly lower blood PFH and LDH values peri- and post-CPB compared with the CTRL blood obtained using a conventional CS system. The SS may be a valuable alternative to traditional CS techniques.
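The between-group comparison described above is an unpaired Student's t-test at alpha = 0.05. As a sketch, the snippet below runs such a test on simulated values drawn to resemble the reported peri-CPB LDH means and SDs; the numbers are not the actual patient data.

```python
import numpy as np
from scipy import stats

# Simulated hemolysis-marker values, loosely matching the reported
# peri-CPB LDH summary statistics (means and SDs are from the abstract;
# the individual values are invented).
rng = np.random.default_rng(7)
ldh_ctrl = rng.normal(275, 100, size=26)   # CTRL group, U/L
ldh_ss = rng.normal(207, 83, size=28)      # Smartsuction group, U/L

# Unpaired two-sample t-test, as used in the study.
t_stat, p_value = stats.ttest_ind(ldh_ctrl, ldh_ss)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Note that `scipy.stats.ttest_ind` assumes equal variances by default; with unequal group SDs like these, Welch's variant (`equal_var=False`) is often preferred, though the abstract does not say which form the authors used.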
Abstract:
OBJECTIVE: The study tests the hypothesis that intramodal visual binding is disturbed in schizophrenia and should be detectable in all illness stages as a stable trait marker. METHOD: Three groups of patients (rehospitalized chronic schizophrenic patients, first-admitted schizophrenic patients, and schizotypal patients believed to be suffering from a pre-schizophrenic prodrome) and a group of normal control subjects were tested on three tasks targeting visual 'binding' abilities (the Müller-Lyer illusion and two figure detection tasks), in addition to control parameters such as reaction time, visual selective attention and Raven's test, and two conventional cortical tasks: spatial working memory (SWM) and a global-local test. RESULTS: Chronic patients showed decreased performance on the binding tests. Unexpectedly, the prodromal group exhibited enhanced Gestalt extraction on these tests compared both to schizophrenic patients and to healthy subjects. Furthermore, chronic schizophrenia was associated with poor performance on the cortical tests of SWM, global-local processing and Raven's test. This association appears to be mediated by or linked to the chronicity of the illness. CONCLUSION: The study confirms a variety of neurocognitive deficits in schizophrenia which, however, in this sample seem to be linked to chronicity of illness. Nonetheless, certain aspects of visual processing concerned with Gestalt extraction deserve attention as potential vulnerability or prodrome indicators. The initial hypothesis of the study is rejected.
Abstract:
Time-lapse crosshole ground-penetrating radar (GPR) data, collected while infiltration occurs, can provide valuable information regarding the hydraulic properties of the unsaturated zone. In particular, the stochastic inversion of such data provides estimates of parameter uncertainties, which are necessary for hydrological prediction and decision making. Here, we investigate the effect of different infiltration conditions on the stochastic inversion of time-lapse, zero-offset-profile GPR data. Inversions are performed using a Bayesian Markov-chain-Monte-Carlo methodology. Our results clearly indicate that considering data collected during a forced infiltration test helps to better refine soil hydraulic properties compared to data collected under natural infiltration conditions.
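The Markov-chain-Monte-Carlo machinery behind such a stochastic inversion can be sketched with a toy one-parameter Metropolis sampler; the Gaussian forward model and all constants below are illustrative stand-ins, not the GPR forward problem.

```python
import numpy as np

# Toy Bayesian MCMC inversion: recover one parameter from noisy data
# with a random-walk Metropolis sampler.
rng = np.random.default_rng(1)
true_theta = 2.0
data = true_theta + 0.5 * rng.standard_normal(50)   # synthetic observations

def log_post(theta):
    # Gaussian likelihood (sigma = 0.5) with a flat prior on [0, 10]
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2) / 0.5**2

theta, lp = 5.0, log_post(5.0)      # deliberately poor starting value
chain = []
for _ in range(5000):
    prop = theta + 0.3 * rng.standard_normal()      # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:         # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

posterior = np.array(chain[1000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The posterior standard deviation printed at the end is exactly the kind of parameter-uncertainty estimate the abstract refers to: a sharper (forced-infiltration) dataset yields a narrower posterior than a less informative (natural-infiltration) one.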
Abstract:
A recent study of 69 Japanese liver transplant recipients treated with tacrolimus found that the MDR1 3435C>T polymorphism, but not the MDR1 2677G>T polymorphism, was associated with differences in the intestinal expression level of CYP3A4 mRNA. In the present study, we measured, over 6 h, the kinetics of a 75 microg oral dose of midazolam, a CYP3A substrate, in 21 healthy subjects genotyped for the MDR1 3435C>T and 2677G>T polymorphisms. No statistically significant differences were found in the calculated pharmacokinetic parameters between the three 3435C>T genotypes (TT, CT and CC groups, respectively: Cmax (mean +/- SD): 0.30 +/- 0.08 ng/ml, 0.31 +/- 0.09 ng/ml and 0.31 +/- 0.11 ng/ml; apparent clearance: 122 +/- 29 l/h, 156 +/- 92 l/h and 111 +/- 35 l/h; t1/2: 1.9 +/- 1.1 h, 1.6 +/- 0.9 h and 1.7 +/- 0.7 h). In addition, the 30-min 1'-OH-midazolam to midazolam ratio, a marker of CYP3A activity, determined in 74 HIV-positive patients before the introduction of antiretroviral treatment, was not significantly different between the three 3435C>T genotypes (mean ratio +/- SD: 3.65 +/- 2.24, 4.22 +/- 3.49 and 4.24 +/- 2.03 in the TT, CT and CC groups, respectively). Similarly, no association was found between the MDR1 2677G>T polymorphism and CYP3A activity in the healthy subjects or in the HIV-positive patients. The existence of a strong association between the activity of CYP3A and the MDR1 3435C>T and 2677G>T polymorphisms appears unlikely, at least in Caucasian populations and/or in the absence of specific environmental factors.
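For readers unfamiliar with the reported quantities, the snippet below shows how Cmax, terminal half-life and apparent clearance are typically derived from an oral concentration-time profile. The one-compartment curve and its constants are invented for illustration, chosen only to give values of the same order as those reported; they are not the study data.

```python
import numpy as np

# Hypothetical non-compartmental PK calculation for a 75 microgram oral dose.
dose_ng = 75_000.0                           # dose in ng
t = np.linspace(0.0, 6.0, 61)                # sampling times over 6 h
ka, ke, V = 3.0, 0.4, 250_000.0              # assumed absorption/elimination
                                             # rate constants (1/h), volume (ml)
conc = dose_ng * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

cmax = conc.max()                            # ng/ml, read off the profile

# Terminal half-life from a log-linear fit to the last sampling points.
slope = np.polyfit(t[-20:], np.log(conc[-20:]), 1)[0]
t_half = np.log(2) / -slope                  # h

# Apparent clearance = dose / AUC(0-inf); AUC by trapezoids plus a
# log-linear tail extrapolation beyond the last sample.
auc = np.sum(0.5 * (conc[1:] + conc[:-1]) * np.diff(t)) + conc[-1] / -slope
cl_ml_h = dose_ng / auc                      # ml/h

print(round(cmax, 2), round(t_half, 2), round(cl_ml_h / 1000, 1))
```

With these illustrative constants the sketch returns values in the same range as the study's means (Cmax of a few tenths of a ng/ml, t1/2 near 1.7 h, clearance of order 100 l/h), which is the point of the exercise.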
Abstract:
PURPOSE: This study investigated maximal cardiometabolic response while running on a lower-body positive pressure treadmill (antigravity treadmill (AG)), which reduces body weight (BW) and impact. The AG is used in rehabilitation of injuries but could have potential for high-speed running, if workload is maximally elevated. METHODS: Fourteen trained runners (nine male; age 27 ± 5 yr; 10-km personal best 38.1 ± 1.1 min) completed a treadmill incremental test (CON) to measure aerobic capacity and maximal heart rate (VO2max and HRmax). They completed four identical tests (48 h apart, randomized order) on the AG at BW of 100%, 95%, 90%, and 85% (AG100 to AG85). Stride length and rate were measured at peak velocities (Vpeak). RESULTS: VO2max (mL·kg⁻¹·min⁻¹) was similar across all conditions (men: CON = 66.6 (3.0), AG100 = 65.6 (3.8), AG95 = 65.0 (5.4), AG90 = 65.6 (4.5), and AG85 = 65.0 (4.8); women: CON = 63.0 (4.6), AG100 = 61.4 (4.3), AG95 = 60.7 (4.8), AG90 = 61.4 (3.3), and AG85 = 62.8 (3.9)). Similar results were found for HRmax, except for AG85 in men and AG100 and AG90 in women, which were lower than CON. Vpeak (km·h⁻¹) in men was 19.7 (0.9) in CON, which was lower than every other condition: AG100 = 21.0 (1.9) (P < 0.05), AG95 = 21.4 (1.8) (P < 0.01), AG90 = 22.3 (2.1) (P < 0.01), and AG85 = 22.6 (1.6) (P < 0.001). In women, Vpeak (km·h⁻¹) was similar between CON (17.8 (1.1)) and AG100 (19.3 (1.0)) but higher at AG95 = 19.5 (0.4) (P < 0.05), AG90 = 19.5 (0.8) (P < 0.05), and AG85 = 21.2 (0.9) (P < 0.01). CONCLUSIONS: The AG can be used at maximal exercise intensities at BW of 85% to 95%, reaching faster running speeds than normally feasible. The AG could be used for overspeed running programs at the highest metabolic response levels.
Abstract:
PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use became wider and addressed different tectonic-geomorphic problems. This paper describes several major recent improvements in the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end by describing some developments that are currently underway or should take place in the foreseeable future.
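The governing balance can be illustrated with a one-dimensional explicit finite-difference sketch of the heat production-diffusion-advection equation that PECUBE solves in three dimensions. The grid, uplift rate and thermal constants below are illustrative, and the scheme is a toy version, not the PECUBE discretization.

```python
import numpy as np

# Toy 1D version of dT/dt = kappa * d2T/dz2 + v * dT/dz + H,
# with z increasing downward and v > 0 representing rock uplift
# (material advected toward the fixed-temperature surface).
L, n = 30e3, 301               # 30 km crustal column, grid points
dz = L / (n - 1)
kappa = 1e-6                   # thermal diffusivity, m^2/s
v = 0.5e-3 / 3.15e7            # uplift of 0.5 mm/yr expressed in m/s
H = 1e-13                      # heat production / (rho * c), K/s
dt = 0.2 * dz**2 / kappa       # stable explicit time step

T = np.linspace(0.0, 600.0, n)           # initial linear geotherm, degC
for _ in range(20000):                   # ~1.3 Myr of model time
    d2T = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2   # diffusion
    dT = (T[2:] - T[:-2]) / (2 * dz)               # advection gradient
    T[1:-1] += dt * (kappa * d2T + v * dT + H)
    T[0] = 0.0                           # surface boundary condition
    T[-1] = 600.0                        # fixed basal temperature

print(round(T[n // 2], 1))               # mid-crustal temperature
```

Uplift advects hot rock toward the surface, warming the interior relative to the initial conductive geotherm; PECUBE tracks particles through exactly this kind of evolving thermal field (in 3D, under a moving topographic boundary) to predict thermochronometric ages.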