Multimodel inference and multimodel averaging in empirical modeling of occupational exposure levels.
Abstract:
Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori, taking into account the sample size and previous knowledge of the variables that influence exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one model over another, perform multimodel prediction, estimate the relative influence of the potential predictors and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and permits evaluating, to some extent, the model selection uncertainty that is seldom acknowledged in current practice.
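As a minimal illustration of the weighting step described above, the sketch below computes Akaike weights from a set of AIC values and forms a model-averaged prediction; the numeric values and helper names are hypothetical, not taken from the paper.

```python
import numpy as np

def akaike_weights(aic_values):
    """Convert AIC values of candidate models into Akaike weights.

    delta_i = AIC_i - min(AIC); w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2).
    """
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    raw = np.exp(-0.5 * delta)
    return raw / raw.sum()

def model_averaged_prediction(predictions, weights):
    """Weight each candidate model's prediction by its Akaike weight."""
    return np.dot(weights, predictions)

# Illustrative use with three hypothetical candidate models:
aic = [102.3, 104.1, 110.7]   # AIC of each fitted model (assumed values)
preds = [0.82, 0.95, 0.60]    # each model's predicted log-exposure (assumed values)
w = akaike_weights(aic)
print(w)                                      # relative support for each model
print(model_averaged_prediction(preds, w))    # multimodel-averaged prediction
```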
Abstract:
Graph pebbling is a network model for studying whether or not a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π₂^P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than in previously existing arguments (certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
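As a toy illustration of the pebbling model itself (not of the Weight Function Lemma or its linear-optimization machinery), the sketch below brute-forces whether a given pebble distribution can reach a target vertex; the 4-cycle example is made up.

```python
def solvable(adj, dist, target):
    """Return True if some sequence of pebbling moves puts a pebble on target.

    adj: dict vertex -> iterable of neighbours
    dist: dict vertex -> number of pebbles
    A move removes 2 pebbles from u and places 1 on a neighbour v, so the total
    pebble count strictly decreases and the search terminates. Exponential time;
    only sensible for very small graphs.
    """
    if dist[target] >= 1:
        return True
    for u, pebbles in dist.items():
        if pebbles >= 2:
            for v in adj[u]:
                new = dict(dist)
                new[u] -= 2
                new[v] += 1
                if solvable(adj, new, target):
                    return True
    return False

# 4-cycle a-b-c-d-a with 4 pebbles on vertex a: can we reach vertex c?
adj = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}
dist = {"a": 4, "b": 0, "c": 0, "d": 0}
print(solvable(adj, dist, "c"))  # True: e.g. move a->b twice, then b->c
```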
Abstract:
Numerical analyses (correspondence analysis, ascending hierarchical classification, cladistic approach) were applied to the morphological characters of the adults of the genus Phlebotomus Rondani & Berté 1840. They confirm the reliability of the classic classifications, and also redefine the taxonomic and phylogenetic position of certain taxa. Thus, Spelaeophlebotomus Theodor 1948, Idiophlebotomus Quate & Fairchild 1961 and Australophlebotomus Theodor 1948 deserve generic rank. Among the vectors of leishmaniasis, the subgenus Phlebotomus Rondani & Berté 1840 is probably ancient. The results attribute an intermediate taxonomic and phylogenetic position to the taxa Euphlebotomus Theodor 1948 and Anaphlebotomus Theodor 1948, and reveal the probable artificial nature of the latter. The comparatively large numbers of species of subgenera Paraphlebotomus Theodor 1948, Synphlebotomus Theodor 1948 and, above all, Larroussius Nitzulescu 1931 and Adlerius Nitzulescu 1931, suggest that they are relatively recent. The development of adult morphological characters, the validity of their use in taxonomy and proposals for further studies are discussed.
Abstract:
Numerical analyses (correspondence analysis, ascending hierarchical classification, and cladistics) were done with morphological characters of adult phlebotomine sand flies. The resulting classification largely confirms that of classical taxonomy for supra-specific groups from the Old World, though the positions of some groups are adjusted. The taxa Spelaeophlebotomus Theodor 1948, Idiophlebotomus Quate & Fairchild 1961, Australophlebotomus Theodor 1948 and Chinius Leng 1987 are notably distinct from other Old World groups, particularly from the genus Phlebotomus Rondani & Berté 1840. Spelaeomyia Theodor 1948 and, in particular, Parvidens Theodor & Mesghali 1964 are clearly separate from Sergentomyia França & Parrot 1920.
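As a rough sketch of how an ascending (agglomerative) hierarchical classification of morphological characters might be run in practice; the character matrix, distance metric and linkage choice below are illustrative assumptions, not the authors' data or settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical binary morphological character matrix: rows = taxa, columns = characters.
characters = np.array([
    [1, 0, 1, 1, 0],   # taxon A
    [1, 0, 1, 0, 0],   # taxon B
    [0, 1, 0, 1, 1],   # taxon C
    [0, 1, 0, 1, 0],   # taxon D
])

# Ascending hierarchical classification on pairwise character distances.
distances = pdist(characters, metric="hamming")
tree = linkage(distances, method="average")
print(fcluster(tree, t=2, criterion="maxclust"))  # groups {A, B} vs {C, D}
```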
Abstract:
BACKGROUND: In order to facilitate and improve the use of antiretroviral therapy (ART), international recommendations are released and updated regularly. We aimed to study whether adherence to the recommendations is associated with better treatment outcomes in the Swiss HIV Cohort Study (SHCS). METHODS: Initial ART regimens prescribed to participants between 1998 and 2007 were classified according to the IAS-USA recommendations. Baseline characteristics of patients who received regimens in violation of these recommendations (violation ART) were compared to those of other patients. Multivariable logistic and linear regression analyses were performed to identify associations between violation ART and (i) virological suppression and (ii) CD4 cell count increase after one year. RESULTS: Between 1998 and 2007, 4189 SHCS participants started 241 different ART regimens. A violation ART was started in 5% of patients. Female patients (adjusted odds ratio aOR 1.83, 95% CI 1.28-2.62), those with a high education level (aOR 1.49, 95% CI 1.07-2.06) or a high CD4 count (aOR 1.53, 95% CI 1.02-2.30) were more likely to receive violation ART. The proportion of patients with an undetectable viral load (<400 copies/mL) after one year was significantly lower with violation ART than with recommended regimens (aOR 0.54, 95% CI 0.37-0.80), whereas the CD4 count increase after one year of treatment was similar in both groups. CONCLUSIONS: Although more than 240 different initial regimens were prescribed, violations of the IAS-USA recommendations were uncommon. Patients receiving these regimens were less likely to have an undetectable viral load after one year, which strengthens the validity of these recommendations.
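As a hedged sketch of the kind of multivariable logistic model used to obtain adjusted odds ratios such as those above; the variable names and data are invented, not the SHCS dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset; column names are illustrative, not the SHCS variables.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "suppressed":    rng.integers(0, 2, 500),   # viral load <400 copies/mL after 1 year
    "violation_art": rng.integers(0, 2, 500),   # started a non-recommended regimen
    "female":        rng.integers(0, 2, 500),
    "cd4_baseline":  rng.normal(350, 150, 500),
})

# Multivariable logistic regression; exponentiated coefficients are adjusted odds ratios.
fit = smf.logit("suppressed ~ violation_art + female + cd4_baseline", data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("aOR"), conf_int], axis=1))
```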
Abstract:
Recent technological advances in remote sensing have enabled investigation of the morphodynamics and hydrodynamics of large rivers. However, measuring topography and flow in these very large rivers is time consuming and thus often constrains the spatial resolution and reach-length scales that can be monitored. Similar constraints exist for computational fluid dynamics (CFD) studies of large rivers, requiring maximization of mesh- or grid-cell dimensions and implying a reduction in the representation of bedform-roughness elements that are of the order of a model grid cell or less, even if they are represented in available topographic data. These "subgrid" elements must be parameterized, and this paper applies and considers the impact of roughness-length treatments that include the effect of bed roughness due to "unmeasured" topography. CFD predictions were found to be sensitive to the roughness-length specification. Model optimization was based on acoustic Doppler current profiler measurements and estimates of the water surface slope for a variety of roughness lengths. This proved difficult because the metrics used to assess optimal model performance diverged, owing to the effects of large bedforms that are not well parameterized in roughness-length treatments. However, the general spatial flow patterns are effectively predicted by the model. Changes in roughness length were shown to have a major impact upon flow routing at the channel scale. The results also indicate an absence of secondary flow circulation cells in the reach studied, and suggest that simpler two-dimensional models may have great utility in the investigation of flow within large rivers. Citation: Sandbach, S. D. et al. (2012), Application of a roughness-length representation to parameterize energy loss in 3-D numerical simulations of large rivers, Water Resour. Res., 48, W12501, doi: 10.1029/2011WR011284.
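For background, roughness-length treatments of this kind are generally rooted in the logarithmic law of the wall; a standard form is given below as a reminder (the k_s/30 relation is a common engineering convention and not necessarily the exact closure used in the paper):

\[
\frac{u(z)}{u_*} \;=\; \frac{1}{\kappa}\,\ln\!\left(\frac{z}{z_0}\right), \qquad z_0 \approx \frac{k_s}{30},
\]

where u(z) is the velocity at height z above the bed, u_* the shear velocity, κ ≈ 0.41 the von Kármán constant, z_0 the roughness length and k_s an equivalent sand-grain roughness height. Increasing z_0 is the usual way to absorb the extra energy loss caused by "unmeasured" subgrid topography.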
Abstract:
Research project carried out during a stay at the University of Groningen, the Netherlands, between 2007 and 2009. Direct numerical simulation of turbulence (DNS) is a key tool in computational fluid dynamics. On the one hand, it allows a better understanding of the physics of turbulence; on the other, the results obtained are essential for the development of turbulence models. However, DNS is not a viable technique for the vast majority of industrial applications because of its high computational cost, so some degree of turbulence modeling is necessary. In this context, important improvements have been introduced based on modeling the (non-linear) convective term using symmetry-preserving regularizations: the convective term is modified appropriately so as to reduce the production of smaller and smaller scales (vortex stretching) while preserving all the invariants of the original equations. So far, these models have been used successfully up to relatively high Rayleigh numbers (Ra). At this point, DNS results for more complex configurations and higher Ra numbers are essential. In this context, DNS simulations of a differentially heated cavity with Ra=1e11 and Pr=0.71 were carried out on the MareNostrum supercomputer during the first of the project's two years. In addition, the code was adapted in order to simulate the flow around a wall-mounted cube at Re=10000. These DNS simulations are the largest performed to date for these configurations, and their correct modeling is a major challenge because of the complexity of the flows. These new DNS simulations are providing new insight into the physics of turbulence and supplying results that are indispensable for the progress of symmetry-preserving regularization modeling.
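As background to the symmetry-preserving regularization idea, stated here under the assumption that the C4-type approximation common in this literature is meant (not as the project's exact model): the convective operator C(u, v) = (u·∇)v is skew-symmetric, and the regularization replaces it by a smoothed operator that keeps this property, so the invariants of the original equations are not altered:

\[
\big(C(u,v),\,w\big) \;=\; -\big(C(u,w),\,v\big), \qquad
C_4(u,v) \;=\; C(\bar u,\bar v) \;+\; \overline{C(\bar u,\, v')} \;+\; \overline{C(u',\, \bar v)}, \quad u' = u - \bar u,
\]

where the overbar denotes a self-adjoint linear filter; damping the interactions involving the residual scales u' and v' restrains vortex stretching and hence the production of ever smaller scales.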
Abstract:
PURPOSE: To determine the local control and complication rates for children with papillary and/or macular retinoblastoma progressing after chemotherapy and undergoing stereotactic radiotherapy (SRT) with a micromultileaf collimator. METHODS AND MATERIALS: Between 2004 and 2008, 11 children (15 eyes) with macular and/or papillary retinoblastoma were treated with SRT. The mean age was 19 months (range, 2-111). Of the 15 eyes, 7, 6, and 2 were classified as International Classification of Intraocular Retinoblastoma Group B, C, and E, respectively. The delivered dose of SRT was 50.4 Gy in 28 fractions using a dedicated micromultileaf collimator linear accelerator. RESULTS: The median follow-up was 20 months (range, 13-39). Local control was achieved in 13 eyes (87%). The actuarial 1- and 2-year local control rates were both 82%. SRT was well tolerated. Late adverse events were reported in 4 patients. Of the 4 patients, 2 had developed focal microangiopathy 20 months after SRT; 1 had developed a transient recurrence of retinal detachment; and 1 had developed bilateral cataracts. No optic neuropathy was observed. CONCLUSIONS: Linear accelerator-based SRT for papillary and/or macular retinoblastoma in children resulted in excellent tumor control rates with acceptable toxicity. Additional research regarding SRT and its intrinsic organ-at-risk sparing capability is justified in the framework of prospective trials.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
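As a small sketch of the non-parametric kernel-density step, i.e. describing the in situ relationship between collocated log conductivities and using it to draw a stochastic value at a non-sampled location; the data, units and linear trend below are synthetic and purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical collocated borehole data: log hydraulic conductivity (log K)
# and log electrical conductivity (log sigma). Values are made up.
rng = np.random.default_rng(1)
log_sigma = rng.normal(-2.0, 0.3, 200)
log_k = -4.0 + 1.5 * log_sigma + rng.normal(0.0, 0.2, 200)

# Non-parametric joint density of (log sigma, log K).
joint = gaussian_kde(np.vstack([log_sigma, log_k]))

def conditional_pmf(sigma_obs, k_grid):
    """Discretized density of log K given an observed log sigma."""
    pts = np.vstack([np.full_like(k_grid, sigma_obs), k_grid])
    dens = joint(pts)
    return dens / dens.sum()

k_grid = np.linspace(-9.0, -5.0, 200)
weights = conditional_pmf(-2.1, k_grid)
print(rng.choice(k_grid, p=weights))  # one stochastic realization of log K
```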
Abstract:
This paper introduces local distance-based generalized linear models. These models extend (weighted) distance-based linear models, first with the generalized linear model concept and then by localizing. Distances between individuals are the only predictor information needed to fit these models. They are therefore applicable to mixed (qualitative and quantitative) explanatory variables or when the regressor is of functional type. Models can be fitted and analysed with the R package dbstats, which implements several distance-based prediction methods.
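As a language-agnostic sketch of the underlying (global, non-localized) distance-based idea — pairwise distances, then principal coordinates, then a GLM — written in Python rather than with the dbstats package referenced above; the metric, the number of retained coordinates and the logistic response are assumptions. The localized variant would additionally weight observations by their distance to the prediction point.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 4))              # mixed predictors would require e.g. a Gower distance
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # illustrative binary response

# 1. Pairwise distances are the only predictor information used.
D = squareform(pdist(X, metric="euclidean"))

# 2. Classical scaling: double-center -0.5 * D**2 and keep the leading
#    eigenvectors as latent Euclidean coordinates (principal coordinates).
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
k = 3                                      # number of retained coordinates (assumed)
coords = vecs[:, order[:k]] * np.sqrt(np.maximum(vals[order[:k]], 0))

# 3. Fit a generalized linear model (here logistic) on the latent coordinates.
model = LogisticRegression().fit(coords, y)
print(model.score(coords, y))
```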
Abstract:
The unstable rock slope, Stampa, above the village of Flåm, Norway, shows signs of both active and postglacial gravitational deformation over an area of 11 km2. Detailed structural field mapping, annual differential Global Navigation Satellite System (GNSS) surveys, as well as geomorphic analysis of high-resolution digital elevation models based on airborne and terrestrial laser scanning indicate that slope deformation is complex and spatially variable. Numerical modeling was used to investigate the influence of former rockslide activity and to better understand the failure mechanism. Field observations, kinematic analysis and numerical modeling indicate a strong structural control of the unstable area. Based on the integration of the above analyses, we propose that the failure mechanism is dominated by (1) a toppling component, (2) subsiding bilinear wedge failure and (3) planar sliding along the foliation at the toe of the unstable slope. Using differential GNSS, 18 points were measured annually over a period of up to 6 years. Two of these points have an average yearly movement of around 10 mm/year. They are located at the frontal cliff on almost completely detached blocks with volumes smaller than 300,000 m3. Large fractures indicate deep-seated gravitational deformation of volumes reaching several hundred million m3, but the movement rates in these areas are below 2 mm/year. Two different lobes of prehistoric rock slope failures were dated with terrestrial cosmogenic nuclides. While the northern lobe gave an average age of 4,300 years BP, the southern one yielded two different ages (2,400 and 12,000 years BP), which most likely represent multiple rockfall events. This reflects the currently observable deformation style, with unstable blocks in the northern part between Joasete and Furekamben, and no distinct blocks but high rockfall activity around Ramnanosi in the south. Based on a relative susceptibility analysis, it is concluded that small collapses of blocks along the frontal cliff will be the most frequent. Larger collapses of free-standing blocks along the cliff with volumes > 100,000 m3, and thus large enough to reach the fjord, cannot be ruled out. A larger collapse involving several million m3 is presently considered of very low likelihood.
Abstract:
In this work we present numerical simulations of continuous-flow left ventricular assist device implantation, with the aim of comparing differences in flow rates and pressure patterns depending on the location of the anastomosis and the rotational speed of the device. Although the descending aorta anastomosis approach is less invasive, since it does not require a sternotomy or a cardiopulmonary bypass, its benefits are still controversial. Moreover, the device rotational speed should be chosen correctly to avoid anomalous flow rates and pressure distributions at specific locations of the cardiovascular tree. To assess the differences between these two approaches and between device rotational speeds in terms of flow rate and pressure waveforms, we set up numerical simulations of a network of one-dimensional models in which we account for the presence of an outflow cannula anastomosed to different locations of the aorta. We then use the resulting network to compare the results of the two cannulations for several stages of heart failure and different rotational speeds of the device. The inflow boundary data for the heart and the cannulas are obtained from a lumped-parameter model of the entire circulatory system with an assist device, which is validated against clinical data. The results show that ascending and descending aorta cannulations lead to similar waveforms and mean flow rates in all the considered cases. Moreover, regardless of the anastomosis region, the rotational speed of the device has an important impact on the wave profiles; this effect is more pronounced at high RPM.
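For context, one-dimensional network models of this kind are usually built on the standard cross-sectionally averaged blood flow equations; a common form, quoted here as general background rather than as the authors' exact closure, is:

\[
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0, \qquad
\frac{\partial Q}{\partial t} + \frac{\partial}{\partial x}\!\left(\alpha\,\frac{Q^2}{A}\right) + \frac{A}{\rho}\,\frac{\partial p}{\partial x} = -K_R\,\frac{Q}{A},
\]

where A(x,t) is the vessel cross-sectional area, Q(x,t) the volumetric flow rate, p the pressure (linked to A through a vessel wall law), ρ the blood density, α the momentum-flux correction coefficient and K_R a friction parameter. In such a network the device outflow cannula enters simply as an additional branch whose inflow is prescribed by the pump model.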
Abstract:
BACKGROUND: We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. METHODS: Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the total serum cholesterol to high density lipoprotein cholesterol ratio. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees (iii and iv are collectively known as "CART"). Binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. RESULTS: Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60-80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. CONCLUSIONS: There were no striking differences between either the algebraic (i, ii) vs. non-algebraic (iii, iv), or the regression (i, iii) vs. classification (ii, iv) modeling approaches. Anticipated advantages of the CART vs. simple additive linear and logistic models were less than expected in this particular application with a relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions between larger sets of predictor variables.
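As a hedged sketch of the comparison between an additive logistic classifier and a classification tree on this kind of screening problem; the synthetic data and variable layout below are invented, and a random held-out split stands in for the paper's cross-region external validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Hypothetical screening data; columns mirror the predictors described above.
rng = np.random.default_rng(3)
n = 1000
X = np.column_stack([
    rng.normal(0.9, 0.08, n),    # waist-to-hip circumference ratio
    rng.normal(25, 4, n),        # body mass index
    rng.integers(0, 2, n),       # gender
    rng.uniform(25, 74, n),      # age
    rng.integers(0, 2, n),       # current cigarette smoking
    rng.integers(0, 2, n),       # hypertension
])
# Made-up dyslipidemia outcome loosely driven by the anthropometric columns.
y = (X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 0.3, n) > 1.45).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)
tree = DecisionTreeClassifier(max_depth=4).fit(X_train, y_train)

print("logistic correct classification:", logit.score(X_test, y_test))
print("tree correct classification:    ", tree.score(X_test, y_test))
```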