989 results for Programming, Linear, utilization
Abstract:
PURPOSE: To determine the local control and complication rates for children with papillary and/or macular retinoblastoma progressing after chemotherapy and undergoing stereotactic radiotherapy (SRT) with a micromultileaf collimator. METHODS AND MATERIALS: Between 2004 and 2008, 11 children (15 eyes) with macular and/or papillary retinoblastoma were treated with SRT. The mean age was 19 months (range, 2-111). Of the 15 eyes, 7, 6, and 2 were classified as International Classification of Intraocular Retinoblastoma Group B, C, and E, respectively. The delivered dose of SRT was 50.4 Gy in 28 fractions using a dedicated micromultileaf collimator linear accelerator. RESULTS: The median follow-up was 20 months (range, 13-39). Local control was achieved in 13 eyes (87%). The actuarial 1- and 2-year local control rates were both 82%. SRT was well tolerated. Late adverse events were reported in 4 patients. Of the 4 patients, 2 had developed focal microangiopathy 20 months after SRT; 1 had developed a transient recurrence of retinal detachment; and 1 had developed bilateral cataracts. No optic neuropathy was observed. CONCLUSIONS: Linear accelerator-based SRT for papillary and/or macular retinoblastoma in children resulted in excellent tumor control rates with acceptable toxicity. Additional research regarding SRT and its intrinsic organ-at-risk sparing capability is justified in the framework of prospective trials.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and to surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
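As a rough illustration of the kernel-density step, the sketch below (synthetic data and illustrative variable names, not the authors' code) estimates the joint density of collocated log hydraulic and log electrical conductivities with a Gaussian kernel and then draws hydraulic conductivity values conditioned on an ERT-derived electrical conductivity:

```python
# A minimal sketch of conditioning log K on log sigma via a non-parametric
# kernel density estimate. All data and names here are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic "collocated well log" training data.
log_sigma_wells = rng.normal(-2.0, 0.5, 200)
log_k_wells = 1.5 * log_sigma_wells + rng.normal(0.0, 0.3, 200)

# Non-parametric joint density p(log sigma, log K).
joint_kde = gaussian_kde(np.vstack([log_sigma_wells, log_k_wells]))

def sample_log_k(log_sigma_obs, n_grid=200):
    """Draw one log K value from p(log K | log sigma) on a discretized grid."""
    k_grid = np.linspace(log_k_wells.min() - 1, log_k_wells.max() + 1, n_grid)
    pts = np.vstack([np.full(n_grid, log_sigma_obs), k_grid])
    weights = joint_kde(pts)
    weights /= weights.sum()      # normalize to a conditional pmf on the grid
    return rng.choice(k_grid, p=weights)

# One conditional draw per ERT cell (three example values).
print([sample_log_k(s) for s in (-2.5, -2.0, -1.5)])
```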
Abstract:
This paper introduces local distance-based generalized linear models. These models extend (weighted) distance-based linear models, first with the generalized linear model concept and then by localizing. Distances between individuals are the only predictor information needed to fit these models. They are therefore applicable to mixed (qualitative and quantitative) explanatory variables, or when the regressor is of functional type. The models can be fitted and analysed with the R package dbstats, which implements several distance-based prediction methods.
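To make the idea concrete, here is a minimal sketch (not the dbstats API, which is an R package) of the intercept-only special case: with a local logistic model containing only an intercept, the kernel-weighted maximum-likelihood estimate of the response probability at a query point reduces to a kernel-weighted average of the neighbours' binary responses, computed from pairwise distances alone:

```python
# Illustration only: a local, distance-based prediction in the spirit of
# the paper's models, not the authors' implementation.
import numpy as np

def local_distance_predict(dists_to_query, y, bandwidth=1.0):
    """Predict a binary response at a query point from its distances
    to the training individuals (Gaussian kernel weights)."""
    w = np.exp(-0.5 * (dists_to_query / bandwidth) ** 2)
    return float(np.sum(w * y) / np.sum(w))

# Toy data: only the distance matrix is ever needed, never the raw
# explanatory variables, so qualitative or functional predictors work too.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = (X[:, 0] + rng.normal(scale=0.5, size=50) > 0).astype(float)
query = np.zeros(3)
d = np.linalg.norm(X - query, axis=1)   # any dissimilarity would do
print(local_distance_predict(d, y, bandwidth=0.8))
```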
Abstract:
The objective is an application for calculating the volume of earth available in the subsoil of a selected area. The final goal of the project is to create a Geographic Information System (GIS) that helps assess which parcels of the selected area have the largest volume of earth available before starting their exploitation. To this end, the gvSIG software and its extensions (SEXTANTE) are available, together with all the information that can be obtained about GIS, cartography, geodesy, and related topics. Carrying out this project requires experience with databases and object-oriented programming, and knowledge of software engineering is recommended. The project focuses on the use of gvSIG as a concrete example of freely available GIS software, a solution developed by the Conselleria d'Obres Públiques of the Generalitat Valenciana. Part of this project consists of evaluating this software. The final result will be the knowledge needed to work with spatial data, together with a GIS application for calculating the volume of earth in a selected area.
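The core volume computation the project describes can be sketched as follows (hypothetical raster values and cell size, independent of gvSIG/SEXTANTE): given a raster of usable earth thickness over the selected parcel, the available volume is the sum of cell thickness times cell area:

```python
# Minimal sketch with made-up numbers; a real workflow would read the
# thickness raster and parcel mask from the GIS layers.
import numpy as np

cell_size = 5.0                          # raster resolution in metres (assumed)
thickness = np.array([[1.2, 0.8, 0.0],
                      [2.1, 1.5, 0.4],
                      [0.9, 0.0, 0.0]])  # usable earth depth per cell (m)
inside_parcel = thickness > 0            # toy mask for the selected parcel

volume_m3 = float(np.sum(thickness[inside_parcel]) * cell_size ** 2)
print(f"available earth volume: {volume_m3:.1f} m^3")
```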
Abstract:
Recent advances in signal analysis have endowed EEG with the status of a true brain mapping and brain imaging method capable of providing spatio-temporal information regarding brain (dys)function. Because of the increasing interest in the temporal dynamics of brain networks, and because of the straightforward compatibility of EEG with other brain imaging techniques, EEG is increasingly used in the neuroimaging community. However, the full capability of EEG is highly underestimated. Many combined EEG-fMRI studies use the EEG only as a spike counter or an oscilloscope. Many cognitive and clinical EEG studies still use the EEG in its traditional way and analyze grapho-elements at certain electrodes and latencies. We show here that this way of using the EEG is not only dangerous because it leads to misinterpretations, but also that it largely ignores the spatial aspects of the signals. In fact, EEG primarily measures the electric potential field at the scalp surface, in the same way as MEG measures the magnetic field. By properly sampling and correctly analyzing this electric field, EEG can provide reliable information about the neuronal activity in the brain and the temporal dynamics of this activity in the millisecond range. This review explains some of these analysis methods and illustrates their potential in clinical and experimental applications.
Abstract:
BACKGROUND: We sought to improve upon previously published statistical modeling strategies for binary classification of dyslipidemia for general population screening purposes, based on the waist-to-hip circumference ratio and body mass index anthropometric measurements. METHODS: Study subjects were participants in WHO-MONICA population-based surveys conducted in two Swiss regions. Outcome variables were based on the ratio of total serum cholesterol to high-density lipoprotein cholesterol. The other potential predictor variables were gender, age, current cigarette smoking, and hypertension. The models investigated were: (i) linear regression; (ii) logistic classification; (iii) regression trees; (iv) classification trees (iii and iv are collectively known as "CART"). Binary classification performance of the region-specific models was externally validated by classifying the subjects from the other region. RESULTS: Waist-to-hip circumference ratio and body mass index remained modest predictors of dyslipidemia. Correct classification rates for all models were 60-80%, with marked gender differences. Gender-specific models provided only small gains in classification. The external validations provided assurance about the stability of the models. CONCLUSIONS: There were no striking differences between the algebraic (i, ii) and non-algebraic (iii, iv) approaches, or between the regression (i, iii) and classification (ii, iv) approaches. The anticipated advantages of CART over simple additive linear and logistic models were smaller than expected in this particular application, with its relatively small set of predictor variables. CART models may be more useful when considering main effects and interactions between larger sets of predictor variables.
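The comparison can be sketched as follows (synthetic data and assumed variable roles, not the study data): train a logistic classifier and a classification tree on one region and validate externally on the other, reporting the correct classification rate:

```python
# Minimal sketch of cross-region external validation; all data are toy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)

def make_region(n):
    # columns: waist-to-hip ratio, BMI, age, smoker, hypertension (toy data)
    X = np.column_stack([rng.normal(0.9, 0.08, n), rng.normal(26, 4, n),
                         rng.uniform(25, 74, n), rng.integers(0, 2, n),
                         rng.integers(0, 2, n)])
    # dyslipidemia indicator loosely driven by WHR and BMI
    y = (2.0 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.3, n) > 3.2).astype(int)
    return X, y

X_a, y_a = make_region(500)   # "region A" training sample
X_b, y_b = make_region(500)   # "region B" external validation sample

for model in (LogisticRegression(max_iter=1000),
              DecisionTreeClassifier(max_depth=4, min_samples_leaf=20)):
    model.fit(X_a, y_a)
    rate = (model.predict(X_b) == y_b).mean()
    print(type(model).__name__, f"external correct classification: {rate:.2f}")
```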
Abstract:
Background: Many studies have found considerable variations in the resource intensity of physical therapy episodes. Although they have identified several patient- and provider-related factors, few studies have examined their relative explanatory power. We sought to quantify the contribution of patients and providers to these differences and to examine how effective the Swiss regulations are (a nine-session ceiling per prescription and a bonus for first treatments). Methods: Our sample consisted of 87,866 first physical therapy episodes performed by 3,365 physiotherapists based on referrals by 6,131 physicians. We modeled the number of visits per episode using a multilevel log-linear regression with crossed random effects for physiotherapists and physicians and with fixed effects for cantons. The three levels of explanatory variables were patient, physiotherapist and physician characteristics. Results: The median number of sessions was nine (interquartile range 6-13). Physical therapy use increased with age, female sex, higher health care costs, lower deductibles, surgery and specific conditions. Use rose with the share of nine-session episodes among physiotherapists or physicians, but fell with the share of new treatments. Geographical area had no influence. Most of the variance was explained at the patient level, but the available factors explained only 4% thereof. Physiotherapists and physicians explained only 6% and 5% of the variance respectively, although the available factors explained most of this variance. The regulations were the most powerful factors. Conclusion: Against the backdrop of an abundant physical therapy supply, the Swiss financial regulations did not restrict utilization. Given that patient-related factors explained most of the variance, this group should be subject to closer scrutiny. Moreover, further research is needed on the determinants of patient demand.
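A model of this shape can be sketched as follows (synthetic data and assumed column names; statsmodels' variance-components pattern is one way to approximate crossed random effects, not the authors' software):

```python
# Minimal sketch: log-linear outcome with crossed random effects for
# physiotherapists and referring physicians via variance components on a
# single dummy group. All data and names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "physio": rng.integers(0, 40, n),      # physiotherapist id (toy)
    "physician": rng.integers(0, 60, n),   # referring physician id (toy)
    "age": rng.uniform(20, 85, n),
    "female": rng.integers(0, 2, n),
})
physio_eff = rng.normal(0, 0.15, 40)[df["physio"]]
doc_eff = rng.normal(0, 0.10, 60)[df["physician"]]
visits = rng.poisson(np.exp(1.9 + 0.004 * df["age"] + physio_eff + doc_eff))
df["log_visits"] = np.log1p(visits)        # log-linear outcome
df["one"] = 1                              # single group carrying both VCs

model = smf.mixedlm(
    "log_visits ~ age + female", df, groups="one", re_formula="0",
    vc_formula={"physio": "0 + C(physio)", "physician": "0 + C(physician)"},
)
print(model.fit().summary())
```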
Abstract:
Aitchison and Bacon-Shone (1999) considered convex linear combinations of compositions. In other words, they investigated compositions of compositions, where the mixing composition follows a logistic Normal distribution (or a perturbation process) and the compositions being mixed follow a logistic Normal distribution. In this paper, I investigate the extension to situations where the mixing composition varies with a number of dimensions. Examples would be where the mixing proportions vary with time or distance or a combination of the two. Practical situations include a river where the mixing proportions vary along the river, or across a lake, and possibly with a time trend. This is illustrated with a dataset similar to that used in the Aitchison and Bacon-Shone paper, which looked at how pollution in a loch depended on the pollution in the three rivers that feed the loch. Here, I explicitly model the variation in the linear combination across the loch, assuming that the mean of the logistic Normal distribution depends on the river flows and the relative distance from the source origins.
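The construction can be sketched as follows (toy compositions and covariance; the dependence of the logistic-Normal mean on position is illustrative, not the paper's fitted model): mixing proportions are drawn from a logistic Normal whose mean varies with distance, and the observed composition is the convex combination of the river compositions:

```python
# Minimal sketch of a spatially varying convex combination of compositions.
import numpy as np

rng = np.random.default_rng(4)

def logistic_normal(mu, cov):
    """Draw a 3-part composition via the additive log-ratio transform."""
    z = rng.multivariate_normal(mu, cov)   # 2 log-ratio coordinates
    e = np.exp(np.append(z, 0.0))          # last part is the reference
    return e / e.sum()

# Pollution compositions of three rivers (rows sum to 1, toy values).
rivers = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.5, 0.3],
                   [0.1, 0.2, 0.7]])

cov = 0.05 * np.eye(2)
for distance in (0.0, 0.5, 1.0):                 # position across the loch
    mu = np.array([1.0 - distance, distance - 0.5])  # mean varies with distance
    mix = logistic_normal(mu, cov)               # mixing composition
    observed = mix @ rivers                      # convex combination
    print(distance, observed.round(3))
```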
Abstract:
This paper analyses the associations of the Normalized Difference Vegetation Index (NDVI) and the Enhanced Vegetation Index (EVI) with the prevalence of schistosomiasis and the presence of Biomphalaria glabrata in the state of Minas Gerais (MG), Brazil. Additionally, vegetation, soil and shade fraction images were created using a Linear Spectral Mixture Model (LSMM) from the blue, red and infrared channels of the Moderate Resolution Imaging Spectroradiometer spaceborne sensor, and the relationship between these images and the prevalence of schistosomiasis and the presence of B. glabrata was analysed. First, we found a high correlation between the vegetation fraction image and EVI and, second, a high correlation between the soil fraction image and NDVI. The results also indicate that there was a positive correlation between prevalence and the vegetation fraction image (July 2002), a negative correlation between prevalence and the soil fraction image (July 2002) and a positive correlation between B. glabrata and the shade fraction image (July 2002). This paper demonstrates that the LSMM variables can be used as a substitute for the standard vegetation indices (EVI and NDVI) to determine and delimit risk areas for B. glabrata and schistosomiasis in MG, which can be used to improve the allocation of resources for disease control.
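The LSMM itself can be sketched as follows (made-up endmember spectra; the band values are illustrative, not MODIS calibration data): each pixel's reflectance in the blue, red and infrared channels is modelled as a linear combination of vegetation, soil and shade endmembers, and the fractions are recovered by non-negative least squares:

```python
# Minimal sketch of linear spectral unmixing for one pixel.
import numpy as np
from scipy.optimize import nnls

# Endmember spectra over the [blue, red, NIR] bands (illustrative values).
vegetation = [0.04, 0.05, 0.50]
soil = [0.10, 0.20, 0.30]
shade = [0.01, 0.01, 0.02]
A = np.array([vegetation, soil, shade]).T    # shape (bands, endmembers)

pixel = np.array([0.06, 0.10, 0.38])         # observed pixel spectrum

fractions, _ = nnls(A, pixel)                # non-negative least squares
fractions /= fractions.sum()                 # enforce sum-to-one (post hoc)
for name, f in zip(("vegetation", "soil", "shade"), fractions):
    print(f"{name} fraction: {f:.2f}")
```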
Abstract:
This paper shows the impact of the atomic-capabilities concept, which incorporates control-oriented knowledge of linear control systems into the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising the advantages of our approach, which improves multi-agent performance in cooperative systems.
Abstract:
In this work we present a Web-based tool developed with the aim of reinforcing the teaching and learning of introductory programming courses. From the teacher's perspective, the system introduces important gains with respect to the classical teaching methodology: it reinforces lecture and laboratory sessions, makes it possible to give personalized attention to the student, assesses the degree of participation of the students and, most importantly, performs a continuous assessment of the student's progress. From the student's perspective, it provides a learning framework, consisting of a help environment and a correction environment, which facilitates their personal work. With this tool, students are more motivated to program.
Abstract:
Sediment composition is mainly controlled by the nature of the source rock(s) and by chemical (weathering) and physical processes (mechanical crushing, abrasion, hydrodynamic sorting) during alteration and transport. Although the factors controlling these processes are conceptually well understood, detailed quantifications of the compositional changes induced by a single process are rare, as are examples where the effects of several processes can be distinguished. The present study was designed to characterize the role of mechanical crushing and sorting in the absence of chemical weathering. Twenty sediment samples were taken from Alpine glaciers that erode almost pure granitoid lithologies. For each sample, 11 grain-size fractions from granules to clay (ø grades -1 to 9) were separated, and each fraction was analysed for its chemical composition. The presence of clear steps in the box-plots of all parts (in adequate ilr and clr scales) against ø is assumed to be explained by typical crystal size ranges for the relevant mineral phases. These scatter plots and the biplot suggest a splitting of the full grain-size range into three groups: coarser than ø = 4 (comparatively rich in SiO2, Na2O, K2O and Al2O3, and dominated by "felsic" minerals like quartz and feldspar), finer than ø = 8 (comparatively rich in TiO2, MnO, MgO and Fe2O3, mostly related to "mafic" sheet silicates like biotite and chlorite), and intermediate grain sizes (4 ≤ ø < 8; comparatively rich in P2O5 and CaO, related to apatite and some feldspar). To further test the absence of chemical weathering, the observed compositions were regressed against three explanatory variables: a trend on grain size in the ø scale, a step function for ø ≥ 4, and another for ø ≥ 8. The original hypothesis was that the trend could be identified with weathering effects, whereas each step function would highlight those minerals with the biggest characteristic size at its lower end. Results suggest that this assumption is reasonable for the step functions, but that besides weathering, some other factors (the different mechanical behavior of minerals) also make an important contribution to the trend.
Key words: sediment, geochemistry, grain size, regression, step function
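The regression described can be sketched as follows (synthetic response values; a single clr-type coordinate regressed on a linear trend in ø plus the two step functions):

```python
# Minimal sketch: trend-plus-steps design over the 11 grain-size fractions.
import numpy as np

phi = np.arange(-1, 10)                       # ø grades -1 to 9, 11 fractions
rng = np.random.default_rng(5)
# Toy response with a trend and two genuine steps, plus noise.
y = (0.05 * phi + 0.8 * (phi >= 4) + 1.2 * (phi >= 8)
     + rng.normal(0, 0.05, phi.size))

# Design matrix: intercept, trend in ø, step at ø >= 4, step at ø >= 8.
X = np.column_stack([np.ones_like(phi, dtype=float), phi,
                     (phi >= 4).astype(float), (phi >= 8).astype(float)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["intercept", "trend", "step_phi4", "step_phi8"],
               coef.round(3))))
```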