963 results for non-linear programming
Abstract:
Soil penetration resistance is an important property that affects root growth and elongation and water movement in the soil. Since no-till systems tend to increase soil organic matter, the purpose of this study was to evaluate how efficiently soil penetration resistance can be estimated by a proposed model based on moisture content, bulk density, and organic matter content in an Oxisol containing 665, 221, and 114 g kg-1 of clay, silt, and sand, respectively, under annual no-till cropping in Londrina, Paraná State, Brazil. Penetration resistance was evaluated at random locations continuously from May 2008 to February 2011, using an impact penetrometer, for a total of 960 replications. For the measurements, soil was sampled at depths of 0 to 20 cm to determine gravimetric moisture (G), bulk density (D), and organic matter content (M). The penetration resistance (PR) curve was adjusted using two non-linear models (PR = a D^b G^c and PR' = a D^b G^c M^d), where a, b, c, and d are fitted coefficients. The model that included M was the more efficient of the two for estimating PR, explaining 91 % of PR variability compared to 82 % for the other model.
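Models of this power-law form can be fitted directly with non-linear least squares. The sketch below is illustrative only: the predictor ranges, true coefficients, and noise level are invented, not the study's data.

```python
# Hedged sketch: fitting a penetration-resistance model PR = a * D^b * G^c * M^d
# by non-linear least squares on synthetic placeholder data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
D = rng.uniform(1.0, 1.4, 200)    # bulk density (assumed range, Mg m-3)
G = rng.uniform(0.20, 0.35, 200)  # gravimetric moisture (assumed, kg kg-1)
M = rng.uniform(30.0, 60.0, 200)  # organic matter (assumed, g kg-1)
# Synthetic "true" response with multiplicative noise (coefficients invented).
PR = 0.05 * D**6.0 * G**-1.5 * M**0.3 * rng.lognormal(0.0, 0.05, 200)

def model(X, a, b, c, d):
    D, G, M = X
    return a * D**b * G**c * M**d

popt, _ = curve_fit(model, (D, G, M), PR, p0=(0.1, 5.0, -1.0, 0.5), maxfev=10000)
pred = model((D, G, M), *popt)
r2 = 1.0 - np.sum((PR - pred)**2) / np.sum((PR - PR.mean())**2)  # fit quality
```

With well-chosen starting values the fit recovers the generating coefficients closely; on real data, starting values are usually taken from a log-linear regression.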
Abstract:
Microstructure imaging from diffusion magnetic resonance (MR) data represents an invaluable tool to study non-invasively the morphology of tissues and to provide a biological insight into their microstructural organization. In recent years, a variety of biophysical models have been proposed to associate particular patterns observed in the measured signal with specific microstructural properties of the neuronal tissue, such as axon diameter and fiber density. Despite very appealing results showing that the estimated microstructure indices agree very well with histological examinations, existing techniques require computationally very expensive non-linear procedures to fit the models to the data which, in practice, demand the use of powerful computer clusters for large-scale applications. In this work, we present a general framework for Accelerated Microstructure Imaging via Convex Optimization (AMICO) and show how to re-formulate this class of techniques as convenient linear systems which, then, can be efficiently solved using very fast algorithms. We demonstrate this linearization of the fitting problem for two specific models, i.e. ActiveAx and NODDI, providing a very attractive alternative for parameter estimation in those techniques; however, the AMICO framework is general and flexible enough to work also for the wider space of microstructure imaging methods. Results demonstrate that AMICO represents an effective means to accelerate the fit of existing techniques drastically (up to four orders of magnitude faster) while preserving accuracy and precision in the estimated model parameters (correlation above 0.9). We believe that the availability of such ultrafast algorithms will help to accelerate the spread of microstructure imaging to larger cohorts of patients and to study a wider spectrum of neurological disorders.
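The core linearization idea can be illustrated in a few lines: once candidate model responses are precomputed into a dictionary, fitting reduces to a fast non-negative linear problem. The dictionary below is a random stand-in, not an actual ActiveAx or NODDI kernel.

```python
# Hedged sketch of the AMICO-style reformulation: replace an expensive
# non-linear fit with a non-negative linear system over a precomputed
# dictionary of simulated signals. All data here are synthetic.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_meas, n_atoms = 60, 30
Phi = np.abs(rng.normal(size=(n_meas, n_atoms)))    # dictionary of candidate responses
x_true = np.zeros(n_atoms)
x_true[[4, 17]] = [0.7, 0.3]                        # sparse mixture of two atoms
y = Phi @ x_true + 0.01 * rng.normal(size=n_meas)   # "measured" signal plus noise

x_hat, resid = nnls(Phi, y)                         # fast convex fit, no iteration over models
```

Microstructure indices are then read off from the recovered weights; real implementations add regularization, which keeps the problem convex.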
Abstract:
It has been suggested that pathological gamblers develop illusory perceptions of control over the outcome of games and should therefore express higher Internal and Chance locus of control. A sample of 48 outpatients diagnosed with pathological gambling disorder participated in this ex post facto study; they completed the Internality, Powerful Others, and Chance scale, the South Oaks Gambling Screen questionnaire, and the Beck Depression Inventory. Results for the locus of control measure were compared with a reference group. Pathological gamblers scored higher than the reference group on Chance locus of control, and their scores increased with the severity of cases. Moreover, Internal locus of control showed a curvilinear relationship with severity. Pathological gamblers thus have specific locus of control scores that vary as a function of severity, in a linear or non-linear fashion depending on the scale. This effect might be caused by competition between the "illusion of control" and the tendency to attribute the adverse consequences of gambling to external causes.
Abstract:
General Summary. Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize the results in Google Earth. A short summary of each of the five chapters is provided below.
The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high share of North-North trade, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated. The second chapter is entitled "Is trade bad for the Environment? Decomposing worldwide SO2 emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labor that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition). We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes.
Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. In a first experiment, we construct a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, the effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible levels of SO2 emissions. These hypothetical levels are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension.
This allows us to write present productivity formally as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions. First, it provides new estimates of orders of magnitude for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
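The center-of-mass construction behind the last chapter can be sketched as follows: treat each city as a point mass on the unit sphere, average the 3-D position vectors with economic weights, and project the result back to the surface. The cities and weights below are invented for illustration, not the thesis data.

```python
# Hedged sketch: an "economic center of gravity" as the weighted center of
# mass of cities on a sphere, projected back to the surface.
import numpy as np

cities = {            # (lat, lon) in degrees, weight in arbitrary GDP units
    "New York": (40.7, -74.0, 1.0),
    "London":   (51.5,  -0.1, 0.8),
    "Tokyo":    (35.7, 139.7, 0.9),
    "Shanghai": (31.2, 121.5, 0.7),
}

def center_of_gravity(cities):
    xyz, w = [], []
    for lat, lon, weight in cities.values():
        la, lo = np.radians(lat), np.radians(lon)
        xyz.append([np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo), np.sin(la)])
        w.append(weight)
    c = np.average(xyz, axis=0, weights=w)   # center of mass (lies inside the sphere)
    c = c / np.linalg.norm(c)                # project back onto the surface
    return np.degrees(np.arcsin(c[2])), np.degrees(np.arctan2(c[1], c[0]))

lat_c, lon_c = center_of_gravity(cities)
```

Note that when the mass points are spread around the globe, the horizontal components largely cancel, so the projected point drifts toward high latitudes; tracking its longitude over time is what reveals the shift toward Asia.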
Abstract:
The nutritional state of the pineapple plant has a large effect on plant growth, fruit production, and fruit quality. The aim of this study was to assess the uptake, accumulation, and export of nutrients by the irrigated 'Vitória' pineapple plant during and at the end of its development. A randomized block design with four replications was used. The treatments were defined by different times of plant collection: at 270, 330, 390, 450, 510, 570, 690, 750, and 810 days after planting (DAP). The collected plants were separated into the following components: leaves, stem, roots, fruit, and slips, for determination of fresh weight and of dry weight after drying at 65 °C. After drying, the plant components were ground for characterization of the composition and content of nutrients taken up and exported by the pineapple plant. The results were subjected to analysis of variance, and non-linear regression models were fitted where significant differences were identified by the F test (p<0.01). The leaves and the stem were the plant components that showed the greatest accumulation of nutrients. For a fruit yield of 72 t ha-1, macronutrient accumulation in the 'Vitória' pineapple exhibited the following decreasing order: K > N > S > Ca > Mg > P, corresponding to 898, 452, 134, 129, 126, and 107 kg ha-1 of total accumulation, respectively. The export of macronutrients by the pineapple fruit followed the decreasing order K > N > S > Ca > P > Mg, equivalent to 18, 17, 11, 8, 8, and 5 % of the totals accumulated by the pineapple, respectively. The 'Vitória' pineapple plant exported 78 kg ha-1 of N, 8 kg ha-1 of P, 164 kg ha-1 of K, 14 kg ha-1 of S, 10 kg ha-1 of Ca, and 6 kg ha-1 of Mg in the fruit.
The nutrient content exported in the fruit represents an important component of nutrient extraction from the soil, which needs to be restored, while the nutrients contained in the leaves, stem, and roots can be returned to the soil through a program of crop-residue recycling.
Abstract:
Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting, using solid-phase microextraction (SPME) followed by gas chromatography (GC), has been proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks which render them inadequate for assessing the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was developed and applied to a hypothetical scenario in which alternative hypotheses about the discharge time of a spent cartridge found at a crime scene were put forward. In order to estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
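A minimal sketch of the likelihood-ratio reasoning, assuming (purely for illustration) an exponential signal decay and log-normal measurement error; the decay and spread parameters are invented, not the paper's fitted values.

```python
# Hedged sketch: likelihood ratio for two hypothesized discharge times,
# under an assumed exponential-decay / log-normal error model.
import math

def lr(measurement, t1_hours, t2_hours, a=10.0, k=0.05, sigma=0.3):
    """LR = p(measurement | discharged t1 ago) / p(measurement | discharged t2 ago)."""
    def loglik(t):
        mu = math.log(a * math.exp(-k * t))   # expected log-signal at age t
        z = (math.log(measurement) - mu) / sigma
        return -0.5 * z * z                   # normal log-density; constants cancel in the ratio
    return math.exp(loglik(t1_hours) - loglik(t2_hours))

# A high residue signal supports the recent-discharge hypothesis (LR >> 1).
ratio = lr(measurement=8.0, t1_hours=2.0, t2_hours=48.0)
```

An LR above 1 supports the first hypothesis (recent discharge), below 1 the alternative; this is the "evidential value" framing the paper advocates over fixed decision thresholds.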
Abstract:
When dealing with multi-angular image sequences, problems of reflectance changes arise naturally, due either to illumination and acquisition geometry or to interactions with the atmosphere. These phenomena interplay with the scene and modify the measured radiance: for example, depending on the acquisition angle, tall objects may be seen from the top or from the side, and different light scatterings may affect the surfaces. This results in shifts in the acquired radiance that make the problem of multi-angular classification harder and may lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align data structures. This method maximizes the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.
Abstract:
Background: MLPA is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold using different tolerance intervals that accommodate the specific random-error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which the targeted regions are variable in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge or autism, and it showed the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds for deciding whether a region is altered. These thresholds are specific to each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data have revealed.
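The sample-specific thresholding idea can be sketched as follows, replacing the mixed-model machinery with a simple robust estimate of one sample's error variability; the tolerance factor k and the simulated probe ratios are assumptions for illustration only.

```python
# Hedged sketch: flag a probe as altered when its normalized ratio falls
# outside a tolerance band built from that sample's own variability.
import numpy as np

rng = np.random.default_rng(2)
ratios = np.concatenate([rng.normal(1.0, 0.05, 40),   # 40 copy-number-normal probes
                         [1.6, 0.45]])                # one duplication, one deletion

center = np.median(ratios)
spread = 1.4826 * np.median(np.abs(ratios - center))  # robust SD via the MAD
k = 4.0                                               # tolerance factor (assumed)
altered = np.where(np.abs(ratios - center) > k * spread)[0]
```

Because center and spread are recomputed per sample, a noisy run gets a wider band than a clean one, which is the intuition behind the paper's sample-specific tolerance intervals.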
Abstract:
A key and relatively unexplored question in the production of health concerns the influence of socioeconomic and environmental factors on weight and obesity. This issue is particularly relevant when comparing two Mediterranean countries such as Italy and Spain. Interestingly, obesity in Spain was 5 percentage points higher in 2003, whereas in 1990 it was roughly the same in both countries. This article presents a non-linear decomposition of the gaps in rates of overweight (body mass index, BMI, between 25 and 29.9 kg/m2), class 1 obesity (BMI ≥ 30 kg/m2) and class 2 obesity (BMI ≥ 35 kg/m2) between Spain and Italy by gender and age group. In explaining these cross-country gaps we isolate the influence of lifestyles, socioeconomic effects and environmental effects. Our results indicate that when environmental (peer) effects are not controlled for, eating habits and educational attainment are the main predictors of the total gaps between countries (36-52%), although these two factors exert a differentiated impact by gender and age. Somewhat paradoxically, when we control for peer effects these predictors lose their explanatory power, and peer effects come to explain between 46% and 76% of the overweight and obesity gaps, showing a pattern that increases with age.
Abstract:
This paper presents multiple kernel learning (MKL) regression as an exploratory spatial data analysis and modelling tool. The MKL approach is introduced as an extension of support vector regression, where MKL uses dedicated kernels to divide a given task into sub-problems and to treat them separately in an effective way. It provides better interpretability for non-linear robust kernel regression at the cost of a more complex numerical optimization. In particular, we investigate the use of MKL as a tool that allows us to avoid using ad hoc topographic indices as covariates in statistical models in complex terrain. Instead, MKL learns these relationships from the data in a non-parametric fashion. A study on data simulated from real terrain features confirms the ability of MKL to enhance the interpretability of data-driven models and to aid feature selection without degrading predictive performance. We also examine the stability of the MKL algorithm with respect to the number of training samples and to the presence of noise. The results of a real case study are also presented, where MKL is able to exploit a large set of terrain features computed at multiple spatial scales when predicting mean wind speed in an Alpine region.
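A toy sketch of the combined-kernel idea, assuming two feature groups and a single convex mixing weight chosen by grid search; real MKL optimizes the kernel weights jointly within the learning problem, and the features and target below are synthetic stand-ins for terrain covariates.

```python
# Hedged sketch: kernel ridge regression on a convex combination of two
# RBF kernels, one per feature group, with the mixing weight grid-searched.
import numpy as np

rng = np.random.default_rng(3)
X1 = rng.normal(size=(80, 2))   # e.g. fine-scale terrain features (assumed)
X2 = rng.normal(size=(80, 2))   # e.g. coarse-scale terrain features (assumed)
y = np.sin(X1[:, 0]) + 0.1 * X2[:, 1] + 0.05 * rng.normal(size=80)

def rbf(A, B, gamma=0.5):
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def fit_predict(mu, lam=1e-2):
    K = mu * rbf(X1, X1) + (1.0 - mu) * rbf(X2, X2)  # convex kernel combination
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return K @ alpha

errors = {mu: float(np.mean((fit_predict(mu) - y) ** 2)) for mu in (0.0, 0.5, 1.0)}
best_mu = min(errors, key=errors.get)
```

The learned weight on each kernel is what gives MKL its interpretability: a feature group whose kernel receives near-zero weight contributes little, which is how MKL can stand in for feature selection.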
Abstract:
The resilient modulus (MR) input parameters in the Mechanistic-Empirical Pavement Design Guide (MEPDG) program have a significant effect on the projected pavement performance. The MEPDG program uses three different levels of inputs depending on the desired level of accuracy. The primary objective of this research was to develop a laboratory testing program utilizing the Iowa DOT servo-hydraulic machine system for evaluating typical Iowa unbound materials and to establish a database of input values for MEPDG analysis. This was achieved by carrying out a detailed laboratory testing program designed in accordance with the AASHTO T307 resilient modulus test protocol using common Iowa unbound materials. The program included laboratory tests to characterize basic physical properties of the unbound materials, specimen preparation, and repeated load triaxial tests to determine the resilient modulus. The MEPDG resilient modulus input parameter library for typical Iowa unbound pavement materials was established from the repeated load triaxial MR test results. This library includes the non-linear, stress-dependent resilient modulus model coefficient values for level 1 analysis, the unbound material property values correlated to resilient modulus for level 2 analysis, and the typical resilient modulus values for level 3 analysis. The resilient modulus input parameter library can be utilized when designing low-volume roads in the absence of any basic soil testing. Based on the results of this study, the use of level 2 analysis for the MEPDG resilient modulus input is recommended, since the repeated load triaxial test for level 1 analysis is complicated, time-consuming, expensive, and requires sophisticated equipment and skilled operators.
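The level 1 coefficients refer to the stress-dependent model commonly written as M_R = k1 · p_a · (θ/p_a)^k2 · (τ_oct/p_a + 1)^k3, where θ is the bulk stress, τ_oct the octahedral shear stress, and p_a atmospheric pressure. A minimal sketch, with placeholder coefficient values rather than the Iowa library values:

```python
# Hedged sketch: the MEPDG "universal" stress-dependent resilient modulus
# model. Coefficients k1-k3 below are illustrative placeholders.
def resilient_modulus(theta_kpa, tau_oct_kpa, k1=800.0, k2=0.6, k3=-0.1, pa=101.325):
    """M_R in kPa from bulk stress theta and octahedral shear stress tau_oct."""
    return k1 * pa * (theta_kpa / pa) ** k2 * (tau_oct_kpa / pa + 1.0) ** k3

mr = resilient_modulus(theta_kpa=208.0, tau_oct_kpa=48.6)
```

With k2 > 0 the model captures stiffening under confinement (granular behavior), while k3 < 0 captures softening under shear (fine-grained behavior); level 1 analysis fits k1-k3 to repeated load triaxial data.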
Abstract:
In swarm robotics, communication among the robots is essential. Inspired by biological swarms using pheromones, we propose the use of chemical compounds to realize group foraging behavior in robot swarms. We designed a fully autonomous robot, and then created a swarm using ethanol as the trail pheromone allowing the robots to communicate with one another indirectly via pheromone trails. Our group recruitment and cooperative transport algorithms provide the robots with the required swarm behavior. We conducted both simulations and experiments with real robot swarms, and analyzed the data statistically to investigate any changes caused by pheromone communication in the performance of the swarm in solving foraging recruitment and cooperative transport tasks. The results show that the robots can communicate using pheromone trails, and that the improvement due to pheromone communication may be non-linear, depending on the size of the robot swarm.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulation through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
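The conditional-simulation step can be sketched with a 2-D kernel density: estimate the joint density of co-located log conductivities, then draw hydraulic conductivity from the conditional slice at an observed electrical conductivity. The synthetic relation below is an assumption for illustration, not the paper's field data.

```python
# Hedged sketch: non-parametric kernel density relating electrical and
# hydraulic conductivity, used to simulate K where only sigma is measured.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
log_sigma = rng.normal(-2.0, 0.5, 300)               # log electrical conductivity (synthetic)
log_k = 1.5 * log_sigma + rng.normal(0.0, 0.2, 300)  # co-located log hydraulic conductivity

kde = gaussian_kde(np.vstack([log_sigma, log_k]))    # joint density p(sigma, K)

def sample_k_given_sigma(sigma_obs, grid=np.linspace(-8.0, 2.0, 400), rng=rng):
    """Draw log K from the conditional p(K | sigma) on a discretized grid."""
    dens = kde(np.vstack([np.full_like(grid, sigma_obs), grid]))
    p = dens / dens.sum()
    return float(rng.choice(grid, p=p))

k_draw = sample_k_given_sigma(-2.0)
```

In the sequential simulation, such conditional draws are made location by location (conditioned also on previously simulated neighbors), which is what propagates the local-scale relation to the regional grid.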
Abstract:
Mycophenolate mofetil (MMF), an ester prodrug of the immunosuppressant mycophenolic acid (MPA), is widely used for maintenance immunosuppressive therapy and the prevention of renal allograft rejection in renal transplant recipients. MPA inhibits inosine monophosphate dehydrogenase (IMPDH), an enzyme involved in the de novo synthesis of purine nucleotides, thus suppressing both T-cell and B-cell proliferation. MPA shows complex pharmacokinetics, with considerable between- and within-patient variability in MPA exposure, to which several factors may contribute. Pharmacokinetic modeling following the population approach, using non-linear mixed-effects models, has proven to be a powerful tool to describe the relationship between MMF doses and MPA exposure and to identify patient demographic and clinical characteristics that may predict dose requirements during post-transplant immunosuppressive treatment.
Abstract:
A linear programming model is used to optimally assign highway segments to highway maintenance garages using existing facilities. The model is also used to determine possible operational savings or losses associated with four alternatives for expanding, closing and/or relocating some of the garages in a study area. The study area contains 16 highway maintenance garages and 139 highway segments. The study recommends alternative No. 3 (close the Tama and Blairstown garages and relocate to a new garage at the junction of U.S. 30 and Iowa 21) at an annual operational savings of approximately $16,250. These operational savings, however, are only guidelines for decision-makers and are subject to the assumptions of the model used and the limitations of the study.
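The assignment formulation behind such a model can be sketched as a small transportation-type LP: choose x[i, j], the share of segment i maintained by garage j, to minimize total cost subject to full coverage of every segment and a workload cap per garage. The costs and capacities below are toy values, not the study's data.

```python
# Hedged sketch: segment-to-garage assignment as a linear program.
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],   # cost[i, j]: segment i served by garage j
                 [5.0, 3.0, 7.0],
                 [8.0, 5.0, 2.0],
                 [6.0, 8.0, 3.0]])
n_seg, n_gar = cost.shape
cap = np.array([2.0, 2.0, 2.0])     # max segments per garage (toy capacities)

# Equality constraints: each segment's assignment shares sum to 1.
A_eq = np.zeros((n_seg, n_seg * n_gar))
for i in range(n_seg):
    A_eq[i, i * n_gar:(i + 1) * n_gar] = 1.0
# Inequality constraints: each garage's workload stays within capacity.
A_ub = np.zeros((n_gar, n_seg * n_gar))
for j in range(n_gar):
    A_ub[j, j::n_gar] = 1.0

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=np.ones(n_seg),
              A_ub=A_ub, b_ub=cap, bounds=(0, 1))
total_cost = res.fun
```

Because this has transportation-problem structure, the LP relaxation yields integral assignments, so no explicit integer constraints are needed; comparing `total_cost` across garage configurations is how the alternatives' savings would be ranked.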