978 results for "Loi de puissance inverse"
Abstract:
A large-scale Chinese agricultural survey was conducted under the direction of John Lossing Buck from 1929 through 1933. At the end of the 1990s, parts of the original micro data from Buck's survey were discovered at Nanjing Agricultural University. An international joint study was begun to restore the micro data and to construct parts of a micro database covering both the crop yield survey and the special expenditure survey. This paper summarizes the characteristics of farmlands and cropping patterns in the crop yield micro data, which covered 2,102 farmers in 20 counties of 9 provinces. To test the classical hypothesis of whether an inverse relationship between land productivity and cultivated area can be observed in developing countries, a Box-Cox transformation test of functional form was conducted for five main crops in Buck's crop yield survey. The test shows that the relationship between land productivity and cultivated area is linear and somewhat negative for wheat and barley, while for rice, rapeseed, and seed cotton it appears slightly positive. It can be tentatively concluded that the relationship between cultivated area and land productivity is not the same across crops, and that differences in labor intensity and in the level of commercialization of each crop may be strongly related to the existence or non-existence of an inverse relationship.
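The Box-Cox approach described above can be illustrated with a minimal sketch on synthetic data (Buck's micro data are not reproduced here; the arrays `area` and `yield_per_mu` and all parameters are invented for illustration only):

```python
# Minimal sketch: testing for an inverse yield-area relationship via a
# Box-Cox transformed regression. Synthetic data, NOT Buck's survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
area = rng.lognormal(mean=1.0, sigma=0.5, size=200)            # cultivated area
yield_per_mu = 100 * area**-0.1 * rng.lognormal(0, 0.1, 200)   # mild inverse relation

# Box-Cox transform the dependent variable (lambda = 1 ~ linear,
# lambda = 0 ~ logarithmic), then regress transformed yield on area.
y_bc, lam = stats.boxcox(yield_per_mu)
slope, intercept, r, p, se = stats.linregress(area, y_bc)

print(f"Box-Cox lambda: {lam:.2f}")
print(f"slope: {slope:.3f} (negative => inverse relationship), p = {p:.3g}")
```

A negative, statistically significant slope under the selected transformation is what the abstract reports for wheat and barley; a positive one corresponds to the rice, rapeseed, and seed cotton cases.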
Abstract:
The optimum quality that can be asymptotically achieved in the estimation of a probability p using inverse binomial sampling is addressed. A general definition of quality is used in terms of the risk associated with a loss function that satisfies certain assumptions. It is shown that the limit superior of the risk for p asymptotically small has a minimum over all (possibly randomized) estimators. This minimum is achieved by certain non-randomized estimators. The model includes commonly used quality criteria as particular cases. Applications to the non-asymptotic regime are discussed considering specific loss functions, for which minimax estimators are derived.
Abstract:
The solubility parameters of two commercial SBS rubbers with different structures (linear and radial) and slightly different styrene contents have been determined by the inverse gas chromatography technique. The Flory–Huggins interaction parameters of several polymer–solvent mixtures have also been calculated. The influence of polymer composition, solvent molecular weight, and temperature on these parameters has been discussed; in addition, the parameters have been compared with previous values obtained from intrinsic viscosity measurements. From the Flory–Huggins interaction parameters, the infinite-dilution activity coefficients of the solvents have been calculated and fitted to the well-known NRTL model. These NRTL binary interaction parameters are of great importance in modelling the separation steps of the rubber production process.
Abstract:
Objective: This research is focused on the creation and validation of a solution to the inverse kinematics problem for a 6-degrees-of-freedom human upper limb. The system is intended to work within a real-time dysfunctional motion prediction system that allows anticipatory actuation in physical neurorehabilitation under the assisted-as-needed paradigm. For this purpose, a multilayer perceptron-based and an ANFIS-based solution to the inverse kinematics problem are evaluated. Materials and methods: Both the multilayer perceptron-based and the ANFIS-based inverse kinematics methods have been trained with three-dimensional Cartesian positions corresponding to the end-effector of healthy human upper limbs executing two different activities of daily living: "serving water from a jar" and "picking up a bottle". Validation of the proposed methodologies has been performed by a 10-fold cross-validation procedure. Results: Once trained, the systems are able to map 3D positions of the end-effector to the corresponding healthy biomechanical configurations. A high mean correlation coefficient and a low root mean squared error have been found for both the multilayer perceptron-based and ANFIS-based methods. Conclusions: The obtained results indicate that both systems effectively solve the inverse kinematics problem, but, due to its low computational load (crucial in real-time applications) and its high performance, a multilayer perceptron-based solution, consisting of 3 input neurons, 1 hidden layer with 3 neurons, and 6 output neurons, has been considered the most appropriate for the target application.
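As a rough illustration of the 3-3-6 perceptron architecture selected above, the following sketch trains scikit-learn's `MLPRegressor` on synthetic data. The toy forward-kinematics model, its parameters, and the data sizes are all invented for illustration; the clinical motion-capture data used in the study are not available here.

```python
# Hedged sketch of a 3-input / 3-hidden / 6-output MLP for inverse
# kinematics, trained on synthetic data (toy forward model, not the
# paper's biomechanical model).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
angles = rng.uniform(-1.0, 1.0, size=(2000, 6))   # 6-DoF joint configurations
W = rng.normal(size=(6, 3))
positions = np.tanh(angles @ W)                   # toy forward kinematics: angles -> 3D position

# 3 input neurons (x, y, z), one hidden layer of 3 neurons, 6 outputs (angles)
mlp = MLPRegressor(hidden_layer_sizes=(3,), max_iter=2000, random_state=0)
mlp.fit(positions, angles)

predicted_angles = mlp.predict(positions[:5])     # maps 3D positions back to configurations
print(predicted_angles.shape)
```

Note that a real 6-DoF inverse kinematics mapping is one-to-many, which is why the study trains on recorded healthy configurations rather than on an analytic forward model.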
Abstract:
Sequential estimation of the success probability p in inverse binomial sampling is considered in this paper. For any estimator p̂, its quality is measured by the risk associated with normalized loss functions of linear-linear or inverse-linear form. These functions are possibly asymmetric, with arbitrary slope parameters a and b for p̂ < p and p̂ > p, respectively. Interest in these functions is motivated by their significance and potential uses, which are briefly discussed. Estimators are given for which the risk has an asymptotic value as p → 0, and which guarantee that, for any p ∈ (0,1), the risk is lower than its asymptotic value. This allows selecting the required number of successes, r, to meet a prescribed quality irrespective of the unknown p. In addition, the proposed estimators are shown to be approximately minimax when a/b does not deviate too much from 1, and asymptotically minimax as r → ∞ when a = b.
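For context, inverse binomial sampling observes Bernoulli(p) trials until a fixed number r of successes occurs. A minimal sketch using the classical unbiased estimator p̂ = (r − 1)/(n − 1), where n is the total number of trials; the risk-optimal estimators analyzed in the paper differ from this and are not reproduced here:

```python
# Sketch of inverse binomial (negative binomial) sampling: draw
# Bernoulli(p) trials until r successes, then estimate p with the
# classical unbiased estimator (r - 1) / (n - 1).
import random

def inverse_binomial_estimate(p, r, rng):
    """Sample until r successes are observed; return (r-1)/(n-1)."""
    n = successes = 0
    while successes < r:
        n += 1
        if rng.random() < p:
            successes += 1
    return (r - 1) / (n - 1)

# Averaging many replications illustrates unbiasedness for a small p.
estimates = [inverse_binomial_estimate(0.05, 20, random.Random(s)) for s in range(200)]
print(sum(estimates) / len(estimates))  # close to the true p = 0.05
```

Fixing r in advance is what allows the quality guarantee to hold irrespective of the unknown p, since the number of trials n adapts automatically to how rare successes are.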
Abstract:
Inverse bremsstrahlung has been incorporated into an analytical model of the expanding corona of a laser-irradiated spherical target. Absorption decreases slowly with increasing intensity, in agreement with some numerical simulations, and contrary to estimates from simple models in use up to now, which are optimistic at low values of intensity and very pessimistic at high values. Present results agree well with experimental data from many laboratories; substantial absorption is found up to moderate intensities, say below 10¹⁵ W cm⁻² for 1.06 μm light. Anomalous absorption, when included in the analysis, leaves the ablation pressure and mass ablation rate practically unaffected, for a given absorbed intensity. Universal results are given in dimensionless form.
Abstract:
There is general agreement within the scientific community that Biology is the science with the greatest potential for development in the 21st century. This is due to several reasons, but probably the most important one is the state of development of the other experimental and technological sciences. In this context, a rich variety of mathematical tools, physical techniques, and computing resources now permits biological experiments that were unthinkable only a few years ago. Biology is taking advantage of these newly developed technologies, which are being applied to the life sciences, opening new research fields and giving new insight into many biological problems. Consequently, biologists have greatly improved their knowledge in key areas such as human function and human disease. However, one human organ is still barely understood compared with the rest: the human brain. Understanding the human brain is one of the main challenges of the 21st century, and it is considered a strategic research field by both the European Union and the USA. There is therefore great interest in applying new experimental techniques to the study of brain function. Magnetoencephalography (MEG) is one such novel technique, currently applied to mapping brain activity. It has important advantages over metabolic brain imaging techniques such as functional Magnetic Resonance Imaging (fMRI): MEG offers higher time resolution, and it is a patient-friendly clinical technique, since the measurement is performed with a wireless setup and the patient is not exposed to any radiation. Although MEG is widely applied in clinical studies, open issues remain regarding data analysis. The present work deals with the solution of the inverse problem in MEG, which is the most controversial and uncertain part of the analysis process.
This question is addressed using several variations of a new solving algorithm based on a heuristic method. The performance of these methods is analyzed by applying them to several test cases with known solutions and comparing those known solutions with the ones provided by our methods.
Abstract:
In this paper we propose a novel fast random search clustering (RSC) algorithm for mixing matrix identification in multiple-input multiple-output (MIMO) linear blind inverse problems with sparse inputs. The proposed approach is based on the clustering of the observations around the directions given by the columns of the mixing matrix, which typically occurs for sparse inputs. Exploiting this fact, the RSC algorithm proceeds by parameterizing the mixing matrix using hyperspherical coordinates, randomly selecting candidate basis vectors (i.e. clustering directions) from the observations, and accepting or rejecting them according to a binary hypothesis test based on the Neyman–Pearson criterion. The RSC algorithm is not tailored to any specific distribution of the sources, can deal with an arbitrary number of inputs and outputs (thus solving the difficult under-determined problem), and is applicable to both instantaneous and convolutive mixtures. Extensive simulations on synthetic and real data with different numbers of inputs and outputs, data sizes, sparsity factors of the inputs, and signal-to-noise ratios confirm the good performance of the proposed approach under moderate-to-high signal-to-noise ratios.
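The clustering intuition behind the algorithm can be sketched as follows. This is a deliberately simplified stand-in: the acceptance rule uses ad-hoc thresholds rather than the paper's Neyman–Pearson test, and the 2×2 mixing matrix, sparsity level, and all other parameters are invented.

```python
# Simplified sketch of the RSC clustering idea: for sparse sources,
# observations concentrate along the columns of the mixing matrix A.
# Candidate directions drawn from the data are accepted when enough
# observations fall within a tight angular neighborhood (ad-hoc
# thresholds stand in for the paper's Neyman-Pearson test).
import numpy as np

rng = np.random.default_rng(7)
A = np.array([[0.90, 0.30],
              [0.44, 0.95]])                               # true mixing matrix (columns = directions)
S = rng.laplace(size=(2, 5000)) * (rng.random((2, 5000)) < 0.1)  # sparse sources (10% active)
X = A @ S
X = X[:, np.abs(X).sum(axis=0) > 1e-6]                     # keep samples with some activity

U = X / np.linalg.norm(X, axis=0)                          # unit-norm observation directions
accepted = []
for _ in range(200):
    c = U[:, rng.integers(U.shape[1])]                     # random candidate from the data
    frac = np.mean(np.abs(U.T @ c) > 0.99)                 # fraction of mass near +-candidate
    # accept if the cluster is big enough and not already represented
    if frac > 0.3 and all(abs(c @ a) < 0.9 for a in accepted):
        accepted.append(c)

print("estimated mixing directions:", len(accepted))
```

With mostly single-active samples, the surviving observations line up (up to sign) with the columns of A, so the two column directions are the only candidates that accumulate enough angular mass to pass the test.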
Abstract:
We give necessary and sufficient conditions for the convergence with geometric rate of the common denominators of simultaneous rational interpolants with a bounded number of poles. The conditions are expressed in terms of intrinsic properties of the system of functions used to build the approximants. Exact rates of convergence for these denominators and the simultaneous rational approximants are provided.