30 results for Dependent variable problem
Abstract:
We say the endomorphism problem is solvable for an element W in a free group F if it can be decided effectively whether, given U in F, there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. Here we prove that the approach provides an efficient algorithm for solving the endomorphism problem when W is a two-generator word: the algorithm then solves the problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other side.
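For concreteness, the decision problem can be stated as follows (a minimal formulation consistent with the abstract; the notation is ours):

\[
\text{given } U \in F, \text{ decide whether there exists } \Phi \in \mathrm{End}(F) \text{ with } \Phi(W) = U.
\]

Equivalently, for a two-generator word $W(x,y)$ this amounts to solving the equation $W(x,y) = U$ in $F$, with the variables $x, y$ on one side and the constant $U$ on the other.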
Abstract:
In this paper we consider extensions of smooth transition autoregressive (STAR) models to situations where the threshold is a time-varying function of variables that affect the separation of regimes of the time series under consideration. Our specification is motivated by the observation that unusually high/low values for an economic variable may sometimes be best thought of in relative terms. State-dependent logistic STAR and contemporaneous-threshold STAR models are introduced and discussed. These models are also used to investigate the dynamics of U.S. short-term interest rates, where the threshold is allowed to be a function of past output growth and inflation.
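As a point of reference, a standard logistic STAR model has the form

\[
y_t = \phi_1' x_t \bigl(1 - G(s_t;\gamma,c)\bigr) + \phi_2' x_t\, G(s_t;\gamma,c) + \varepsilon_t,
\qquad
G(s;\gamma,c) = \frac{1}{1 + e^{-\gamma (s - c)}},
\]

and the extension described here replaces the fixed threshold $c$ with a time-varying $c_t = f(z_t)$, where $z_t$ collects variables such as past output growth and inflation. This is a sketch of the general idea; the paper's exact parameterization may differ.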
Abstract:
The main result is a proof of the existence of a unique viscosity solution for the Hamilton-Jacobi equation in the case where the Hamiltonian is discontinuous with respect to one of its variables, usually interpreted as the spatial one. The generalized solution obtained is continuous, but not necessarily differentiable.
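Schematically (in our notation), the equation in question is of the form

\[
u_t + H(x, D_x u) = 0 \quad \text{in } (0,T) \times \mathbb{R}^n, \qquad u(0, \cdot) = u_0,
\]

where $p \mapsto H(x,p)$ is well behaved but $x \mapsto H(x,p)$ may be discontinuous.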
Abstract:
The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, some free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system, which are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be efficiently solved using convex programming techniques. Two numerical examples are given to illustrate the efficiency of the proposed method.
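A generic neutral system of the kind described (an illustrative form; the paper's exact model may include further terms) is

\[
\dot{x}(t) - C\,\dot{x}\bigl(t - \tau(t)\bigr)
= A\,x(t) + B\,x\bigl(t - h(t)\bigr)
+ D \int_{t - r(t)}^{t} x(s)\,ds
+ f\bigl(t, x(t), x(t - h(t))\bigr),
\]

with time-varying neutral delay $\tau(t)$, range-bounded discrete delay $h_1 \le h(t) \le h_2$, distributed delay $r(t)$, and nonlinear perturbation $f$; the LMI conditions then depend on $\tau$, on the range $[h_1, h_2]$, and on $r$.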
Abstract:
Business process designers take into account the resources that the processes will need but, owing to the variable cost of certain parameters (such as energy) and other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to the problem, together with a comparison of the two approaches and of the algorithms proposed to solve them.
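A rough sketch of such a cost function (the slot-based pricing scheme and all names are our own simplification for illustration, not the report's formalization):

```python
# Hypothetical sketch of an energy-aware resource cost with a
# time-dependent rate (energy price varies by time slot) and a
# usage-dependent rate (cost proportional to how long the resource is held).

def activity_cost(start, duration, usage_rate, price_per_slot, slot_len=1):
    """Total cost = usage charge + energy charge over the occupied slots.

    price_per_slot: list of energy prices, one per time slot (assumed given).
    usage_rate: cost per time unit for holding the resource.
    """
    energy = 0.0
    t = start
    while t < start + duration:
        slot = int(t // slot_len)                      # current pricing slot
        step = min(slot_len - (t % slot_len), start + duration - t)
        energy += price_per_slot[slot] * step          # pay slot price pro rata
        t += step
    return usage_rate * duration + energy

# Example: a 3-unit activity starting at t = 1.5 under prices [2, 5, 5, 1, 1]
print(activity_cost(start=1.5, duration=3.0, usage_rate=0.5,
                    price_per_slot=[2, 5, 5, 1, 1]))
```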
Abstract:
In today's competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need to develop efficient methods for solving complex scheduling problems. In this paper we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems and are usually associated with cleaning operations and with changing tools and shapes in machines. The first problem considered is single-machine scheduling with release dates, sequence-dependent setup times, and delivery times; the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
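As one concrete illustration for the first problem, a dispatching rule can be sketched as follows (the rule shown, largest delivery time first with a smallest-setup tie-breaker, is illustrative and not necessarily one of those studied in the paper):

```python
# Single machine, jobs with processing times p[j], release dates r[j],
# delivery times q[j], and sequence-dependent setups s[i][j]; the objective
# is Lmax = max_j (C_j + q_j). The setup is assumed to start only once both
# the machine and the job are available (a simplification).

def dispatch(p, r, q, s):
    unscheduled = set(range(len(p)))
    t, last, lmax, seq = 0, None, float("-inf"), []
    while unscheduled:
        # jobs already released; if none, jump to the earliest release
        ready = [j for j in unscheduled if r[j] <= t] or \
                [min(unscheduled, key=lambda j: r[j])]
        j = max(ready, key=lambda j: (q[j],
                -(s[last][j] if last is not None else 0)))
        start = max(t, r[j]) + (s[last][j] if last is not None else 0)
        t = start + p[j]
        lmax = max(lmax, t + q[j])
        seq.append(j)
        last = j
        unscheduled.remove(j)
    return seq, lmax

# Toy instance with 3 jobs
p = [4, 2, 3]                          # processing times
r = [0, 1, 3]                          # release dates
q = [5, 8, 2]                          # delivery times
s = [[0, 1, 2], [1, 0, 1], [2, 2, 0]]  # setup s[i][j] when j follows i
print(dispatch(p, r, q, s))
```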
Abstract:
This paper analyses the effect of unmet formal care needs on informal caregiving hours in Spain using the two waves of the Informal Support Survey (1994, 2004). Testing for double sample selection from formal care receipt and the emergence of unmet needs provides evidence that the omission of either variable would cause underestimation of the number of informal caregiving hours. After controlling for these two factors, the number of hours of care increases with both the degree of dependency and unmet needs. More importantly, in the presence of unmet needs, the number of informal caregiving hours increases when some formal care is received. This result refutes the substitution model and supports complementarity or task specificity between the two types of care. For a given combination of formal care and unmet needs, informal caregiving hours increased between 1994 and 2004. Finally, in the model for 2004, the selection term associated with the unmet needs equation is larger than that of the formal care equation, suggesting that using the number of formal care recipients as a quality indicator may be misleading unless this information is complemented with other quality indicators.
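Schematically, the double-selection setup is of Heckman type (a generic sketch in our notation, not the paper's exact specification):

\[
d_F^* = z_F'\alpha_F + u_F, \qquad
d_U^* = z_U'\alpha_U + u_U, \qquad
h = x'\beta + \theta_F \lambda_F + \theta_U \lambda_U + \varepsilon,
\]

where $d_F = \mathbb{1}\{d_F^* > 0\}$ indicates formal care receipt, $d_U = \mathbb{1}\{d_U^* > 0\}$ the emergence of unmet needs, $h$ is the number of informal caregiving hours, and $\lambda_F, \lambda_U$ are the selection-correction terms whose omission would bias the estimated hours downward.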
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
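The estimate in question is of the location-dependent ("balloon") type, one common form of variable bandwidth estimate (the paper's class of data-based bandwidths is more general):

\[
f_n(x) = \frac{1}{n\,h(x)} \sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h(x)}\right),
\]

and, in the spirit of the abstract, the negative result says that for any data-based choice of $x \mapsto h(x)$ there is a density $f$ along which

\[
\frac{\mathbb{E} \int |f_n - f|}{\inf_{h(\cdot)} \mathbb{E} \int |f_{n,h} - f|} \to \infty,
\]

the infimum being taken with knowledge of $f$.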
Abstract:
The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected "skeleton" of the class of regression functions. The skeleton is a covering of the class based on a data-dependent metric, especially fitted for classification. A new scale-sensitive dimension is introduced which is more useful for the studied classification problem than other, previously defined, dimension measures. This fact is demonstrated by performance bounds for the skeleton estimate in terms of the new dimension.
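In outline (our notation, one natural reading of the construction): the skeleton $S_n$ is a finite $\varepsilon$-cover of the class under an empirical, data-dependent metric such as

\[
d_n(g, g') = \frac{1}{n} \sum_{i=1}^{n} \bigl| g(X_i) - g'(X_i) \bigr|,
\]

and the estimate minimizes the empirical risk over the classifiers induced by the skeleton members:

\[
\hat g = \arg\min_{g \in S_n} \frac{1}{n} \sum_{i=1}^{n} \mathbb{1}\bigl\{\, \mathbb{1}\{ g(X_i) > 1/2 \} \ne Y_i \,\bigr\}.
\]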
Abstract:
We show that time-dependent couplings may lead to nontrivial asymptotic scaling properties of the surface fluctuations in nonequilibrium kinetic roughening models. Three typical situations are studied. In the case of a crossover between two different rough regimes, the time-dependent coupling may result in anomalous scaling for scales above the crossover length. In a different setting, for a crossover from a rough to either a flat or damping regime, the time-dependent crossover length may conspire to produce a rough surface, although the most relevant term tends to flatten the surface. In addition, our analysis sheds light on an existing debate in the problem of spontaneous imbibition, where time-dependent couplings naturally arise in theoretical models and experiments.
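The reference point here is Family-Vicsek scaling of the surface width, the standard form against which anomalous scaling is measured:

\[
W(L, t) \sim
\begin{cases}
t^{\beta}, & t \ll L^{z},\\[2pt]
L^{\alpha}, & t \gg L^{z},
\end{cases}
\qquad \beta = \alpha/z;
\]

anomalous scaling means the fluctuations fail to follow this form, e.g. local and global roughness exponents differ. A time-dependent coupling makes the crossover length itself a function of time, which underlies the three situations studied.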
Abstract:
In this paper we address the problem of consistently constructing Langevin equations to describe fluctuations in nonlinear systems. Detailed balance severely restricts the choice of the random force, but we prove that this property, together with the macroscopic knowledge of the system, is not enough to determine all the properties of the random force. If the cause of the fluctuations is weakly coupled to the fluctuating variable, then the statistical properties of the random force can be completely specified. For variables odd under time reversal, microscopic reversibility and weak coupling impose symmetry relations on the variable-dependent Onsager coefficients. We then analyze the fluctuations in two cases: Brownian motion in position space and an asymmetric diode, for which the analysis based on the master equation approach is known. We find that, to the order of validity of the Langevin equation proposed here, the phenomenological theory is in agreement with the results predicted by more microscopic models.
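Schematically, the Langevin equations in question have the form (our notation)

\[
\dot a = A(a) + B(a)\,\xi(t), \qquad \langle \xi(t) \rangle = 0, \qquad \langle \xi(t)\,\xi(t') \rangle = \delta(t - t'),
\]

with a variable-dependent noise amplitude $B(a)$; detailed balance constrains $A$ and $B$ through the stationary distribution but, as stated above, does not by itself fix all statistical properties of the random force.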
Abstract:
We present a heuristic method for learning error correcting output codes matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, the optimal codeword separation is sacrificed in favor of a maximum class discrimination in the partitions. The creation of the hierarchical partition set is performed using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated using the UCI database and applied to a real problem, the classification of traffic sign images.
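A minimal sketch of the matrix construction (the tree below is hand-made; in the method described, the splits would instead be chosen to maximize the discriminative criterion):

```python
# Turn a binary tree partition of the class set into an ECOC coding matrix.
# Each internal node contributes one column (one dichotomizer): classes in
# the left partition are coded +1, classes in the right partition -1, and
# classes outside the node's subset 0.

import numpy as np

def flatten(node):
    # a leaf is a set of class labels; an internal node is a (left, right) pair
    return node if isinstance(node, set) else flatten(node[0]) | flatten(node[1])

def ecoc_from_tree(node, n_classes, columns=None):
    if columns is None:
        columns = []
    left, right = node
    col = np.zeros(n_classes, dtype=int)
    col[list(flatten(left))] = 1
    col[list(flatten(right))] = -1
    columns.append(col)
    for child in (left, right):
        if isinstance(child, tuple):        # recurse into internal nodes
            ecoc_from_tree(child, n_classes, columns)
    return np.column_stack(columns)

# 4 classes; root splits {0,1} vs {2,3}, then {0} vs {1} and {2} vs {3}
tree = (({0}, {1}), ({2}, {3}))
print(ecoc_from_tree(tree, n_classes=4))   # a 4 x 3 coding matrix
```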
Abstract:
Background: The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a method for the normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which the targeted regions vary in copy number in individuals suffering from different disorders such as Prader-Willi, DiGeorge or autism, where it showed the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data have revealed.
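The threshold rule can be sketched as follows (a generic tolerance-interval form; the paper's exact factors come from the mixed-model fit): a probe in a test sample is declared altered when its normalized ratio falls outside

\[
\bar{y} \pm k_{1-\alpha,\,P}\,\hat\sigma,
\]

where $\hat\sigma$ is the sample-specific random-error variability estimated from the model and $k_{1-\alpha,P}$ is the tolerance factor covering a proportion $P$ of the population with confidence $1-\alpha$.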
Abstract:
Most structure-building organisms in rocky benthic communities are surface-dependent because their energy inputs depend mainly on the surface they expose to the water. Two photosynthetic strategies (calcareous and non-calcareous algae), strict suspension feeders, and photosynthetic suspension feeders (e.g. hermatypic corals) are the four main strategies acquired through evolution by benthic organisms. Competition between these strategies occurs in relation to the productivity of the different species, in such a way that, for given environmental conditions, the species with the higher growth rate (P/B ratio) will dominate. At a worldwide scale, littoral marine benthos can be considered to fit into the four fields defined by two main axes: the first relates to productivity and runs from eutrophic to oligotrophic waters, and the second is defined by the degree of environmental variability or seasonality (from high to low). Coral reefs (marine ecosystems dominated by photosynthetic suspension feeders) develop in the space of oligotrophic areas with low variability, while kelp beds (marine ecosystems dominated by large, non-calcareous algae) are found only in eutrophic places with high variability. The field of eutrophic waters with low variability has no specially adapted, highly structured benthic marine ecosystems; in these conditions opportunistic algae and animals predominate. Finally, the photophilic Mediterranean benthos, devoid of kelps and without hermatypic corals, typifies the field of oligotrophic areas with high variability; in its most genuine aspect, Mediterranean benthos is represented by small algae with a high percentage of calcareous thalli. In all cases strict suspension feeders compete successfully with photosynthetic organisms only in situations of low irradiance or very high inputs of POM. In turn, Mediterranean rocky benthos, in spite of its relative uniformity, is geographically organized along the same axes. The Gulf of Lions and the insular bottoms (the Balearic Islands, for example) would correspond to the extremes of the eutrophic-high variability and oligotrophic-low variability areas, respectively. Irradiance, nutrient and POM concentration, and hydrodynamism are the three variables that mainly affect the distribution of the different surface-dependent strategies, and these parameters are thus of paramount interest for understanding the trophic structure of Mediterranean benthic communities. In environments not limited by light, nutrient availability, defined as the product of nutrient-POM concentration and hydrodynamism, determines the dominance of calcareous versus non-calcareous algae: calcareous algae dominate in oligotrophic waters while non-calcareous algae dominate in moderately eutrophic waters. In light-limited environments, passive suspension feeders (Octocorallia, gorgonians) become the dominant species if POM availability is enhanced by high hydrodynamism (strong currents); in waters with a low charge of POM, organisms of other groups, mainly active suspension feeders, predominate (sponges, bryozoans, scleractinians). In any case, there always exists a highly variable bathymetric zone, depending on light attenuation and nutrient-POM availability, where encrusting calcareous algae compete strongly with suspension feeders (the coralligenous).
Abstract:
The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned next to another at the same workstation, a setup time must be added to compute the global workstation time, thereby providing the task sequence inside each workstation. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which are an improvement upon heuristic procedures published to date.
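A minimal sketch of one such priority-rule-based procedure (the rule shown, longest task first, is just one illustrative choice among the many such rules; the instance data are hypothetical):

```python
# Station-oriented heuristic for GALBPS: tasks are assigned station by
# station. A candidate task is feasible if all its predecessors are already
# assigned and the station load, plus the setup after the task currently
# last in the station, plus its own time, stays within the cycle time.

def galbps_heuristic(p, prec, setup, cycle):
    """p[j]: task times; prec[j]: set of predecessors of j; setup[i][j]:
    setup incurred when j follows i in the same station; cycle: cycle time."""
    unassigned, assigned, stations = set(range(len(p))), set(), []
    while unassigned:
        load, last, seq = 0.0, None, []
        while True:
            cands = [j for j in unassigned
                     if prec[j] <= assigned
                     and load + (setup[last][j] if last is not None else 0)
                         + p[j] <= cycle]
            if not cands:
                break
            j = max(cands, key=lambda k: p[k])   # priority rule: longest task
            load += (setup[last][j] if last is not None else 0) + p[j]
            seq.append(j)
            last = j
            unassigned.remove(j)
            assigned.add(j)
        if not seq:
            raise ValueError("no assignable task fits in an empty station")
        stations.append(seq)
    return stations

p = [4, 3, 5, 2]                     # task times
prec = [set(), {0}, {0}, {1, 2}]     # precedence relations
setup = [[0, 1, 1, 2], [1, 0, 2, 1],
         [1, 2, 0, 1], [2, 1, 1, 0]] # setup[i][j] when j follows i
print(galbps_heuristic(p, prec, setup, cycle=8))   # e.g. [[0, 1], [2, 3]]
```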