866 results for Lower bounds
Abstract:
Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
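For readers wanting the shape of that variational transformation: the following summarizes the psi-transform and the resulting excess-risk bound in the paper's notation (a sketch of the published result, not a new derivation; phi denotes the surrogate loss and eta = P(Y = 1 | X = x) the posterior).

```latex
% Conditional surrogate risk and its optimal values:
%   C_\eta(\alpha) = \eta\,\varphi(\alpha) + (1-\eta)\,\varphi(-\alpha),
%   H(\eta)   = \inf_{\alpha} C_\eta(\alpha),
%   H^{-}(\eta) = \inf_{\alpha(2\eta-1)\le 0} C_\eta(\alpha).
% The \psi-transform and the quantitative relationship:
\psi(\theta) = H^{-}\!\left(\tfrac{1+\theta}{2}\right) - H\!\left(\tfrac{1+\theta}{2}\right),
\qquad
\psi\!\left(R(f)-R^{*}\right) \;\le\; R_{\varphi}(f) - R_{\varphi}^{*}.
% Worked instances: the hinge loss \varphi(\alpha)=\max(0,1-\alpha) gives
% \psi(\theta)=|\theta|; the exponential loss \varphi(\alpha)=e^{-\alpha}
% gives \psi(\theta)=1-\sqrt{1-\theta^{2}}.
```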
Abstract:
We investigate the use of certain data-dependent estimates of the complexity of a function class, called Rademacher and Gaussian complexities. In a decision theoretic setting, we prove general risk bounds in terms of these complexities. We consider function classes that can be expressed as combinations of functions from basis classes and show how the Rademacher and Gaussian complexities of such a function class can be bounded in terms of the complexity of the basis classes. We give examples of the application of these techniques in finding data-dependent risk bounds for decision trees, neural networks and support vector machines.
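As an illustration of how such a data-dependent quantity can be estimated, here is a minimal Monte Carlo sketch of the empirical Rademacher complexity for a finite function class (the finite-class restriction and all names are my own simplifications, not the paper's setup):

```python
import numpy as np

def empirical_rademacher(outputs, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ].

    outputs: (m, n) array; row j holds f_j(x_1), ..., f_j(x_n)
    for a finite class {f_1, ..., f_m} on a fixed n-point sample.
    """
    rng = np.random.default_rng(seed)
    m, n = outputs.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(outputs @ sigma) / n      # sup over the class
    return total / n_draws

# Toy usage: 50 random sign functions on a 100-point sample.
F = np.sign(np.random.default_rng(1).normal(size=(50, 100)))
print(empirical_rademacher(F))   # shrinks as n grows, grows with m
```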
Abstract:
The purpose of this preliminary study was to determine the relevance of categorizing load regime data to assess the functional output and usage of the prosthesis of lower-limb amputees. The objectives were (a) to introduce a categorization of load regime, (b) to present some descriptors of each activity, and (c) to report the results for one case. The load applied on the osseointegrated fixation of one transfemoral amputee was recorded using a portable kinetic system for 5 hours. The periods of directional locomotion, localized locomotion, and stationary loading occurred during 44%, 34%, and 22% of the recording time and accounted for 51%, 38%, and 12% of the duration of the periods of activity, respectively. The absolute maximum force during directional locomotion, localized locomotion, and stationary loading was 19%, 15%, and 8% of body weight on the anteroposterior axis, 20%, 19%, and 12% on the mediolateral axis, and 121%, 106%, and 99% on the long axis. A total of 2,783 gait cycles were recorded. Approximately 10% more gait cycles and 50% more of the total impulse were identified than with conventional analyses. The proposed categorization and apparatus have the potential to complement conventional instruments, particularly for difficult cases.
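For concreteness, descriptors such as the peak load (as a percentage of body weight) and the total impulse can be computed from a recorded force trace along one axis roughly as follows (an illustrative sketch, not the authors' recording pipeline; variable names are hypothetical):

```python
import numpy as np

def load_descriptors(t, force, body_weight_n):
    """Peak load as % of body weight and total impulse along one axis.
    t: sample times in seconds; force: force samples in newtons;
    body_weight_n: the amputee's body weight in newtons.
    """
    peak_pct_bw = 100.0 * np.max(np.abs(force)) / body_weight_n
    # Total impulse in N*s via the trapezoidal rule.
    impulse_ns = np.sum(0.5 * (np.abs(force[1:]) + np.abs(force[:-1]))
                        * np.diff(t))
    return peak_pct_bw, impulse_ns
```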
Abstract:
We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube, the one-inclusion graph. The first main result of this paper is a density bound of n·choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth (the second part to a conjectured proof of correctness for Peeling) that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
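Here choose(n, ≤d) denotes the binomial sum C(n, 0) + C(n, 1) + ... + C(n, d). A few lines of Python suffice to check the claimed density bound numerically (my own sanity check, not code from the paper):

```python
from math import comb

def choose_leq(n, d):
    """Binomial sum C(n, <=d) = sum_{i=0}^{d} C(n, i)."""
    return sum(comb(n, i) for i in range(d + 1))

for n in (10, 50, 200):
    for d in (1, 3, 7):
        density = n * choose_leq(n - 1, d - 1) / choose_leq(n, d)
        assert density < d                 # the bound claimed above
        print(f"n={n:3d} d={d}  density={density:.4f}")
```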
Abstract:
H. Simon and B. Szörényi have found an error in the proof of Theorem 52 of “Shifting: One-inclusion mistake bounds and sample compression”, Rubinstein et al. (2009). In this note we provide a corrected proof of a slightly weakened version of this theorem. Our new bound on the density of one-inclusion hypergraphs is again in terms of the capacity of the multilabel concept class. Simon and Szörényi have recently proved an alternate result in Simon and Szörényi (2009).
Abstract:
We present a modification of the algorithm of Dani et al. [8] for the online linear optimization problem in the bandit setting, which with high probability has regret at most O*(√T) against an adaptive adversary. This improves on the previous algorithm [8], whose regret is bounded in expectation against an oblivious adversary. We obtain the same dependence on the dimension (n^(3/2)) as that exhibited by Dani et al. The results of this paper rest firmly on those of [8] and the remarkable technique of Auer et al. [2] for obtaining high-probability bounds via optimistic estimates. This paper answers an open question: it eliminates the gap between the high-probability bounds obtained in the full-information and bandit settings.
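The "optimistic estimates" device is easiest to see in the simpler K-armed setting of Auer et al.'s EXP3.P, sketched below with illustrative parameter values (the paper itself applies the idea to linear bandits, which needs substantially more machinery):

```python
import numpy as np

def exp3p(rewards, eta=0.05, gamma=0.1, beta=0.01, seed=0):
    """EXP3.P-style sketch: importance-weighted reward estimates plus
    a beta/p bonus, so each estimate upper-bounds the true reward with
    high probability -- the 'optimistic estimate' trick.
    rewards: (T, K) array in [0, 1]; only the pulled arm is revealed.
    """
    rng = np.random.default_rng(seed)
    T, K = rewards.shape
    log_w = np.zeros(K)
    total = 0.0
    for t in range(T):
        w = np.exp(log_w - log_w.max())            # stabilized weights
        p = (1 - gamma) * w / w.sum() + gamma / K  # forced exploration
        arm = rng.choice(K, p=p)
        x = rewards[t, arm]                        # bandit feedback only
        x_hat = np.zeros(K)
        x_hat[arm] = x / p[arm]                    # unbiased estimate
        log_w += eta * (x_hat + beta / p)          # optimistic bonus
        total += x
    return total
```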
Abstract:
Secondary lower-limb lymphedema can develop following treatment for gynecological cancers, and has debilitating effects on quality of life (QoL). Lymphedema can limit mobility and the ability to perform daily activities, and can have adverse effects on psychological and social wellbeing. Assessments of lymphedema treatment methods tend to focus on change in clinically measured lymphedema status rather than on QoL outcomes. Considering that treatment for lymphedema involves a significant and ongoing commitment from patients, it is essential to determine whether the benefits to patients outweigh the burden associated with treatment. This article summarizes the results of studies assessing the impact of lower-limb lymphedema on QoL in women with gynecological cancer, evaluates their methodologies, and discusses limitations and priorities for future research.
Abstract:
This article augments Resource Dependence Theory with Real Options reasoning in order to explain time bounds specification in strategic alliances. Whereas prior work has found roughly a 50/50 split between alliances that are time bound and those that are open-ended, their substantive differences and antecedents are ill understood. To address this, we suggest that the two alliance modes present different real options trade-offs in adaptation to environmental uncertainty: ceteris paribus, time-bound alliances are likely to provide abandonment options over open-ended alliances, but require additional investments to extend the alliance when this turns out to be desirable after formation. Open-ended alliances are likely to provide growth options over time-bound alliances, but they demand additional effort to abandon the alliance if post-formation circumstances so require. Therefore, we expect time bounds specification to be a function of environmental uncertainty: organizations in more uncertain environments will be relatively more likely to place time bounds on their strategic alliances. Longitudinal archival and survey data collected amongst 39 industry clusters provide empirical support for our claims, which contribute to the recent renaissance of resource dependence theory by specifying the conditions under which organizations choose different time windows in strategic partnering.
Abstract:
Finite Element Modeling (FEM) has become a vital tool in the automotive design and development processes. FEM of the human body is a technique capable of estimating parameters that are difficult to measure in experimental studies, with the human body segments modeled as complex and dynamic entities. Several studies have been dedicated to attaining close-to-real FEMs of the human body (Pankoke and Siefert 2007; Amann, Huschenbeth et al. 2009; ESI 2010). The aim of this paper is to identify and appraise the state-of-the-art models of the human body which incorporate detailed pelvis and/or lower extremity models. Six databases and search engines were used to obtain literature, and the search was limited to studies published in English since 2000. The initial search results identified 636 pelvis-related papers, 834 buttocks-related papers, 505 thigh-related papers, 927 femur-related papers, 2039 knee-related papers, 655 shank-related papers, 292 tibia-related papers, 110 fibula-related papers, 644 ankle-related papers, and 5660 foot-related papers. A refined search returned 100 pelvis-related papers, 45 buttocks-related papers, 65 thigh-related papers, 162 femur-related papers, 195 knee-related papers, 37 shank-related papers, 80 tibia-related papers, 30 fibula-related papers, 102 ankle-related papers, and 246 foot-related papers. The refined literature list was further restricted by appraisal against modified LOW appraisal criteria. Studies with unclear methodologies, with a focus on populations with pathology, or with sport-related dynamic motion modeling were excluded. The final literature list included fifteen models, and each was assessed against the percentile the model represents, the gender the model was based on, the human body segment/segments included in the model, the sample size used to develop the model, the source of geometric/anthropometric values used to develop the model, the posture the model represents, and the finite element solver used for the model. The results of this literature review provide an indication of bias in the available models towards 50th percentile male modeling, with a notable concentration on the pelvis, femur and buttocks segments.
Abstract:
Fusion techniques have received considerable attention as a means of achieving lower error rates with biometrics. A fused classifier architecture based on sequential integration of multi-instance and multi-sample fusion schemes allows a controlled trade-off between false alarms and false rejects. Expressions for each type of error for the fused system have previously been derived for the case of statistically independent classifier decisions. It is shown in this paper that the performance of this architecture can be improved by modelling the correlation between classifier decisions. Correlation modelling also enables better tuning of the fusion model parameters 'N', the number of classifiers, and 'M', the number of attempts/samples, and facilitates the determination of error bounds for false rejects and false accepts for each specific user. The error trade-off performance of the architecture is evaluated using HMM-based speaker verification on utterances of individual digits. Results show that performance is improved for the case of favourably correlated decisions. The architecture investigated here is directly applicable to speaker verification from spoken digit strings, such as credit card numbers, in telephone or voice-over-IP applications. It is also applicable to other biometric modalities such as fingerprints and handwriting samples.
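Under the independence assumption the abstract mentions, one illustrative reading of such an architecture (all N classifiers must accept within an attempt, with up to M attempts allowed; my own simplification of the fused-error expressions, and exactly what correlation modelling then refines) gives closed-form error rates:

```python
def fused_error_rates(far, frr, N, M):
    """False-accept / false-reject rates for an AND-over-N-classifiers,
    OR-over-M-attempts fusion, assuming independent decisions.
    Increasing N trades false accepts down for false rejects up;
    increasing M trades the other way -- the controlled trade-off.
    """
    far_attempt = far ** N                 # all N must wrongly accept
    frr_attempt = 1 - (1 - frr) ** N       # any single reject rejects
    far_sys = 1 - (1 - far_attempt) ** M   # impostor passes on any attempt
    frr_sys = frr_attempt ** M             # genuine user fails all attempts
    return far_sys, frr_sys

# e.g. per-classifier FAR 2%, FRR 5%, two classifiers, three attempts:
print(fused_error_rates(0.02, 0.05, N=2, M=3))
```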
Abstract:
The relationship between intellectual functioning and criminal offending has received considerable focus within the literature. While there remains debate regarding the existence (and strength) of this relationship, there is a wider consensus that individuals with below-average functioning (in particular, cognitive impairments) are disproportionately represented within the prison population. This paper focuses on research that has implications for the effective management of lower-functioning individuals within correctional environments, as well as the successful rehabilitation and release of such individuals back into the community. This includes a review of the literature regarding the link between lower intelligence and offending and the identification of possible factors that either facilitate (or confound) this relationship. The main themes to emerge from this review are that individuals with lower intellectual functioning continue to be disproportionately represented in custodial settings and that there is a need to increase the provision of specialised programs to cater for their needs. Further research is also needed in a range of areas, including: (1) the reason for this over-representation in custodial settings, (2) the existence and effectiveness of rehabilitation and release programs that cater for lower-IQ offenders, (3) the effectiveness of custodial alternatives for this group (e.g. intensive corrections orders), and (4) what post-custodial release services are needed to reduce the risk of recidivism.