884 results for Minimal-complexity classifier


Relevance:

20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Abstract:

We describe a method for determining the minimal length of elements in the generalized Thompson's groups F(p). We compute the length of an element by constructing a tree pair diagram for the element, classifying the nodes of the trees, and summing the weights associated with the pairs of node classifications. We use this method to find minimal-length representatives of elements efficiently.
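
To make the weight-summing step concrete, the following is a minimal Python sketch, assuming a purely hypothetical set of node classes and weight table (the actual classification scheme and weights are those defined in the paper and are not reproduced here): it pairs the node classifications coming from the two trees of a tree pair diagram and sums the associated weights.

    # Minimal sketch of the weight-summing step. The class labels ("L", "I") and
    # the weight table are placeholders, not the classification defined in the paper.
    WEIGHTS = {
        ("L", "L"): 0,
        ("L", "I"): 1,
        ("I", "L"): 1,
        ("I", "I"): 2,
    }

    def element_length(domain_classes, range_classes):
        """Sum the weights associated with each pair of node classifications."""
        if len(domain_classes) != len(range_classes):
            raise ValueError("both trees must contribute the same number of classified nodes")
        return sum(WEIGHTS[pair] for pair in zip(domain_classes, range_classes))

    # Example: paired nodes classified (L, I), (I, I), (L, L) give length 1 + 2 + 0 = 3.
    print(element_length(["L", "I", "L"], ["I", "I", "L"]))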

Relevance:

20.00%

Abstract:

The Whitehead minimization problem consists in finding a minimum-size element in the automorphic orbit of a word, a cyclic word, or a finitely generated subgroup of a free group of finite rank. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependence on the rank of the free group. It follows that the primitivity problem – deciding whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
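
For orientation only, the classical Whitehead reduction scheme (whose candidate set of automorphisms grows exponentially with the rank, the dependence this paper removes) can be sketched as a greedy loop that applies any automorphism which strictly shortens the cyclically reduced word. The Python sketch below takes the candidate automorphisms as ordinary word-rewriting functions; it illustrates that scheme, not the polynomial algorithm of the paper.

    # Words are lists of letters, with inverses written as upper case ('A' = a^-1).

    def cyclically_reduce(word):
        """Freely reduce, then cancel matching inverse letters at the two ends."""
        reduced = []
        for x in word:
            if reduced and reduced[-1] == x.swapcase():
                reduced.pop()
            else:
                reduced.append(x)
        while len(reduced) >= 2 and reduced[0] == reduced[-1].swapcase():
            reduced = reduced[1:-1]
        return reduced

    def greedy_minimize(word, automorphisms):
        """Apply length-reducing automorphisms until the cyclic length stops dropping."""
        current = cyclically_reduce(word)
        improved = True
        while improved:
            improved = False
            for phi in automorphisms:
                candidate = cyclically_reduce(phi(current))
                if len(candidate) < len(current):
                    current, improved = candidate, True
                    break
        return current

    # Toy automorphism of F(a, b): a -> ab (so A -> BA), b -> b.
    def phi(word):
        table = {"a": ["a", "b"], "A": ["B", "A"], "b": ["b"], "B": ["B"]}
        return [x for letter in word for x in table[letter]]

    print("".join(greedy_minimize(list("aB"), [phi])))  # "aB" -> "abB" ~ "a"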

Relevance:

20.00%

Abstract:

Operative treatment of coronoid fractures often requires a large dissection of soft tissue, resulting in elbow stiffness and functional limitation. The authors present a minimally invasive, safe technique that is useful in cases of isolated coronoid fracture associated with elbow dislocation. This technique does not require soft tissue dissection and allows early, unrestricted resumption of sports activities.

Relevance:

20.00%

Abstract:

The classical Łojasiewicz inequality and its extensions to partial differential equation problems (Simon) and to o-minimal structures (Kurdyka) have a considerable impact on the analysis of gradient-like methods and related problems: minimization methods, complexity theory, asymptotic analysis of dissipative partial differential equations, tame geometry. This paper provides alternative characterizations of this type of inequality for nonsmooth lower semicontinuous functions defined on a metric or a real Hilbert space. In a metric context, we show that a generalized form of the Łojasiewicz inequality (hereby called the Kurdyka-Łojasiewicz inequality) relates to metric regularity and to the Lipschitz continuity of the sublevel mapping, yielding applications to discrete methods (strong convergence of the proximal algorithm). In a Hilbert setting we further establish that the asymptotic properties of the semiflow generated by -∂f are strongly linked to this inequality. This is done by introducing the notion of a piecewise subgradient curve: such curves have uniformly bounded lengths if and only if the Kurdyka-Łojasiewicz inequality is satisfied. Further characterizations are given in terms of talweg lines (a concept linked to the location of the least steep points on the level sets of f) and integrability conditions. In the convex case these results are significantly reinforced, allowing us in particular to establish the asymptotic equivalence of discrete gradient methods and continuous gradient curves. On the other hand, a counterexample of a convex C^2 function on R^2 is constructed to illustrate the fact that, contrary to our intuition, and unless a specific growth condition is satisfied, convex functions may fail to satisfy the Kurdyka-Łojasiewicz inequality.
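
For the reader's orientation, the two inequalities referred to above take the following standard form for a differentiable function f near a critical point x̄; the paper itself treats nonsmooth lower semicontinuous functions, with the subdifferential ∂f in place of the gradient.

    \[
      % classical Lojasiewicz inequality, for some C > 0 and \theta \in [1/2, 1):
      |f(x) - f(\bar{x})|^{\theta} \;\le\; C\,\|\nabla f(x)\|
      \qquad \text{for all } x \text{ near } \bar{x},
    \]
    \[
      % Kurdyka--Lojasiewicz form, with \varphi concave, \varphi(0) = 0, defined on [0, r):
      \varphi'\bigl(f(x) - f(\bar{x})\bigr)\,\|\nabla f(x)\| \;\ge\; 1
      \qquad \text{whenever } f(\bar{x}) < f(x) < f(\bar{x}) + r .
    \]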

Relevance:

20.00%

Abstract:

Neuroblastoma (NB) is a neural crest-derived childhood tumor characterized by a remarkable phenotypic diversity, ranging from spontaneous regression to fatal metastatic disease. Although the cancer stem cell (CSC) model provides a trail to characterize the cells responsible for tumor onset, the NB tumor-initiating cell (TIC) has not been identified. In this study, the relevance of the CSC model in NB was investigated by taking advantage of typical functional stem cell characteristics. A predictive association was established between self-renewal, as assessed by serial sphere formation, and clinical aggressiveness in primary tumors. Moreover, cell subsets gradually selected during serial sphere culture harbored increased in vivo tumorigenicity, which was only revealed in an orthotopic microenvironment. A microarray time-course analysis of serial sphere passages from metastatic cells allowed us to specifically "profile" the NB stem cell-like phenotype and to identify CD133, ABC transporter, and WNT and NOTCH genes as sphere markers. On the basis of combined sphere marker expression, at least two distinct tumorigenic cell subpopulations were identified, which were also shown to preexist in primary NB. However, sphere marker-mediated cell sorting of the parental tumor failed to recapitulate the TIC phenotype in the orthotopic model, highlighting the complexity of the CSC model. Our data support the view of NB stem-like cells as a dynamic and heterogeneous cell population strongly dependent on microenvironmental signals and add novel candidate genes as potential therapeutic targets in the control of high-risk NB.

Relevance:

20.00%

Abstract:

Immunotherapy, especially therapeutic vaccination, has a great deal of potential in the treatment of cancer and certain infectious diseases such as HIV (Allison et al., 2006; Fauci et al., 2008; Feldmann and Steinman, 2005). Numerous vaccine candidates have been tested in patients with a variety of tumor types and chronic viral diseases. Often, the best way to assess the clinical potential of these vaccines is to monitor the induced T cell response, and yet there are currently no standards for reporting these results. This letter is an effort to address this problem.

Relevance:

20.00%

Abstract:

The thesis proposal takes as its starting point the artistic and theoretical responses produced from the 1960s onward against a context of traditional, fundamentally rationalist knowledge, one that follows the logical tradition of modernity and finds its reflection and social application in spatial order and, by extension, in geometry. Once the notions of this modernity that were applied to the art of the 1950s and 1960s have been described, it is shown how the critiques of certain philosophers and artists gradually shaped a theoretical and artistic corpus that amounts to an attempt to tear down this traditional system of knowledge, interpretation, reading and attribution of meaning to artworks. These include M. Foucault, J. Derrida, R. Smithson, R. Serra, R. Morris, Mona Hatoum, Imi Knoebel and Tacita Dean, among others. A deeper and more detailed analysis is then presented of the most paradigmatic artistic responses, both to the traditional system of thought and to the spatial order it consequently implies. These critiques are organized into two antagonistic parts: one is "The Advent of Chaos" ("L'adveniment del caos") and the other is the "Critique of Order" ("Crítica de l'ordre"). The artists are L. Bourgeois, E. Hesse, A. Mendieta and P. Halley. A third part describes how this deconstructive assault on the traditional paradigm of knowledge, begun in the sixties, developed over the following twenty years, taking as its theoretical foundation the critiques of R. Krauss, J. Baudrillard and P. Virilio, and as artists the architects P. Eienmann and F. Gehri, among others. The main conclusion of these sections seeks to highlight the subversion or infringement of geometry as the container of the concepts of modernity: reason and moral order. Finally, a fourth part includes the artistic project itself, which represents the experimentation with, and praxis of, the theoretical conclusions of this thesis.

Relevance:

20.00%

Abstract:

I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from the preferences revealed by her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model therefore explains why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, with dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
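
Schematically, and only as an illustration in our own notation rather than the paper's formal setup, the trade-off can be written as the choice of a search depth d that balances the value revealed by thinking ahead against the cost of doing so:

    \[
      d^{*} \in \arg\max_{d \ge 0}\; \bigl[\, V(d) - c(d) \,\bigr],
    \]

where V(d) is the value of the best alternative identified after searching d steps ahead and c(d) is the search cost inferred from revealed preferences; as c shrinks or the marginal benefit of deeper search grows, d* increases and the resulting choices approach those of a fully rational agent.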

Relevance:

20.00%

Abstract:

This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity from operational complexity. It considers the consequences and costs of complexity, and then examines the rationale for measuring complexity. Finally, it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).
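
Purely as a generic illustration of what a composite complexity index involves (this is not the OTS index that the paper examines), such an index typically aggregates rescaled indicators with weights:

    \[
      C \;=\; \sum_{i=1}^{n} w_i\, \tilde{c}_i,
      \qquad \tilde{c}_i \in [0, 1],
      \qquad \sum_{i=1}^{n} w_i = 1,
    \]

where each \tilde{c}_i is an individual complexity indicator (for example, a measure of design complexity or of operational complexity) rescaled to a common range and w_i is its weight.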

Relevance:

20.00%

Abstract:

OBJECTIVES: To document biopsychosocial profiles of patients with rheumatoid arthritis (RA) by means of the INTERMED and to correlate the results with conventional methods of disease assessment and health care utilization. METHODS: Patients with RA (n = 75) were evaluated with the INTERMED, an instrument for assessing case complexity and care needs. Based on their INTERMED scores, patients were compared with regard to severity of illness, functional status, and health care utilization. RESULTS: In cluster analysis, a 2-cluster solution emerged, with about half of the patients characterized as complex. Complex patients scoring especially high in the psychosocial domain of the INTERMED were disabled significantly more often and took more psychotropic drugs. Although the 2 patient groups did not differ in severity of illness and functional status, complex patients rated their illness as more severe on subjective measures and on most items of the Medical Outcomes Study Short Form 36. Complex patients showed increased health care utilization despite a similar biologic profile. CONCLUSIONS: The INTERMED identified complex patients with increased health care utilization, provided meaningful and comprehensive patient information, and proved to be easy to implement and advantageous compared with conventional methods of disease assessment. Intervention studies will have to demonstrate whether management strategies based on INTERMED profiles can improve treatment response and outcome of complex patients.
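
As an illustration of the clustering step only: the abstract does not state which clustering algorithm was used, and the domain scores below are hypothetical toy values, but a two-cluster solution on INTERMED scores can be obtained along the following lines (k-means is used here purely for concreteness).

    # Illustrative only: split patients into two groups from INTERMED domain scores.
    # Data are toy values; the study's actual clustering method is not given here.
    import numpy as np
    from sklearn.cluster import KMeans

    # One row per patient: scores on the biological, psychological, social and
    # health care domains of the INTERMED (toy values).
    scores = np.array([
        [4, 2, 1, 3],
        [9, 8, 7, 9],
        [3, 1, 2, 2],
        [8, 9, 6, 8],
        [5, 2, 2, 4],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

    # Call the cluster with the higher mean total score the "complex" group.
    totals = scores.sum(axis=1)
    complex_cluster = int(totals[labels == 1].mean() > totals[labels == 0].mean())
    print("complex patients (row indices):", np.where(labels == complex_cluster)[0])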

Relevance:

20.00%

Abstract:

The main objective of the project is to classify road scenes according to the content of the images, so as to determine what kind of situation we are in at any given moment. It is important to set the necessary parameters according to the scenario we are in, in order to get the best possible performance out of each of the algorithms. Its purpose, then, is to provide warning and support across the different driving scenarios. In other words, the final result must contain an algorithm or application capable of classifying the input images into different types with the greatest possible spatial and temporal efficiency. The algorithm will have to classify the images into different scenarios. The algorithms must be parameterizable and easy for the user to handle. The tool used to achieve these objectives will be MATLAB with the vision and neural network toolboxes installed.
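
The project itself is planned in MATLAB with the vision and neural network toolboxes. Purely as a language-neutral sketch of the intended pipeline (global image features fed to a small neural network classifier), something along the following lines could serve as a reference; the class names, file paths, feature choice and network size are all placeholders.

    # Sketch of a road-scene classifier: per-channel colour histograms fed to a
    # small neural network. Labels, paths and sizes below are placeholders.
    import numpy as np
    from PIL import Image
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    CLASSES = ["highway", "urban", "tunnel"]  # hypothetical scenario labels

    def histogram_features(path, bins=8):
        """A deliberately simple global descriptor: one histogram per RGB channel."""
        img = np.asarray(Image.open(path).convert("RGB").resize((128, 128)))
        feats = [np.histogram(img[..., c], bins=bins, range=(0, 256), density=True)[0]
                 for c in range(3)]
        return np.concatenate(feats)

    def train(image_paths, labels):
        X = np.stack([histogram_features(p) for p in image_paths])
        y = np.array(labels)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
        clf.fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))
        return clf

    # Usage (paths and labels supplied by the caller):
    #   clf = train(list_of_image_paths, list_of_scenario_labels)
    #   print(clf.predict([histogram_features("some_new_frame.png")]))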