932 results for Higher order interior points method (HOIPM)


Relevance: 100.00%

Abstract:

PURPOSE: To analyze the outcomes of intracorneal ring segment (ICRS) implantation for the treatment of keratoconus based on preoperative visual impairment. DESIGN: Multicenter, retrospective, nonrandomized study. METHODS: A total of 611 eyes of 361 keratoconic patients were evaluated. Subjects were classified according to their preoperative corrected distance visual acuity (CDVA) into 5 different groups: grade I, CDVA of 0.90 or better; grade II, CDVA equal to or better than 0.60 and worse than 0.90; grade III, CDVA equal to or better than 0.40 and worse than 0.60; grade IV, CDVA equal to or better than 0.20 and worse than 0.40; and grade plus, CDVA worse than 0.20. Success and failure indices were defined based on visual, refractive, corneal topographic, and aberrometric data and evaluated in each group 6 months after ICRS implantation. RESULTS: Significant improvement after the procedure was observed regarding uncorrected distance visual acuity in all grades (P < .05). CDVA significantly decreased in grade I (P < .01) but significantly increased in all other grades (P < .05). A total of 37.9% of patients with preoperative CDVA 0.6 or better gained 1 or more lines of CDVA, whereas 82.8% of patients with preoperative CDVA 0.4 or worse gained 1 or more lines of CDVA (P < .01). Spherical equivalent and keratometry readings showed a significant reduction in all grades (P ≤ .02). Corneal higher-order aberrations did not change after the procedure (P ≥ .05). CONCLUSIONS: Based on preoperative visual impairment, ICRS implantation provides significantly better results in patients with a severe form of the disease. A notable loss of CDVA lines can be expected in patients with a milder form of keratoconus.
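
Purely as a restatement of the grading scheme defined above in code form (the function name and decimal-acuity input are illustrative, not from the study), a minimal sketch:

```python
def keratoconus_grade(cdva: float) -> str:
    """Map corrected distance visual acuity (decimal notation) to the study's preoperative grades."""
    if cdva >= 0.90:
        return "I"       # CDVA of 0.90 or better
    if cdva >= 0.60:
        return "II"      # 0.60 <= CDVA < 0.90
    if cdva >= 0.40:
        return "III"     # 0.40 <= CDVA < 0.60
    if cdva >= 0.20:
        return "IV"      # 0.20 <= CDVA < 0.40
    return "plus"        # CDVA worse than 0.20

# Example: a patient with preoperative CDVA 0.5 falls into grade III.
print(keratoconus_grade(0.5))
```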

Relevance: 100.00%

Abstract:

This paper reports an investigation into the link between failed proofs and non-theorems. It seeks to answer the question of whether anything more can be learned from a failed proof attempt than can be discovered from a counter-example. We suggest that the branch of the proof in which failure occurs can be mapped back to the segments of code that are the culprit, helping to locate the error. This process of tracing provides finer grained isolation of the offending code fragments than is possible from the inspection of counter-examples. We also discuss ideas for how such a process could be automated.

Relevance: 100.00%

Abstract:

Proof critics are a technology from the proof planning paradigm. They examine failed proof attempts in order to extract information which can be used to generate a patch which will allow the proof to go through. We consider the proof of the "whisky problem", a challenge problem from the domain of temporal logic. The proof requires a generalisation of the original conjecture, and we examine two proof critics which can be used to create this generalisation. Using these critics we believe we have produced the first automatic proofs of this challenge problem. We use this example to motivate a comparison of the two critics and propose that there is a place for specialist critics as well as powerful general critics. In particular we advocate the development of critics that do not use meta-variables.

Relevance: 100.00%

Abstract:

We describe an integration of the SVC decision procedure with the HOL theorem prover, achieved using the PROSPER toolkit. The SVC decision procedure operates on rational numbers, for which an axiomatic theory was provided in HOL. The decision procedure also returns counterexamples, and a framework has been devised for handling these counterexamples in a HOL setting.

Relevance: 100.00%

Abstract:

Recent studies show that higher order oscillatory interactions such as cross-frequency coupling are important for brain functions that are impaired in schizophrenia, including perception, attention and memory. Here we investigated the dynamics of oscillatory coupling in the hippocampus of awake rats upon NMDA receptor blockade by ketamine, a pharmacological model of schizophrenia. Ketamine (25, 50 and 75 mg/kg i.p.) increased gamma and high-frequency oscillations (HFO) in all depths of the CA1-dentate axis, while theta power changes depended on anatomical location and were independent of a transient increase of delta oscillations. Phase coherence of gamma and HFO increased across hippocampal layers. Phase-amplitude coupling between theta and fast oscillations was markedly altered in a dose-dependent manner: ketamine increased hippocampal theta-HFO coupling at all doses, while theta-gamma coupling increased at the lowest dose and was disrupted at the highest dose. Our results demonstrate that ketamine alters network interactions that underlie cognitively relevant theta-gamma coupling.
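
As background on the analysis technique referred to above, the sketch below shows one common way to quantify phase-amplitude coupling: the mean-vector-length modulation index computed from the phase of a slow band and the amplitude envelope of a fast band. The filter bands, sampling rate, and synthetic signal are illustrative assumptions, not the recordings or exact methods of the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def modulation_index(lfp, fs, phase_band=(4, 12), amp_band=(30, 100)):
    """Mean-vector-length estimate of phase-amplitude coupling.

    Phase of the slow (theta) band modulating the amplitude envelope of the
    fast (gamma) band; bands are assumed values, not the paper's settings.
    """
    phase = np.angle(hilbert(bandpass(lfp, *phase_band, fs)))  # slow-band phase
    amp = np.abs(hilbert(bandpass(lfp, *amp_band, fs)))        # fast-band envelope
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# Illustrative use on synthetic data (hypothetical, not the study's recordings):
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 8 * t)
gamma = (1 + theta) * 0.2 * np.sin(2 * np.pi * 60 * t)  # gamma amplitude tied to theta phase
print(modulation_index(theta + gamma + 0.1 * np.random.randn(t.size), fs))
```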

Relevance: 100.00%

Abstract:

Hybridisation is a systematic process through which the characteristic features of hybrid logic, at both the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. It also captures the construction of first-order encodings of such hybridised institutions into theories in first-order logic. The method was originally developed to build suitable logics for the specification of reconfigurable software systems on top of whatever logic is used to describe the local requirements of each system's configuration. Hybridisation has, however, a broader scope, providing a fresh example of yet another development in combining and reusing logics driven by a problem from Computer Science. This paper offers an overview of the method, proposes some new extensions, namely the introduction of full quantification leading to the specification of dynamic modalities, and illustrates its potential through a didactical application. It discusses how hybridisation can be used successfully in a formal specification course in which students progress from equational to hybrid specifications in a uniform setting, integrating paradigms, combining data and behaviour, and dealing appropriately with system evolution and reconfiguration.
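
For readers unfamiliar with the hybrid features referred to above, the fragment below recalls the two signature ingredients that hybridisation adds on top of a base logic (nominals naming individual states, and satisfaction operators shifting evaluation to a named state), together with the standard first-order reading that underlies the encodings mentioned in the abstract. The concrete formula is purely illustrative and is not taken from the paper.

% Nominal i names a single state; @_i shifts evaluation to that state.
% Standard translation ST_x into first-order logic over the frame relation R:
\[
  \mathrm{ST}_x(@_i\,\varphi) = \mathrm{ST}_x(\varphi)[x := i],
  \qquad
  \mathrm{ST}_x(\Diamond\varphi) = \exists y\,\bigl(R(x,y) \wedge \mathrm{ST}_y(\varphi)\bigr).
\]
% Example: "state i satisfies p, and from i some transition reaches a state where q holds":
\[
  @_i\,\bigl(p \wedge \Diamond q\bigr)
  \;\leadsto\;
  p(i) \wedge \exists y\,\bigl(R(i,y) \wedge q(y)\bigr).
\]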

Relevance: 100.00%

Abstract:

Recent developments in the physical parameterizations available in spectral wave models have already been validated, but there is little information on their relative performance, especially for the higher-order spectral moments and wave partitions. This study concentrates on documenting their strengths and limitations using satellite measurements, buoy spectra, and a comparison between the different models. It is confirmed that all models perform well in terms of significant wave height; however, the higher-order moments have larger errors. The partitioned wave quantities perform well in terms of direction and frequency, but the magnitude and directional spread typically show larger discrepancies. The high-frequency tail is examined through the mean square slope using satellites and buoys. From this analysis it is clear that some models behave better than others, suggesting their parameterizations match the physical processes reasonably well. However, none of the models is entirely satisfactory, pointing to poorly constrained parameterizations or missing physical processes. The major space-time differences between the models are related to the swell field, stressing the importance of describing its evolution. An example swell field confirms that the wave heights can be notably different between model configurations while the directional distributions remain similar. It is clear that all models have difficulty in describing the directional spread. Therefore, knowledge of the source-term directional distributions is paramount for improving the wave model physics in the future.
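
Since the abstract refers to significant wave height, higher-order spectral moments, and mean square slope without defining them, the standard definitions from the one-dimensional frequency spectrum $E(f)$ are recalled below; these are textbook relations, not formulas quoted from the paper.

\[
  m_n = \int_0^\infty f^{\,n}\,E(f)\,df, \qquad
  H_s = 4\sqrt{m_0}, \qquad
  T_{m02} = \sqrt{m_0/m_2},
\]
\[
  \mathrm{mss} = \int_0^\infty k^2\,E(k)\,dk
  \;\approx\; \int_0^\infty \frac{(2\pi f)^4}{g^2}\,E(f)\,df
  \quad\text{(deep water, } \omega^2 = gk\text{)},
\]
so $H_s$ depends mostly on the energy-containing low frequencies, while $T_{m02}$ and the mean square slope weight the spectral tail by $f^2$ and $f^4$ respectively, which is why errors grow for the higher-order moments and the high-frequency tail.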

Relevance: 100.00%

Abstract:

Relational reasoning, or the ability to identify meaningful patterns within any stream of information, is a fundamental cognitive ability associated with academic success across a variety of domains of learning and levels of schooling. However, the measurement of this construct has been historically problematic. For example, while the construct is typically described as multidimensional—including the identification of multiple types of higher-order patterns—it is most often measured in terms of a single type of pattern: analogy. For that reason, the Test of Relational Reasoning (TORR) was conceived and developed to include three other types of patterns that appear to be meaningful in the educational context: anomaly, antinomy, and antithesis. Moreover, as a way to focus on fluid relational reasoning ability, the TORR was developed to include, except for the directions, entirely visuo-spatial stimuli, which were designed to be as novel as possible for the participant. By focusing on fluid intellectual processing, the TORR was also developed to be fairly administered to undergraduate students—regardless of the particular gender, language, and ethnic groups they belong to. However, although some psychometric investigations of the TORR have been conducted, its actual fairness across those demographic groups has yet to be empirically demonstrated. Therefore, a systematic investigation of differential-item-functioning (DIF) across demographic groups on TORR items was conducted. A large (N = 1,379) sample, representative of the University of Maryland on key demographic variables, was collected, and the resulting data was analyzed using a multi-group, multidimensional item-response theory model comparison procedure. Using this procedure, no significant DIF was found on any of the TORR items across any of the demographic groups of interest. This null finding is interpreted as evidence of the cultural-fairness of the TORR, and potential test-development choices that may have contributed to that cultural-fairness are discussed. For example, the choice to make the TORR an untimed measure, to use novel stimuli, and to avoid stereotype threat in test administration, may have contributed to its cultural-fairness. Future steps for psychometric research on the TORR, and substantive research utilizing the TORR, are also presented and discussed.
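
The dissertation itself uses a multi-group, multidimensional IRT model comparison; as a lighter-weight illustration of what a DIF screen looks like in practice, here is a sketch of the classic logistic-regression DIF procedure, which is a different and simpler technique than the one used in the study. The column names and the data frame are hypothetical.

```python
import statsmodels.formula.api as smf
from scipy.stats import chi2

def logistic_dif_test(df, item_col, total_col, group_col):
    """Likelihood-ratio test for uniform + non-uniform DIF on one item.

    Compares a model with only the matching criterion (total score) against one
    that also includes group membership and its interaction with the total score.
    `df` is a pandas DataFrame with 0/1 item scores; all names are assumptions.
    """
    base = smf.logit(f"{item_col} ~ {total_col}", data=df).fit(disp=False)
    full = smf.logit(f"{item_col} ~ {total_col} + C({group_col}) "
                     f"+ {total_col}:C({group_col})", data=df).fit(disp=False)
    lr = 2 * (full.llf - base.llf)          # likelihood-ratio statistic
    dof = full.df_model - base.df_model     # extra parameters in the full model
    return lr, chi2.sf(lr, dof)             # statistic and p-value

# Hypothetical usage: df holds item scores, a total score, and a demographic group.
# lr, p = logistic_dif_test(df, "item_07", "torr_total", "gender")
```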

Relevance: 100.00%

Abstract:

Why are some companies more successful than others? This thesis approaches the question by enlisting theoretical frameworks that explain performance through internal factors, deriving from the resource-based view and, in particular, the dynamic capabilities approach. To deepen the understanding of the drivers of and barriers to developing these higher-order routines, which aim at improving operational-level routines, the thesis explores organisational culture and identity research for microfoundational antecedents that might shed light on the formation of dynamic capabilities. The dynamic capabilities framework in this thesis strives to bring the theoretical concept closer to practical applicability. This is achieved through the creation of a dynamic capabilities matrix, consisting of four dimensions often encountered in the dynamic capabilities literature. The quadrants are formed along internal-external and resources-abilities axes, and consist of Sensing, Learning, Reconfiguration and Partnering facets. A key element of the thesis is the reality continuum, which illustrates the different levels of reality inherent in any group of human individuals. The theoretical framework constructed in the thesis suggests a link between the collective but constructivist understanding of the organisation and both the operational and higher-level routines evident in the more positivist realm. The findings from three different case organisations suggest that the constructivist assumptions inherent to an organisation function as a generative base for both drivers and barriers towards developing dynamic capabilities. From each organisation one core assumption is scrutinised to identify its connections to the four dimensions of the dynamic capabilities. These connections take the form of drivers or barriers, or have the potential to develop into one or the other. The main contribution of this thesis is to show that one key for an organisation to perform well in a turbulent setting is to understand the different levels of reality inherent in any group of people. Recognising the intangible levels gives an advantage in the tangible ones.

Relevance: 100.00%

Abstract:

We consider a two-dimensional Fermi-Pasta-Ulam (FPU) lattice with hexagonal symmetry. Using asymptotic methods based on a small-amplitude ansatz, at third order we obtain a reduction to a cubic nonlinear Schrödinger equation (NLS) for the breather envelope. However, this does not support stable soliton solutions, so we pursue a higher-order analysis yielding a generalised NLS, which includes known stabilising terms. We present numerical results which suggest that long-lived stationary and moving breathers are supported by the lattice. We find breather solutions which move in an arbitrary direction, an ellipticity criterion for the wavenumbers of the carrier wave, asymptotic estimates for the breather energy, and a minimum threshold energy below which breathers cannot be found. This energy threshold is maximised for stationary breathers, and becomes vanishingly small near the boundary of the elliptic domain, where breathers attain a maximum speed. Several of the results obtained are similar to those obtained for the square FPU lattice (Butt & Wattis, J Phys A, 39, 4955, 2006), though we find that the square and hexagonal lattices exhibit different properties with regard to the generation of harmonics and the isotropy of the generalised NLS equation.
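
For reference, the third-order reduction mentioned above lands on a (2+1)-dimensional cubic NLS for the envelope; in a generic normalisation (the coefficients below are placeholders, not the paper's lattice-specific values) it reads

\[
  i\,\frac{\partial \psi}{\partial \tau}
  + D_1\,\frac{\partial^2 \psi}{\partial X^2}
  + D_2\,\frac{\partial^2 \psi}{\partial Y^2}
  + B\,|\psi|^2\psi = 0 ,
\]

where $\psi$ is the slowly varying breather envelope and $(X,Y,\tau)$ are stretched space and time variables. The coefficients $D_1$, $D_2$, $B$ depend on the carrier wavevector, and an ellipticity criterion of the kind referred to above amounts to requiring $D_1 D_2 > 0$, that is, that the dispersive part of the equation be elliptic for the chosen carrier wave.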

Relevance: 100.00%

Abstract:

Using asymptotic methods, we investigate whether discrete breathers are supported by a two-dimensional Fermi-Pasta-Ulam lattice. A scalar (one-component) two-dimensional Fermi-Pasta-Ulam lattice is shown to model the charge stored within an electrical transmission lattice. A third-order multiple-scale analysis in the semi-discrete limit fails, since at this order the lattice equations reduce to the (2+1)-dimensional cubic nonlinear Schrödinger (NLS) equation, which does not support stable soliton solutions for the breather envelope. We therefore extend the analysis to higher order and find a generalised (2+1)-dimensional NLS equation which incorporates higher-order dispersive and nonlinear terms as perturbations. We find an ellipticity criterion for the wavenumbers of the carrier wave. Numerical simulations suggest that both stationary and moving breathers are supported by the system. Calculations of the energy show the expected threshold behaviour whereby the energy of breathers does not go to zero with the amplitude; we find that the energy threshold is maximised by stationary breathers, and becomes arbitrarily small as the boundary of the domain of ellipticity is approached.
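
The threshold behaviour mentioned above is a generic feature of the cubic NLS in two spatial dimensions and can be seen from a standard scaling argument (a textbook observation, not a calculation reproduced from the abstract). The rescaling

\[
  \psi_\lambda(X,Y,\tau) = \lambda\,\psi(\lambda X,\,\lambda Y,\,\lambda^2 \tau)
\]

maps solutions of $i\psi_\tau + \nabla^2\psi + |\psi|^2\psi = 0$ to solutions, while in $d$ space dimensions

\[
  \int |\psi_\lambda|^2 \, d^d x = \lambda^{2-d} \int |\psi|^2 \, d^d x ,
\]

so for $d=2$ the $L^2$ norm of the envelope, which controls the breather energy at leading order in the small-amplitude expansion, does not shrink as the amplitude is reduced. This is why the breather energy cannot be made arbitrarily small by taking small amplitudes, and a non-zero energy threshold appears.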


Relevance: 100.00%

Abstract:

This lecture course covers the theory of so-called duality-based a posteriori error estimation for DG finite element methods. In particular, we formulate consistent and adjoint-consistent DG methods for the numerical approximation of both the compressible Euler and Navier-Stokes equations; in the latter case, the viscous terms are discretized by employing an interior penalty method. By exploiting a duality argument, adjoint-based a posteriori error indicators will be established. Moreover, the application of these computable bounds within automatic adaptive finite element algorithms will be developed. Here, a variety of isotropic and anisotropic adaptive strategies, as well as $hp$-mesh refinement, will be investigated.
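
The adjoint-based error indicators referred to above follow the usual dual-weighted-residual pattern; schematically (the notation here is generic, not copied from the lecture notes), and up to higher-order linearisation terms for nonlinear problems,

\[
  J(u) - J(u_h) \;\approx\; \mathcal{R}(u_h;\, z - z_h) = \sum_{\kappa \in \mathcal{T}_h} \eta_\kappa ,
\]

where $J(\cdot)$ is the target functional (for example lift or drag), $u_h$ is the DG solution, $z$ is the solution of the adjoint (dual) problem linearised about $u_h$, $z_h$ is a discrete approximation to it, $\mathcal{R}(u_h;\cdot)$ is the residual of the primal DG scheme, and $\eta_\kappa$ is its restriction to element $\kappa$ of the mesh $\mathcal{T}_h$. Adjoint consistency of the discretisation is what keeps this representation free of spurious extra terms, and the local indicators $\eta_\kappa$ are what drive the isotropic, anisotropic and $hp$-refinement strategies mentioned above.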

Relevance: 100.00%

Abstract:

I study how a larger party within a supply chain can use its superior knowledge about its partner, who is assumed to be financially constrained, to help that partner gain access to cheap finance. In particular, I consider two scenarios: (i) retailer intermediation in supplier finance, and (ii) the effectiveness of supplier buy-back finance. In the first chapter, I study how a large buyer can help small suppliers obtain financing for their operations. Especially in developing economies, traditional financing methods can be very costly or unavailable to such suppliers. In order to reduce channel costs, in recent years large buyers have started to implement their own financing methods that intermediate between suppliers and financing institutions. In this paper, I analyze the role and efficiency of buyer intermediation in supplier financing. Building a game-theoretical model, I show that buyer-intermediated financing can significantly improve supply chain performance. Using data from a large Chinese online retailer and through structural regression estimation based on the theoretical analysis, I demonstrate that buyer intermediation induces lower interest rates and wholesale prices, increases order quantities, and boosts supplier borrowing. The analysis also shows that the retailer systematically overestimates consumer demand. Based on counterfactual analysis, I predict that the implementation of buyer-intermediated financing for the online retailer in 2013 improved channel profits by 18.3%, yielding more than $68M in projected savings. In the second chapter, I study a novel buy-back financing scheme employed by large manufacturers in some emerging markets. A large manufacturer can secure financing for its budget-constrained downstream partners by assuming part of the risk for their inventory, committing to buy back some unsold units. A buy-back commitment can help a small downstream party secure a bank loan and further induce a higher order quantity through better allocation of risk in the supply chain. However, such a commitment may undermine supply chain performance, as it imposes extra costs on the supplier incurred by the return of large or costly-to-handle items. I first theoretically analyze the buy-back financing contract employed by a leading Chinese automotive manufacturer and some variants of this contracting scheme. In order to measure the effectiveness of buy-back financing contracts, I utilize contract and sales data from the company and structurally estimate the theoretical model. Through counterfactual analysis, I study the efficiency of various buy-back financing schemes and compare them to traditional financing methods. I find that buy-back contract agreements can improve channel efficiency significantly compared to simple contracts with no buy-back, whether or not the downstream retailer can secure financing on its own.
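
As background for the claim that a buy-back commitment induces a higher order quantity through better risk allocation, here is a minimal newsvendor-style sketch (a textbook model, not the structural model estimated in the thesis); the demand distribution, prices, and the absence of financing frictions are all simplifying assumptions.

```python
from scipy.stats import norm

def newsvendor_quantity(price, wholesale, buyback, demand_mean, demand_std):
    """Retailer's optimal order quantity under a buy-back contract.

    Unsold units are returned to the supplier at the buy-back price b, so the
    underage cost is (p - w) and the overage cost is (w - b); the critical
    fractile (p - w) / (p - b) rises with b, which pushes the order quantity up.
    Normally distributed demand is an illustrative assumption.
    """
    critical_fractile = (price - wholesale) / (price - buyback)
    return norm.ppf(critical_fractile, loc=demand_mean, scale=demand_std)

# Illustrative numbers (hypothetical): p = 10, w = 6, demand ~ N(100, 30).
q_no_buyback = newsvendor_quantity(10, 6, 0, 100, 30)   # b = 0: plain wholesale contract
q_buyback = newsvendor_quantity(10, 6, 4, 100, 30)      # b = 4: supplier shares the risk
print(round(q_no_buyback, 1), round(q_buyback, 1))      # the buy-back quantity is larger
```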

Relevance: 100.00%

Abstract:

Septins are conserved GTPases that are deregulated in cancer and neurodegenerative diseases. They serve as scaffolding proteins and form a diffusion barrier at the plasma membrane and at the midbody during cytokinesis. They interact with actin and organize into complexes that polymerize and form highly ordered structures (rings and filaments). Their assembly dynamics and their role in the cell remain to be elucidated. Drosophila is a simple model for studying septins, since it has only 5 genes (sep1, sep2, sep4, sep5, peanut) compared with 13 genes in humans. Using an antibody against Pnut, we identified tubular structures in 30% of Drosophila S2 cells. The goal of my project is to characterize these tubes by elucidating their constituents, behaviour and properties, in order to clarify the mechanism by which septins form highly ordered structures and interact with the actin cytoskeleton. By immunofluorescence, I was able to show that these tubes are cytoplasmic, in mitosis or interphase, suggesting that they are not regulated by the cell cycle. To investigate the composition and dynamic properties of these tubes, I generated a cell line expressing Sep2-GFP, which localizes to the tubes, and RNAi against the five septins. Three septins are important for the formation of these tubes and rings, namely Sep1, Sep2 and Pnut. Depletion of Sep1 causes the GFP signal to disperse into flakes, whereas depletion of Sep2 or Pnut leads to uniform dispersion of the GFP signal throughout the cell. FRAP experiments on the Sep2-GFP line reveal very slow signal recovery, indicating that these structures are very stable. I also demonstrated a relationship between actin and septins. Treatment with Latrunculin A (an inhibitor of actin polymerization) or Jasplakinolide (a stabilizer of actin filaments) leads to rapid depolymerization (< 30 min) of the tubes into rings floating in the cytoplasm, even though these tubes are not detected by F-actin labelling. Actin05C-mCherry localizes to the tubes, whereas the polymerization-deficient mutant, Actin05C-R62D-mCherry, loses this localization. We also observe that depletion of Cofilin and AIP1 (which destabilizes actin) leads to the same phenotype as treatment with Latrunculin A or Jasplakinolide. We can therefore conclude that a dynamic actin cytoskeleton is necessary for the formation and maintenance of septin tubes. Future studies will aim to better understand the organization of septins into highly ordered structures and their relationship with actin. This will be useful for building the septin interaction network, which may help explain their deregulation in cancer and neurodegenerative diseases.