922 results for 2D Convolutional Codes
Abstract:
This thesis brings together a series of meta-analyses, that is, analyses whose object is analyses produced by sociologists (notably those resulting from the application of interview-treatment methods). The approach is reflexive and aimed at the concrete practices of sociologists, considered as activities governed by rules. An important part of this thesis is therefore devoted to the development of a "pragmatological" analytical tool (E. Durkheim) for studying such practices and the rules that govern them. To approach these rules, Wittgenstein-inspired analytic philosophy offers several important proposals. Rules are treated as a family-resemblance concept: there is no common definition covering all rules. To study rules, it is therefore necessary to draw distinctions based on how they are used. One of these distinctions concerns the difference between constitutive rules and regulative rules: a constitutive rule creates a practice (e.g., marriage), whereas a regulative rule applies to activities that can exist without it (e.g., the rules of etiquette). The methodological activity of sociologists relies on, and is constrained by, these types of rules, which are essentially implicit. Through the description and codification of rules, this thesis aims to account for the normative character of methods in the analytical practices of sociology. Particular emphasis is placed on the logical limits instituted by constitutive rules, which render certain of the sociologist's actions impossible (rather than forbidden).
Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions are rigorously studied in great detail using a considerable variety of quantum dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches provide much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies and the computational effort required to account for Coriolis couplings, are also analyzed in this paper.
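Neither the Real Wave Packet nor the ABC code is reproduced here, but the essence of the time-dependent strategy can be illustrated in one dimension: propagate a wave packet with the split-operator method and read off a transmission probability. The Gaussian barrier and all parameters below are illustrative assumptions, not the Ne + H2+ surface; a minimal sketch:

```python
# Minimal sketch of split-operator time-dependent wave-packet propagation
# in 1D (illustrative analogue of the time-dependent methodology; the actual
# Real Wave Packet code is not reproduced here). Atomic units, hbar = 1.
import numpy as np

n, L, m, dt = 2048, 200.0, 1.0, 0.05
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)     # angular wavenumbers

V = 0.4 * np.exp(-x**2 / 2.0)                 # model Gaussian barrier (assumed)
x0, sigma, k0 = -40.0, 5.0, 1.0               # initial Gaussian packet moving right
psi = np.exp(-(x - x0)**2 / (2 * sigma**2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize

expV = np.exp(-0.5j * V * dt)                 # half-step potential propagator
expT = np.exp(-0.5j * k**2 * dt / m)          # full-step kinetic propagator

for _ in range(2000):                         # second-order split-operator stepping
    psi = expV * np.fft.ifft(expT * np.fft.fft(expV * psi))

# Transmission probability: norm accumulated beyond the barrier
P_trans = np.sum(np.abs(psi[x > 10.0])**2) * dx
print(f"transmission probability ~ {P_trans:.3f}")
```

A single propagation like this yields reactivity over the whole energy band carried by the packet, which is precisely the energy-range advantage of the time-dependent approach noted above.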
Abstract:
In radionuclide metrology, Monte Carlo (MC) simulation is widely used to compute parameters associated with primary measurements or calibration factors. Although MC methods are used to estimate uncertainties, the uncertainty associated with radiation transport in MC calculations is usually difficult to estimate. Counting statistics is the most obvious component of MC uncertainty and has to be checked carefully, particularly when variance reduction is used. However, in most cases fluctuations associated with counting statistics can be reduced using sufficient computing power. Cross-section data have intrinsic uncertainties that induce correlations when apparently independent codes are compared. Their effect on the uncertainty of the estimated parameter is difficult to determine and varies widely from case to case. Finally, the most significant uncertainty component for radionuclide applications is usually that associated with the detector geometry. Recent 2D and 3D x-ray imaging tools may be utilized, but comparison with experimental data, as well as adjustment of parameters, is usually unavoidable.
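As a toy illustration of the counting-statistics component only (no transport physics, no geometry; the detection probability below is an invented placeholder), a minimal sketch:

```python
# Minimal sketch of the counting-statistics component of a Monte Carlo
# uncertainty budget: a toy detection-efficiency tally with its binomial
# standard error. The cross-section and geometry components discussed
# above are outside this illustration.
import numpy as np

rng = np.random.default_rng(42)
n_hist = 1_000_000                        # number of simulated histories

# Toy model (assumed): a history scores a "detection" with probability p_true.
p_true = 0.137
detected = rng.random(n_hist) < p_true

eff = detected.mean()                     # MC estimate of the efficiency
# Binomial standard error: the statistical uncertainty shrinks as 1/sqrt(N),
# which is why counting statistics can be beaten down with computing power.
sigma_eff = np.sqrt(eff * (1.0 - eff) / n_hist)
print(f"efficiency = {eff:.5f} +/- {sigma_eff:.5f} (k = 1)")
```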
Abstract:
Abstract Objective: To compare the diagnostic performance of the three-dimensional turbo spin-echo (3D TSE) magnetic resonance imaging (MRI) technique with the performance of the standard two-dimensional turbo spin-echo (2D TSE) protocol at 1.5 T, in the detection of meniscal and ligament tears. Materials and Methods: Thirty-eight patients were imaged twice, first with a standard multiplanar 2D TSE MR technique, and then with a 3D TSE technique, both in the same 1.5 T MRI scanner. The patients underwent knee arthroscopy within the first three days after the MRI. Using arthroscopy as the reference standard, we determined the diagnostic performance and agreement. Results: For detecting anterior cruciate ligament tears, the 3D TSE and routine 2D TSE techniques showed similar values for sensitivity (93% and 93%, respectively) and specificity (80% and 85%, respectively). For detecting medial meniscal tears, the two techniques also had similar sensitivity (85% and 83%, respectively) and specificity (68% and 71%, respectively). In addition, for detecting lateral meniscal tears, the two techniques had similar sensitivity (58% and 54%, respectively) and specificity (82% and 92%, respectively). There was a substantial to almost perfect intraobserver and interobserver agreement when comparing the readings for both techniques. Conclusion: The 3D TSE technique has a diagnostic performance similar to that of the routine 2D TSE protocol for detecting meniscal and anterior cruciate ligament tears at 1.5 T, with the advantage of faster acquisition.
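The performance figures above come from 2 × 2 tables against the arthroscopy reference standard; a minimal sketch of how sensitivity, specificity, and Cohen's kappa are computed from such a table (the counts below are placeholders, not the study's data):

```python
# Minimal sketch of the diagnostic-performance metrics used above, computed
# from a 2x2 table against a reference standard. Counts are placeholders.
def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int):
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(tp: int, fp: int, fn: int, tn: int) -> float:
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n                            # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)        # chance agreement, "tear"
    p_no = ((fn + tn) / n) * ((fp + tn) / n)         # chance agreement, "no tear"
    return (p_obs - (p_yes + p_no)) / (1.0 - (p_yes + p_no))

se, sp = sensitivity_specificity(tp=13, fp=4, fn=1, tn=20)   # placeholder counts
print(f"sensitivity = {se:.0%}, specificity = {sp:.0%}")
print(f"kappa = {cohens_kappa(13, 4, 1, 20):.2f}")
```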
Abstract:
Past studies on personnel selection have demonstrated that a supervisor's advice to discriminate can lead to compliant behaviour. This study aimed to extend those findings by examining what can overcome the powerful influence of hierarchy. Fifty Swiss managers participated in an in-basket exercise. The main task was to evaluate Swiss candidates (in-group) and foreign candidates (out-groups: Spanish and Kosovo Albanian) and to select two applicants for a job interview. The main result was that codes of conduct prevented discrimination against out-group applicants even in the presence of a supervisor's advice to prefer in-group members. However, when participants were accountable to an audience, this beneficial effect disappeared because participants followed the supervisor's advice. The second aim was to assess whether differences in responses between participants were related to differences in their moral attentiveness. Results showed some significant relationships, although not always in the expected direction.
Abstract:
The synthesis and NMR analysis of seven new 4-(aryl)amino-5-carboethoxy-1,3-dimethyl-1H-pyrazolo[3,4-b]pyridines (7-13) are described. The synthetic approach involved the preparation of the intermediates 5-aminopyrazole (4), the enamine derivative (5), and 4-chloro-1H-pyrazolo[3,4-b]pyridine (6). Compounds (7-13) were obtained by treatment of 6 with the desired aniline. The structures of the new heterocyclic compounds and their precursor intermediates were assigned on the basis of spectral analysis, including 1D and 2D NMR experiments [¹H; ¹³C{¹H} and DEPT; ¹H × ¹H COSY; ¹H × ¹³C COSY, nJ(CH), n = 1, 2 or 3 (HETCOR and COLOC)].
Abstract:
A view of the general aspects of 2D NMR spectroscopy using inverse detection and field-gradient techniques is presented through the analysis of a sesquiterpene.
Abstract:
In this work we study the acquisition of Catalan by speakers whose first language is Panjabi, focusing on complex word-final codas in learners at a beginner level of Catalan.
Abstract:
The general methodology of classical trajectories as applied to elementary chemical reactions of the A + BC type is presented. The goal is to acquaint students with the main theoretical features and the potential of applying this versatile method to calculate the dynamical properties of reactive systems. Only the methodology for the two-dimensional (2D) case is described, from which the general theory for 3D follows straightforwardly. The adopted point of view is, as far as possible, one that allows a direct translation of the concepts into a working program. An application to the reaction O(¹D) + H2 → OH + H, which is relevant in atmospheric chemistry, is also presented. The FORTRAN codes used are available through the web page www.qqesc.qui.uc.pt.
Abstract:
Steganography is an information-hiding application which aims to hide secret data imperceptibly in a cover object. In this paper, we describe a novel coding method based on Z2Z4-additive codes in which data is embedded by distorting each cover symbol by one unit at most (±1 steganography). This method is optimal and solves the problem encountered by the most efficient methods known today concerning the treatment of boundary values. The performance of this new technique is compared with that of the aforementioned methods and with the well-known rate-distortion upper bound, to conclude that a higher payload can be obtained for a given distortion by using the proposed method.
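The Z2Z4-additive construction itself is not reproduced here; for orientation, a minimal sketch of the plain ±1 (LSB-matching) baseline it improves upon, including the naive boundary fix-up at 0 and 255 that the abstract says the proposed method handles optimally:

```python
# Minimal sketch of plain ±1 (LSB-matching) embedding, the baseline that the
# Z2Z4-additive construction improves upon; the paper's actual coding method
# is not reproduced here. Each cover symbol changes by at most one unit.
import random

def embed_pm1(cover, bits, lo=0, hi=255):
    """Embed one bit per cover symbol; any +-1 change flips the LSB parity."""
    stego = list(cover)
    for i, b in enumerate(bits):
        if stego[i] % 2 != b:                 # LSB already matches? leave it
            step = random.choice((-1, 1))
            if not lo <= stego[i] + step <= hi:
                step = -step                  # naive boundary fix-up at 0 / 255
            stego[i] += step
    return stego

def extract_pm1(stego, n_bits):
    return [s % 2 for s in stego[:n_bits]]

cover = [12, 0, 255, 77, 130, 31]
bits = [1, 0, 1, 1, 0, 0]
stego = embed_pm1(cover, bits)
assert extract_pm1(stego, len(bits)) == bits
print(cover, "->", stego)
```

A coded method such as the one described above embeds more bits per unit of distortion than this one-bit-per-symbol baseline, which is what the comparison against the rate-distortion bound quantifies.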