991 results for Inherent chirality


Relevance: 10.00%

Abstract:

White micas in carbonate-rich tectonites and a few other rock types of large thrusts in the Swiss Helvetic fold-and-thrust belt have been analyzed by Ar-40/Ar-39 and Rb/Sr techniques to better constrain the timing of Alpine deformation in this region. Incremental Ar-40/Ar-39 heating experiments on 25 weakly metamorphosed (anchizone to low greenschist) samples yield plateau and staircase spectra. We interpret most of the staircase release spectra as resulting from variable mixtures of syntectonic (neoformed) and detrital micas. The range of dates obtained within individual spectra depends primarily on the duration of mica nucleation and growth and on the relative proportions of neoformed and detrital mica. Rb/Sr analyses of 12 samples yield dates of ca. 10-39 Ma (excluding one anomalously young sample). These dates are slightly younger than the Ar-40/Ar-39 total gas dates obtained for the same samples. The Rb/Sr dates were calculated using initial Sr-87/Sr-86 ratios obtained from the carbonate-dominated host rocks, which are higher than normal Mesozoic carbonate values due to exchange with fluids of higher Sr-87/Sr-86 ratios (and lower O-18/O-16 ratios). Model dates calculated using Sr-87/Sr-86 values typical of Mesozoic marine carbonates more closely approximate the Ar-40/Ar-39 total gas dates for most of the samples. The similarity of the Rb/Sr and Ar-40/Ar-39 total gas dates is consistent with limited amounts of detrital mica in the samples. The delta-O-18 values range from 24 to 15‰ (VSMOW) for the 2-6 μm micas and from 27 to 16‰ for the carbonate host rocks. The carbonate values are significantly lower than their protolith values due to localized fluid-rock interaction and fluid flow along most thrust surfaces. Although most calcite-mica pairs are not in oxygen isotope equilibrium at temperatures of ca. 200-400 °C, their isotopic fractionations are indicative of either 1) partial exchange between the minerals and a common external fluid, or 2) growth or isotopic exchange of the mica with the carbonate after the carbonate had isotopically exchanged with an external fluid. The geological significance of these results is not easily or uniquely determined, which exemplifies the difficulties inherent in dating very fine-grained micas of highly deformed tectonites in low-grade metamorphic terranes. Two generalizations can nevertheless be made regarding the dates obtained from the Helvetic thrusts: 1) samples from the two highest thrusts (Mt. Gond and Sublage) have all of their Ar-40/Ar-39 steps above 20 Ma, and 2) most samples from the deepest Helvetic thrusts have steps (often accounting for more than 80% of the Ar-39 release) between 15 and 25 Ma. These dates are consistent with the order of thrusting in the foreland-imbricating system and with increasing proportions of neoformed relative to detrital mica in the more metamorphosed hinterland and deeply buried portions of the nappe pile. Individual thrusts accommodated the majority of their displacement during their initial incorporation into the foreland-imbricating system, and some thrusts remained active or were reactivated until ca. 15 Ma.
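The sensitivity of the Rb/Sr model dates to the assumed initial Sr-87/Sr-86 ratio can be made concrete with the single-sample model-date equation. The sketch below uses invented mica values and the conventional Rb-87 decay constant; it illustrates the principle only and is not a recalculation of the study's data.

```python
import math

LAMBDA_RB87 = 1.42e-11  # conventional Rb-87 decay constant, 1/yr

def rb_sr_date(sr_measured, sr_initial, rb87_sr86):
    """Single-sample Rb/Sr model date in years:
    t = (1/lambda) * ln(1 + (Sr_m - Sr_0) / (Rb-87/Sr-86))."""
    return math.log(1.0 + (sr_measured - sr_initial) / rb87_sr86) / LAMBDA_RB87

# Invented mica: measured Sr-87/Sr-86 = 0.7205, Rb-87/Sr-86 = 50
for sr0, label in [(0.7090, "fluid-elevated carbonate initial ratio"),
                   (0.7075, "typical Mesozoic marine carbonate initial ratio")]:
    print(f"{label}: {rb_sr_date(0.7205, sr0, 50.0) / 1e6:.1f} Ma")
```

A lower (marine) initial ratio attributes more of the measured radiogenic Sr-87 to in-situ decay and therefore yields an older model date, which is why model dates computed with marine-carbonate values approach the Ar-40/Ar-39 total gas dates.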

Relevance: 10.00%

Abstract:

Research produced during a research stay at Columbia Law School in New York, United States, between September and November 2006. European Community and Spanish law regulate compensation for damage caused by the use of, or proximity to, a defective product, thanks largely to the ideological impulse exerted by American case law. The strict-liability principle governing the European directive is the fruit of a shift that took place in the United States in the 1960s, coinciding with the technological revolution and the beginning of mass production and mass consumption. These phenomena prompted the search for legal mechanisms capable of channelling compensation for the damage inherent in technologically advanced industrial activities. Their main effect was a concern for a fairer social distribution of the so-called "costs of progress", a concern which, legally, led to the solution of holding the manufacturer liable, even without fault, for damage arising from its industrial production. The merit of this solution belongs to certain American theorists of enterprise liability who, drawing on ideas formulated at the beginning of the twentieth century by labour-law specialists, concluded that the producing enterprise is best placed to bear the cost of industrial accidents: by imposing on the manufacturer a liability detached from any fault in causing the accident, the manufacturer will pass on in the price of its products the cost of the liability insurance it is forced to take out to meet its strict (risk-based) liability, so that the cost of accidents ends up being borne by the consuming public through the higher price of the products it buys. The repercussions of this construction have been both legislative and judicial.

Relevance: 10.00%

Abstract:

Assays that measure a patient's immune response play an increasingly important role in the development of immunotherapies. The inherent complexity of these assays and independent protocol development between laboratories result in high data variability and poor reproducibility. Quality control through harmonization--based on integration of laboratory-specific protocols with standard operating procedures and assay performance benchmarks--is one way to overcome these limitations. Harmonization guidelines can be widely implemented to address assay performance variables. This process enables objective interpretation and comparison of data across clinical trial sites and also facilitates the identification of relevant immune biomarkers, guiding the development of new therapies.

Relevance: 10.00%

Abstract:

The OFDM modulation scheme is used in a variety of broadband applications, both wired and wireless. It offers numerous advantages over single-carrier broadband systems, including high spectral efficiency, simple equalization, and reduced inter-symbol interference (ISI). On the other hand, it suffers from difficulties inherent to its structure that must be overcome, chief among them its demanding synchronization requirements. This project presents time- and frequency-synchronization methods implemented and evaluated on a Matlab®-based software platform that models the complete transmission chain, closely following the DVB-T standard. After an introduction to the principles of OFDM modulation, this document presents a detailed study of the transmission system and its implementation, which together form a simulation platform for evaluating the implemented estimators.
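The abstract does not detail the estimators themselves. As a hedged sketch of one standard approach for cyclic-prefix OFDM systems such as DVB-T (written in Python rather than on the project's Matlab platform, with invented toy parameters and a noise-free channel to keep it deterministic), the example below estimates symbol timing and fractional carrier-frequency offset from the cyclic-prefix correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (a real DVB-T 2k mode would use N = 2048)
N = 256        # FFT size
CP = 32        # cyclic-prefix length
cfo = 0.05     # true fractional carrier-frequency offset (subcarrier spacings)
start = 100    # true symbol start within the received buffer

# One OFDM symbol: QPSK on all subcarriers, IFFT, unit average power, prepend CP
qpsk = (rng.integers(0, 2, N) * 2 - 1) + 1j * (rng.integers(0, 2, N) * 2 - 1)
sym = np.fft.ifft(qpsk) * np.sqrt(N / 2)
tx = np.concatenate([sym[-CP:], sym])

# Received buffer: zero samples around the symbol, then a frequency offset
rx = np.concatenate([np.zeros(start), tx, np.zeros(100)])
rx = rx * np.exp(2j * np.pi * cfo * np.arange(len(rx)) / N)

# CP correlation: the prefix repeats N samples later, so correlating the
# buffer with itself at lag N over a CP-long window peaks at the symbol start
metric = np.array([np.abs(np.sum(rx[d:d + CP] * np.conj(rx[d + N:d + N + CP])))
                   for d in range(len(rx) - N - CP)])
d_hat = int(np.argmax(metric))

# The phase of the same correlation encodes the fractional CFO
phase = np.angle(np.sum(rx[d_hat:d_hat + CP] * np.conj(rx[d_hat + N:d_hat + N + CP])))
cfo_hat = -phase / (2 * np.pi)

print(d_hat, round(cfo_hat, 4))
```

In a real receiver the same metric is averaged over many symbols and followed by pilot-aided fine synchronization; the CP method only resolves CFO up to an integer number of subcarrier spacings.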

Relevance: 10.00%

Abstract:

In the theoretical macroeconomics literature, fiscal policy is almost uniformly taken to mean taxing and spending by a ‘benevolent government’ that exploits the potential aggregate demand externalities inherent in the imperfectly competitive nature of goods markets. Whilst shown to raise aggregate output and employment, these policies crowd out private consumption and hence typically reduce welfare. In this paper we consider the use of ‘tax-and-subsidise’ instead of ‘tax-and-spend’ policies, on account of their widespread use by governments, even in the recent recession, to stimulate economic activity. Within a static general equilibrium macro-model with imperfectly competitive goods markets, we examine the effect of wage and output subsidies and show that, for a small open economy, positive tax and subsidy rates exist which maximise welfare, rendering non-intervention a suboptimal state. We also show that, within a two-country setting, a Nash non-cooperative symmetric equilibrium with positive tax and subsidy rates exists, and that cooperation between trading partners in setting these rates is more expansionary and leads to an improvement upon the non-cooperative solution.

Relevance: 10.00%

Abstract:

This paper analyses rational-expectations (RE) macromodels from a methodological perspective. It proposes a particular property, robustness, which should be considered a necessary feature of scientifically valid models in economics but which is absent from many RE macromodels. To restore this property, many macroeconomists resort to detailed and implausible assumptions, which take their models a long way from simple rational expectations. The paper draws attention to the problems inherent in the technique of local linearisation and concludes by proposing the use of nonlinear models, analysed globally.

Relevance: 10.00%

Abstract:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, similarities that are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means for enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth.
Our results therefore indicate that spectral balancing techniques in general and spectral blueing in particular represent simple, yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
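The spectral balancing discussed above was applied to real georadar traces; as a hedged illustration of the general idea (not the authors' exact implementation), the sketch below performs band-limited spectral whitening or blueing on a synthetic trace: the amplitude spectrum inside a chosen band is replaced by a flat or gently rising target shape while the phase spectrum is preserved. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000.0  # toy sampling frequency, Hz

# Synthetic "trace": random reflectivity blurred by a low-pass wavelet,
# so high frequencies are attenuated (poor vertical resolution)
refl = rng.standard_normal(1024)
k = np.arange(-8, 9)
wavelet = np.exp(-0.5 * (k / 2.0) ** 2)
trace = np.convolve(refl, wavelet, mode="same")

def spectral_balance(x, f_lo, f_hi, fs, slope=0.0, eps=1e-6):
    """Band-limited spectral whitening (slope=0) or blueing (slope>0):
    set the amplitude spectrum in [f_lo, f_hi] to a flat or linearly
    rising target while keeping the phase spectrum intact."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, d=1.0 / fs)
    band = (f >= f_lo) & (f <= f_hi)
    target = np.zeros_like(f)
    target[band] = 1.0 + slope * (f[band] - f_lo) / (f_hi - f_lo)
    gain = target / (np.abs(X) + eps)  # desired / current amplitude
    return np.fft.irfft(X * gain, n=x.size)

whitened = spectral_balance(trace, 20.0, 200.0, fs)            # flat band
blued = spectral_balance(trace, 20.0, 200.0, fs, slope=1.0)    # rising band
```

Because only amplitudes are rescaled, reflection arrival times (carried by the phase) are untouched; the `slope` parameter is a simple stand-in for whatever blueing law a practitioner would fit to well data.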

Relevance: 10.00%

Abstract:

The selectivity of Bacillus thuringiensis toxins is determined both by the toxin structure and by factors inherent to the insect. These toxins contain distinct domains that appear to be functionally important in toxin binding to protein receptors in the midgut of susceptible insects, and in the subsequent formation of a pore in the insect midgut epithelium. In this article, features necessary for the insecticidal activity of these toxins are discussed. These include toxin structure, toxin processing in the insect midgut, the identification of toxin receptors in susceptible insects, and toxin pore formation in midgut cells. In addition, a number of B. thuringiensis toxins act synergistically to exert their full insecticidal activity. This synergistic action is critical not only for expressing the insecticidal activity of these toxins, but could also play a role in delaying the onset of insect resistance.

Relevance: 10.00%

Abstract:

Indirect calorimetry based on respiratory exchange measurement has been successfully used since the beginning of the century to obtain an estimate of heat production (energy expenditure) in human subjects and animals. The errors inherent to this classical technique can stem from various sources: 1) the model of calculation and its assumptions, 2) the calorimetric factors used, 3) technical factors and 4) human factors. The physiological and biochemical factors influencing the interpretation of calorimetric data include a change in the size of the bicarbonate and urea pools and the accumulation or loss (via breath, urine or sweat) of intermediary metabolites (gluconeogenesis, ketogenesis). More recently, respiratory gas exchange data have been used to estimate substrate utilization rates in various physiological and metabolic situations (fasting, post-prandial state, etc.). It should be recalled that indirect calorimetry provides an index of overall substrate disappearance rates, which is incorrectly assumed to be equivalent to substrate "oxidation" rates. Unfortunately, there is no adequate gold standard to validate whole-body substrate "oxidation" rates, in contrast to the "validation" of heat production by indirect calorimetry through the use of direct calorimetry under strict thermal equilibrium conditions. Tracer techniques using stable (or radioactive) isotopes represent an independent way of assessing substrate utilization rates. When carbohydrate metabolism is measured with both techniques, indirect calorimetry generally provides glucose "oxidation" rates consistent with those from isotopic tracers, but only when certain metabolic processes (such as gluconeogenesis and lipogenesis) are minimal and/or when the respiratory quotients are not at the extremes of the physiological range. However, it is believed that the tracer techniques underestimate true glucose "oxidation" rates because they fail to account for glycogenolysis in the tissues storing glucose, since this glucose escapes the systemic circulation. A major advantage of isotopic techniques is that they are able to estimate (given certain assumptions) various metabolic processes (such as gluconeogenesis) in a noninvasive way. Furthermore, when a fourth substrate (such as ethanol) is administered in addition to the three macronutrients, isotopic quantification of substrate "oxidation" allows one to eliminate the inherent assumptions made by indirect calorimetry. In conclusion, isotopic tracer techniques and indirect calorimetry should be considered complementary, in particular because the tracer techniques require the measurement of carbon dioxide production obtained by indirect calorimetry. It should nevertheless be kept in mind that the assessment of substrate oxidation by indirect calorimetry may involve large errors, particularly over short periods of time. By indirect calorimetry, energy expenditure (heat production) is calculated with substantially less error than substrate oxidation rates.
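To make the calculation concrete, here is a minimal sketch of how respiratory gas exchange is converted into energy expenditure and net substrate "disappearance" rates. The coefficients follow the commonly used Weir and Frayn (1983) formulations, which may differ slightly from those used in any particular study; the input values are invented resting values.

```python
def indirect_calorimetry(vo2, vco2, n_urinary):
    """Classical indirect-calorimetry estimates.

    vo2, vco2 : O2 consumption and CO2 production (L/min, STPD)
    n_urinary : urinary nitrogen excretion (g/min)
    Returns RQ, energy expenditure (kcal/min) and net carbohydrate,
    fat and protein "oxidation" rates (g/min) - strictly, as the text
    notes, net disappearance rates.
    """
    rq = vco2 / vo2                                    # respiratory quotient
    ee = 3.941 * vo2 + 1.106 * vco2 - 2.17 * n_urinary  # protein-corrected Weir
    cho_ox = 4.55 * vco2 - 3.21 * vo2 - 2.87 * n_urinary  # Frayn stoichiometry
    fat_ox = 1.67 * (vo2 - vco2) - 1.92 * n_urinary
    prot_ox = 6.25 * n_urinary                          # g protein per g nitrogen
    return rq, ee, cho_ox, fat_ox, prot_ox

# Invented resting values: VO2 = 0.25 L/min, VCO2 = 0.20 L/min, N = 0.008 g/min
rq, ee, cho, fat, prot = indirect_calorimetry(0.25, 0.20, 0.008)
print(f"RQ={rq:.2f}  EE={ee:.2f} kcal/min  CHO={cho:.3f}  fat={fat:.3f}  protein={prot:.3f} g/min")
```

Note how the carbohydrate and fat terms carry opposite signs in VO2 and VCO2: near the extremes of the RQ range, small measurement errors in either gas produce large errors in the substrate split, while the energy-expenditure estimate stays comparatively stable, exactly the asymmetry the abstract describes.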

Relevance: 10.00%

Abstract:

AIM: Although acute pain is frequently reported by patients admitted to the emergency room (ER), it is often insufficiently evaluated by physicians and is thus undertreated. With the aim of improving the care of adult patients with acute pain, we developed and implemented abbreviated clinical practice guidelines for the nursing and physician staff of our hospital's emergency room. METHODS: Our algorithm is based upon the practices described in the international literature and treats acute pain rapidly and efficaciously while diagnostic and therapeutic procedures proceed in parallel. RESULTS: Pain was assessed using either a visual analogue scale (VAS) or a numerical rating scale (NRS) at ER admission and again during the hospital stay. Patients were treated with paracetamol and/or an NSAID (VAS/NRS <4) or intravenous morphine (VAS/NRS ≥4). The algorithm also outlines a specific approach for patients with headaches to minimise the risks inherent to non-specific treatment. In addition, our algorithm addresses the treatment of paroxysmal pain in patients with chronic pain as well as acute pain in drug addicts. It also outlines measures for pain prevention prior to minor diagnostic or therapeutic procedures. CONCLUSIONS: Based on published guidelines, an abbreviated clinical algorithm was developed whose simple format permitted widespread implementation. In contrast to international guidelines, our algorithm favours giving nursing staff responsibility for decision-making aspects of pain assessment and treatment in emergency room patients.
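As a toy illustration of the initial analgesia threshold described above (not the full algorithm, which also covers headache, chronic paroxysmal pain, drug addiction and procedural pain prevention), a sketch with an invented function name:

```python
def initial_analgesia(pain_score):
    """First-line treatment choice from a VAS/NRS pain score (0-10),
    following the abbreviated algorithm's threshold: <4 -> non-opioid,
    >=4 -> titrated intravenous morphine. Toy sketch only; real care
    involves contraindication checks and repeated reassessment."""
    if not 0 <= pain_score <= 10:
        raise ValueError("VAS/NRS score must be between 0 and 10")
    if pain_score < 4:
        return "paracetamol and/or NSAID"
    return "intravenous morphine (titrated)"

print(initial_analgesia(3))
print(initial_analgesia(7))
```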

Relevance: 10.00%

Abstract:

In this study, we analyse the degree of polarisation (a concept fundamentally different from that of inequality) in the international distribution of CO2 emissions per capita in the European Union. It is analytically relevant to examine the degree of instability inherent to a distribution and, in the case analysed here, the likelihood that the distribution and its evolution will increase or decrease the chances of reaching an agreement. Two approaches were used to measure polarisation: the endogenous approach, in which countries are grouped according to their similarity in terms of emissions, and the exogenous approach, in which countries are grouped geographically. Our findings indicate a clear decrease in polarisation since the mid-1990s, which can essentially be explained by the fact that the different groups of countries have converged (i.e. antagonism among the CO2 emitters has decreased) as the contribution of energy intensity to between-group differences has decreased. This lower degree of polarisation in the CO2 distribution suggests a situation more conducive to reaching EU-wide agreements on the mitigation of CO2 emissions.
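The abstract does not specify the index used; one standard choice for measuring polarization between groups is the Esteban-Ray index, sketched below on invented per-capita emissions data. The toy numbers merely reproduce the qualitative finding that convergence of group means lowers polarization.

```python
def esteban_ray(pop_shares, group_means, alpha=1.0):
    """Esteban-Ray polarization index:
    P = sum_i sum_j pi_i^(1+alpha) * pi_j * |y_i - y_j|,
    where pi_i are group population shares and y_i group means (here,
    CO2 emissions per capita). alpha > 0 weights within-group
    identification; alpha = 0 reduces to the mean absolute difference
    between groups."""
    return sum(pi ** (1 + alpha) * pj * abs(yi - yj)
               for pi, yi in zip(pop_shares, group_means)
               for pj, yj in zip(pop_shares, group_means))

shares = [0.5, 0.5]                           # two equal-sized country groups
p_early = esteban_ray(shares, [4.0, 12.0])    # mid-1990s: groups far apart
p_late = esteban_ray(shares, [7.0, 9.0])      # later: groups have converged

print(p_early, p_late)
```

The endogenous/exogenous distinction in the study corresponds to how the groups are formed before the index is evaluated: by clustering countries on emissions similarity, or by fixed geographic blocks.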

Relevance: 10.00%

Abstract:

The young child's ability to go through a genuine mourning process has been a source of controversy in the psychoanalytical literature. This may seem surprising, considering that mourning is essential for, and inherent to, psychic development. This paper attempts to show that the young child's ability to go through a mourning process does not depend mainly on ego maturity, nor just on an acknowledgment of a loss in the external world, nor on the child's understanding of the idea of death at an intellectual and cognitive level, but rather on the establishment of the primordial mourning process inherent to the separation of the transnarcissistic mother-child relation and on the existence of the objectalizing function (A. Green, 1986) in the psyche of the remaining or substitute parent. A clinical example serves to illustrate these hypotheses.

Relevance: 10.00%

Abstract:

Ultrasound segmentation is a challenging problem due to the inherent speckle and artifacts such as shadows, attenuation and signal dropout. Existing methods need to include strong priors, such as shape priors or analytical intensity models, to succeed in the segmentation. However, such priors tend to limit these methods to a specific target or imaging setting, and they are not always applicable to pathological cases. This work introduces a semi-supervised segmentation framework for ultrasound imaging that alleviates these limitations of fully automatic segmentation: it is applicable to any kind of target and imaging setting. Our methodology uses a graph of image patches to represent the ultrasound image and user-assisted initialization with labels, which act as soft priors. The segmentation problem is formulated as a continuous minimum cut problem and solved with an efficient optimization algorithm. We validate our segmentation framework on clinical ultrasound images (prostate, fetus, and tumors of the liver and eye). We obtain high similarity agreement with ground truth provided by medical expert delineations in all applications (an average Dice coefficient of 94%), and the proposed algorithm compares favorably with the literature.
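The paper solves a continuous minimum cut with soft seed priors and an efficient optimizer; as a much-simplified discrete stand-in for the same idea, the toy below builds a patch graph over a 1-D row of invented intensities, wires user seeds to the terminals (hard labels here, unlike the paper's soft priors), and runs a plain Edmonds-Karp s-t cut: similar neighbouring patches get high-capacity edges, so the cut falls on the intensity boundary.

```python
from collections import deque
import numpy as np

# Toy 1-D "image" of patch intensities: dark object, bright background
intens = np.array([0.1, 0.15, 0.12, 0.9, 0.85, 0.95])
seeds = {0: 'obj', 5: 'bg'}   # user-provided labels (hard, for simplicity)

n = len(intens)
S, T = n, n + 1               # source = object terminal, sink = background
cap = np.zeros((n + 2, n + 2))

# Pairwise (smoothness) edges: high capacity between similar neighbours
for i in range(n - 1):
    w = np.exp(-((intens[i] - intens[i + 1]) ** 2) / 0.01)
    cap[i, i + 1] = cap[i + 1, i] = w

# Terminal (data) edges from the seeds
BIG = 1e6
for i, lab in seeds.items():
    if lab == 'obj':
        cap[S, i] = BIG
    else:
        cap[i, T] = BIG

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow; returns the residual capacity matrix."""
    res = cap.copy()
    while True:
        parent = {s: None}                  # BFS for an augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in range(len(res)):
                if v not in parent and res[u, v] > 1e-12:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return res
        path, v = [], t                     # trace path, find bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        f = min(res[u, w] for u, w in path)
        for u, w in path:                   # push flow
            res[u, w] -= f
            res[w, u] += f

res = max_flow(cap, S, T)

# Object = patches still reachable from the source in the residual graph
reach, q = {S}, deque([S])
while q:
    u = q.popleft()
    for v in range(n + 2):
        if v not in reach and res[u, v] > 1e-12:
            reach.add(v)
            q.append(v)
labels = ['obj' if i in reach else 'bg' for i in range(n)]
print(labels)
```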

Relevance: 10.00%

Abstract:

In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after having agreed on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar, it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of producing electricity with nuclear energy to the analogous unit operations of the process of producing fossil energy, we see that the various phases of the process are the same. The only difference relates to the characteristics of the process associated with the generation of heat, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for those phases that nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes.
By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.

Relevance: 10.00%

Abstract:

French verb morphology has always been a major challenge for learners as well as teachers of French as a foreign language. Learning difficulties arise not only from the inherent complexity of the conjugation system itself, but mostly from the traditional description found in specialized books, grammars, etc. French spelling alone inflates the apparent complexity of actual oral verb morphology by more than 60%, thus hindering efficient learning. Following Dubois (1967), Csécsy (1968), Pouradier Duteil (1997), etc., I suggest an alternative approach, based exclusively on phonetic transcription and starting with plural forms instead of singular ones (Mayer 1969). For more than 500 verbs of the 2nd and 3rd groups, this strategy allows learners first to memorize the present-tense plural form, e.g. /illiz/ (ils lisent, "they read"), and then take away the stem's final consonant to obtain the singular /illi/ (il lit, "he reads").
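The plural-to-singular rule described above can be sketched mechanically. The snippet below uses a rough ASCII rendering of the phonetic stems (the transcriptions and example list are illustrative, not taken from the study's 500-verb inventory):

```python
# Pedagogical rule: for many 2nd/3rd-group French verbs, the singular
# present stem is the plural stem minus its final consonant.
def singular_from_plural(plural_stem: str) -> str:
    """Drop the stem-final consonant of the plural form, e.g. 'liz' -> 'li'."""
    return plural_stem[:-1]

# Rough ASCII phonetic stems (illustrative examples only)
examples = {
    "liz": "lire (ils lisent -> il lit)",            # /liz/ -> /li/
    "finis": "finir (ils finissent -> il finit)",    # /finis/ -> /fini/
}
for plural, verb in examples.items():
    print(plural, "->", singular_from_plural(plural), "|", verb)
```

The point of the pedagogy is that this single deletion rule, invisible in spelling, is regular in the spoken forms, which is why the approach starts from phonetic transcription rather than orthography.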