994 results for weak key-IV combinations


Relevance: 30.00%

Abstract:

The title compound, {[Mn(C₁₀H₂₈N₆)][Sn₃Se₇]}ₙ, consists of anionic ∞{[Sn₃Se₇]²⁻} layers interspersed with [Mn(peha)]²⁺ complex cations (peha is pentaethylenehexamine). Pseudo-cubic (Sn₃Se₄) cluster units within each layer are held together to form a 6³ net with a hole size of 8.74 × 13.87 Å. Weak N–H⋯Se interactions between the host inorganic frameworks and the metal complexes extend the components into a three-dimensional network. The incorporation of the metal complexes into the flexible anion layer dictates the distortion of the holes.

Relevance: 30.00%

Abstract:

A method for localization and positioning in an indoor environment is presented. The method is based on representing the scene as a set of 2D views and predicting the appearances of novel views by linear combinations of the model views. The method is accurate under weak perspective projection. Analysis of this projection, together with experimental results, demonstrates that in many cases it is sufficient to describe the scene accurately. When the weak perspective approximation is invalid, an iterative solution can be employed to account for the perspective distortions. A simple algorithm for repositioning, the task of returning to a previously visited position defined by a single view, is derived from this method.
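As an illustration of the view-combination step described above, here is a minimal Python sketch (not the authors' implementation) that expresses a novel 2D view as a least-squares linear combination of stored model views, assuming point correspondences across views are already known; the residual can then serve as a crude repositioning score.

    import numpy as np

    def fit_view_combination(model_views, novel_view):
        """Least-squares coefficients expressing a novel view as a linear
        combination of model views (plus a constant term), given point
        correspondences.  model_views: list of (n, 2) arrays; novel_view: (n, 2)."""
        n = novel_view.shape[0]
        # Under weak perspective, the image coordinates of a novel view lie in the
        # span of the model views' x and y coordinate vectors and a constant term.
        basis = np.column_stack([v[:, 0] for v in model_views]
                                + [v[:, 1] for v in model_views]
                                + [np.ones(n)])
        coeffs, *_ = np.linalg.lstsq(basis, novel_view, rcond=None)
        residual = np.linalg.norm(basis @ coeffs - novel_view)
        return coeffs, residual

    # Repositioning idea: a small residual (and coefficients close to a stored
    # reference) indicates the camera is near the previously visited position.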

Relevance: 30.00%

Abstract:

Weak references are references that do not prevent the object they point to from being garbage collected. Most realistic languages, including Java, SML/NJ, and OCaml to name a few, have some facility for programming with weak references. Weak references are used to implement idioms such as memoizing functions and hash-consing while avoiding potential memory leaks. However, the semantics of weak references in many languages are not clearly specified, and without a formal semantics it is impossible to prove the correctness of implementations that make use of this feature. Previous work by Hallett and Kfoury extends λgc, a language for modeling garbage collection, to λweak, a similar language with weak references. Using this previously formalized semantics for weak references, we consider two issues related to the well-behavedness of programs. First, we provide a new, simpler proof of the well-behavedness of the syntactically restricted fragment of λweak defined previously. Second, we give a natural semantic criterion for well-behavedness that is much broader than the syntactic restriction and that is useful as a principle for programming with weak references. Furthermore, we extend a result proved previously for λgc that allows one to use type inference to collect some reachable objects that are never used. We prove that this result holds for our language, and we extend it to allow the collection of weakly referenced reachable garbage without incurring the computational overhead sometimes associated with collecting weak bindings (e.g. the need to recompute a memoized function). Lastly, we extend the semantic framework to model the key/value weak references found in Haskell, and we prove that the Haskell-style semantics is equivalent to a simpler semantics owing to the lack of side effects in our language.
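For readers unfamiliar with the idiom, the following minimal Python sketch (illustrative only; the paper's formal treatment is in λweak, not Python) shows a memo table built on weak references: entries vanish once the memoized result is no longer reachable elsewhere, avoiding a leak at the cost of possible recomputation.

    import weakref

    class Result:
        """Stand-in for an expensive-to-build object (weak-referenceable)."""
        def __init__(self, key):
            self.key = key

    # Values in this table do not keep results alive: once no strong reference
    # remains elsewhere, the collector may reclaim the object and its entry
    # silently disappears, so the cache itself cannot cause a memory leak.
    _memo = weakref.WeakValueDictionary()

    def compute(key):
        try:
            return _memo[key]            # reuse a result that is still alive
        except KeyError:
            value = Result(key)          # may be recomputed after a collection,
            _memo[key] = value           # the overhead the abstract refers to
            return value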

Relevance: 30.00%

Abstract:

This thesis focuses on the synthesis and analysis of novel chloride-based platinum complexes derived from iminophosphine and phosphinoamide ligands, along with studies of their reactivity towards substitution and oxidation reactions. Also explored are the potential applications of these complexes for biological and luminescent purposes. Chapter one provides an extensive overview of platinum coordination chemistry, with examples of various mixed donor ligands, along with the history of platinum anticancer therapy. It also looks at metals in medicine, both for biological functions and for therapeutic purposes, and gives a background to some other applications of platinum complexes. Chapter two outlines the design and synthetic strategies employed for the development of novel platinum(II) chloride complexes from iminophosphine and phosphinoamide ligands. Also reported is the cyclometallation of these complexes to form stable tridentate mixed donor platinum(II) compounds. Chapter three reports the development of a direct method for displacing a chloride from a platinum metal centre with a desired phosphine. Several methods for successful oxidation of the platinum(II) complexes are also explored, leading to novel platinum(IV) complexes. The importance of stabilisation of the displaced chloride anion by the solvent system is also discussed in this chapter. Chapter four investigates the reactivity of the platinum(II) complexes towards two different biomolecules to form novel platinum bio-adducts. The potential application of the platinum(II) cyclometallates as chemotherapeutics is also explored using in vitro cancer cell testing. Finally, luminescence studies of the ligands and platinum complexes reported in chapters two and three are presented to investigate potential applications in this field. Chapter five provides a final conclusion and an overall summary of the entire project, as well as identifying key areas for future work.

Relevance: 30.00%

Abstract:

Along with the growing demand for cryptosystems in systems ranging from large servers to mobile devices, suitable cryptographic protocols for use under particular constraints are becoming more and more important. Constraints such as calculation time, area, efficiency and security must be considered by the designer. Since their introduction to public key cryptography in 1985, elliptic curves have challenged established public key and signature generation schemes such as RSA, offering more security per bit. Among elliptic curve based systems, pairing based cryptography has been thoroughly researched and can be used in many public key protocols, such as identity based schemes. For hardware implementations of pairing based protocols, all components that calculate operations over elliptic curves can be considered. Designers of pairing algorithms must choose the calculation blocks and arrange the basic operations carefully so that the implementation can meet the constraints on time and hardware resource area. This thesis deals with different hardware architectures to accelerate pairing based cryptosystems over fields of characteristic two. Using different top-level architectures, the hardware efficiency of operations that run at different times is first considered. Security is another important aspect of pairing based cryptography, particularly with respect to Side Channel Analysis (SCA) attacks. Naively implemented hardware accelerators for pairing based cryptography can be vulnerable once physical analysis attacks are taken into consideration. This thesis examines the weaknesses of pairing based public key cryptography and identifies the particular calculations in these systems that are insecure. Countermeasures should then be applied to protect these weak links of the implementation and strengthen the pairing based algorithms. Some important rules that designers must obey to improve the security of the cryptosystems are proposed. Following these rules, three countermeasures that protect pairing based cryptosystems against SCA attacks are applied. The implementations of the countermeasures are presented and their performances are investigated.
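As a software illustration of one countermeasure principle used against SCA, a regular, key-independent operation sequence, here is a minimal Python sketch of Montgomery-ladder modular exponentiation. It is not the pairing hardware or the specific countermeasures of the thesis, and Python arithmetic is not constant-time; it only demonstrates that the multiply/square pattern does not depend on the secret bits.

    def ladder_pow(base, exponent, modulus):
        """Modular exponentiation via the Montgomery ladder: every exponent bit
        triggers exactly one multiplication and one squaring, in the same order,
        so the operation sequence itself leaks nothing about the bits."""
        r0, r1 = 1, base % modulus
        for bit in format(exponent, 'b'):        # scan bits, most significant first
            if bit == '0':
                r1 = (r0 * r1) % modulus
                r0 = (r0 * r0) % modulus
            else:
                r0 = (r0 * r1) % modulus
                r1 = (r1 * r1) % modulus
        return r0

    # Sanity check against the built-in modular exponentiation.
    assert ladder_pow(7, 123, 1009) == pow(7, 123, 1009)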

Relevance: 30.00%

Abstract:

Autobiographical memories of trauma victims are often described as disturbed in two ways. First, the trauma is frequently re-experienced in the form of involuntary, intrusive recollections. Second, the trauma is difficult to recall voluntarily (strategically); important parts may be totally or partially inaccessible, a feature known as dissociative amnesia. These characteristics are often mentioned by PTSD researchers and are included as PTSD symptoms in the DSM-IV-TR (American Psychiatric Association, 2000). In contrast, we show that both involuntary and voluntary recall are enhanced by emotional stress during encoding. We also show that the PTSD symptom addressing dissociative amnesia (trouble remembering important aspects of the trauma) is less well correlated with the remaining PTSD symptoms than its conceptual reversal (having trouble forgetting important aspects of the trauma). Our findings contradict key assumptions that have shaped PTSD research over the last 40 years.

Relevance: 30.00%

Abstract:

Background In recent years there has been an increase in the provision of conscious sedation, which is said to be a safe and effective means of managing the anxious patient. However, there are no guidelines to aid the dental practitioner in assessing the patient's need for sedation based on their level of anxiety.

Aims and methods The present study investigated the importance of patient anxiety as an indicator for IV sedation, using focus groups to inform the development of narrative vignettes. Ninety-nine practitioners responded to a series of scenarios to determine whether the level of patient anxiety and the patient's demand for IV sedation influenced their decision making.

Results The level of dental anxiety had a stronger influence on clinicians' decision making than patient demand: increasing levels of dental anxiety were positively associated with both the likelihood of clinicians indicating a need for IV sedation and the likelihood of their providing IV sedation to these patients. Only 14% (n = 14) of respondents reported formally assessing dental anxiety.

Conclusions While dental anxiety is considered to be a key factor in determining the need for IV sedation, there is a lack of guidance regarding the assessment of anxiety among patients.

Relevance: 30.00%

Abstract:

We report calculations of energy levels, radiative rates and electron impact excitation rates for transitions in He-like Li II, Be III, B IV and C V. The general-purpose relativistic atomic structure package (GRASP) is adopted for calculating energy levels and radiative rates. For determining the collision strengths, and subsequently the excitation rates, the Dirac atomic R-matrix code (DARC) is used. Oscillator strengths, radiative rates and line strengths are reported for all E1, E2, M1 and M2 transitions among the lowest 49 levels of each ion. Collision strengths have been averaged over a Maxwellian velocity distribution, and the effective collision strengths so obtained are reported over a wide temperature range up to 10⁶ K. Comparisons have been made with similar data obtained from the flexible atomic code (FAC) to highlight the importance of the resonances, included in the DARC calculations, in the determination of effective collision strengths. Discrepancies between the collision strengths from DARC and FAC, particularly for weak transitions and at low energies, are also discussed. Additionally, lifetimes are listed for all calculated levels of the above four ions.
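For reference, the Maxwellian averaging step can be written as Υ(T) = ∫₀^∞ Ω(E_j) exp(−E_j/kT) d(E_j/kT), where E_j is the energy of the scattered electron. The short Python sketch below evaluates this integral numerically for a hypothetical, made-up Ω(E) tabulation; the paper's actual R-matrix data are not reproduced here.

    import numpy as np

    def effective_collision_strength(E_j, omega, T_kelvin):
        """Upsilon(T) = integral of Omega(E_j) * exp(-E_j/kT) d(E_j/kT),
        with E_j the scattered-electron energy in eV and T in kelvin."""
        k_eV_per_K = 8.617333262e-5              # Boltzmann constant, eV/K
        x = E_j / (k_eV_per_K * T_kelvin)        # reduced energy E_j / kT
        f = omega * np.exp(-x)
        return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))  # trapezoid rule

    # Hypothetical, smoothly decaying Omega(E) for a single transition,
    # only to show the call; real tabulations come from the R-matrix run.
    E = np.linspace(0.0, 50.0, 2000)             # eV
    Omega = 0.2 / (1.0 + E)
    print(effective_collision_strength(E, Omega, T_kelvin=1.0e5))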

Relevance: 30.00%

Abstract:

Porous layered hybrid materials have been prepared by the reaction of organo-bisphosphonate ligands, 4-(4'-phosphonophenoxy)phenylphosphonic, 4,4'-biphenylenebisphosphonic and phenylphosphonic acids, with metal(IV) cations (Zr and Sn). Crystalline Zr(IV) and Sn(IV) layered bisphosphonates, which were non-porous, were also prepared. The amorphous M(IV) bisphosphonates showed variable compositions and textural properties, ranging from mainly mesoporous to highly microporous solids, with BET surface areas of 300 to 480 m² g⁻¹, micropore volumes of 0.10-0.20 cm³ g⁻¹, and narrow pore size distributions for some materials. N₂ isotherms suggest that the Sn(IV) derivatives show a comparatively higher micropore contribution than the Zr(IV) analogues, at least for the ether-bisphosphonate hybrids. Sn(IV) bisphosphonates exhibit high microporosities without the need to use harmful DMSO as solvent. If the ether-bisphosphonic acid is partially replaced by the less expensive phenylphosphonic ligand, porous products are also obtained. ³¹P and ¹⁹F MAS NMR and XPS data revealed the presence of hydrogen-phosphonate groups and small anions (F⁻, Cl⁻ and OH⁻), which act as spacer ligands within the inorganic layers of these hybrid materials. The complexity of the inorganic layers is higher for the Sn(IV) bisphosphonates, likely due to the larger amount of small bridging anions, including fluoride. It is suggested that the presence of these small inorganic ligands may be a key factor influencing both the interaction of the inorganic layer with the bisphosphonate groups, which bridge the inorganic layers, and the generation of internal voids within a given inorganic layer. Preliminary studies of gas adsorption (H₂ and NO) have been carried out for selected Sn(IV) bisphosphonates. The H₂ adsorption capacity at 77 K and 1 bar was low, 0.26 wt%, but the NO adsorption capacity at ~1 bar and 298 K was relatively high, 4.2 wt%. Moreover, the hysteresis in the NO isotherms is indicative of partial, strong, irreversible adsorption of NO.

Relevance: 30.00%

Abstract:

The thriving and well-established field of Law and Society (also referred to as Sociolegal Studies) has diverse methodological influences; it draws on social-scientific and arts-based methods. The approach of scholars researching and teaching in the field often crosses disciplinary borders but, broadly speaking, Law and Society scholarship goes behind formalism to investigate how and why law operates, or does not operate as intended, in society. By exploring law's connections with broader social and political forces, both domestic and international, scholars gain valuable perspectives on ideology, culture, identity, and social life. Law and Society scholarship considers both law in context and context in law.
Law and Society flourishes today, perhaps as never before. Academic thinkers toil on the mundane and the local as well as the global, making major advances in the ways in which we think about both law and society. Especially over the last four decades, scholarly output has burgeoned rapidly, and this new title from Routledge's acclaimed Critical Concepts in Law series answers the need for an authoritative reference collection to help users make sense of the daunting quantity of serious research and thinking.
Edited by leading scholars in the field, Law and Society brings together the vital classic and contemporary contributions in four volumes. Volume I is dedicated to historical antecedents and precursors. The second volume covers methodologies and crucial themes. The third volume assembles key works on legal processes and professional groups, while the final volume of the collection focuses on substantive areas. Together, the volumes provide a one-stop 'mini library' enabling all interested researchers, teachers, and students to explore the origins of this thriving subdiscipline and to gain a thorough understanding of where it is today.

Relevance: 30.00%

Abstract:

The design and VLSI implementation of two key components of the class-IV partial response maximum likelihood (PR-IV) channel, the adaptive filter and the Viterbi decoder, are described. These blocks are implemented using parameterised VHDL modules from a library of common digital signal processing (DSP) and arithmetic functions. Design studies, based on 0.6 micron 3.3 V standard cell processes, indicate that worst-case sampling rates of 49 mega-samples per second are achievable for this system, with proportionally higher sampling rates for full custom designs and smaller-dimension processes. Significant increases in the sampling rate, from 49 MHz to approximately 180 MHz, can be achieved by operating four filter modules in parallel, and this implementation has 50% lower power consumption than a pipelined filter operating at the same speed.
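For readers unfamiliar with the detector, the following behavioural Python sketch (not the VHDL implementation described above) shows maximum-likelihood detection for the PR-IV response y[k] = x[k] − x[k−2] using a four-state Viterbi algorithm; the unit-amplitude channel model, noise level and zero initial state are illustrative assumptions.

    import itertools
    import numpy as np

    def pr4_viterbi(samples):
        """Maximum-likelihood detection for the PR-IV (1 - D^2) response
        y[k] = x[k] - x[k-2] + noise, x[k] in {0, 1}; zero initial channel state."""
        INF = float('inf')
        states = list(itertools.product((0, 1), repeat=2))    # (x[k-1], x[k-2])
        metric = {s: (0.0 if s == (0, 0) else INF) for s in states}
        path = {s: [] for s in states}
        for y in samples:
            new_metric = {s: INF for s in states}
            new_path = {s: [] for s in states}
            for (x1, x2), m in metric.items():
                if m == INF:
                    continue
                for x0 in (0, 1):                     # hypothesised current bit
                    cand = m + (y - (x0 - x2)) ** 2   # squared-error branch metric
                    nxt = (x0, x1)
                    if cand < new_metric[nxt]:
                        new_metric[nxt] = cand
                        new_path[nxt] = path[(x1, x2)] + [x0]
            metric, path = new_metric, new_path
        return path[min(states, key=lambda s: metric[s])]

    # Quick behavioural check on a noisy random sequence.
    bits = np.random.randint(0, 2, 200)
    padded = np.concatenate(([0, 0], bits))
    rx = padded[2:] - padded[:-2] + 0.2 * np.random.randn(len(bits))
    decoded = pr4_viterbi(rx)
    print("bit errors:", sum(d != b for d, b in zip(decoded, bits)))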

Relevance: 30.00%

Abstract:

A thin-layer chromatography (TLC)-bioautographic method was developed to detect dipeptidyl peptidase IV (DPP-IV) inhibitors in plant extracts. The basic principle of the method is that the enzyme (DPP-IV) hydrolyzes the substrate (Gly-Pro-p-nitroaniline) to p-nitroaniline (pNA), which is diazotized with sodium nitrite and then reacts with N-(1-naphthyl)ethylenediamine dihydrochloride to form a rose-red azo dye, providing a rose-red background on the TLC plates. DPP-IV inhibitors show up as white spots on this background because they block enzymolysis of the substrate to pNA. The method was validated with respect to selectivity, sensitivity, linearity, precision, recovery, and stability after optimizing key parameters including plate type, incubation time and temperature, concentrations of substrate, enzyme and derivatization reagents, and absorption wavelength. The results showed good linearity over the 0.01-0.1 μg range for the positive control, diprotin A, with a coefficient of determination (r²) of 0.9668. The limits of detection (LOD) and quantification (LOQ) were 5 and 10 ng, respectively. Recoveries ranged from 98.9% to 107.5%. The average intra- and inter-plate reproducibilities were in the range of 4.1-9.7% and 7.6-14.7%, respectively. Among the nine methanolic extracts of medicinal herbs screened for DPP-IV inhibitors by the newly developed method, Peganum nigellastrum Bunge was found to give one white active spot, which was isolated and identified as harmine. By a spectrophotometric method, harmine hydrochloride was found to have DPP-IV inhibitory activity of 32.4% at 10 mM, compared with 54.8% at 50 μM for diprotin A.

Relevance: 30.00%

Abstract:

ARTICLE 1 ABSTRACT: Traumatic amputation: a Laotian case study of indignation and injustice. Culture is an essential context to consider when producing a psychiatric diagnosis and treatment plan. A cultural perspective highlights the social context in which symptoms emerge and how they are interpreted and managed by the affected person. Ethnocultural studies of illness suggest that most people give culturally grounded explanations for their symptoms. Although these explanations contradict biomedical theory, they relieve patients' suffering and allow them to give it meaning. Exploring the characteristics, context and antecedents of the symptoms allows the patient to communicate them to a clinician who may hold a different explanation of the illness. This case study shows how the DSM-IV Outline for Cultural Formulation enables clinicians to elicit a patient's narrative of his or her experience of illness. Our study examines a Laotian patient's use of 'social indignation' ('Khuâm khum khang') as the cultural explanatory model of his problem, despite the diagnosis of post-traumatic stress disorder he received after a traumatic amputation. The cultural explanation of his problem allowed the patient to express the personal and collective meaning of his anger and frustration, emotions he had repressed. This cultural idiom allowed him to express his distress and to reflect on the health care system and, more specifically, on the context in which symptoms and their origins are recounted and evaluated. This Laotian representation also allowed the clinicians to understand experiences and explanations of the client that would otherwise be difficult to situate within a Euro-American biomedical and psychiatric context. This study shows how understanding the patient's perspective and using a cultural approach can improve interactions between clinicians and patients and, in turn, the quality of care. Keywords: culture, meaning, cultural idiom, explanatory model, DSM-IV Outline for Cultural Formulation, social indignation, patient-provider interaction.
ARTICLE 2 ABSTRACT: Impact of using the DSM-IV Outline for Cultural Formulation on the dynamics of multidisciplinary mental health case conferences. The growth of cultural pluralism in North America has obliged the mental health community to adopt greater cultural sensitivity in its practice. Mental health professionals must become aware of the historical and social context not only of their clientele but also of their own profession. The information required for professional care comes from clinical assessments. This information must be examined within a culturally sensitive framework in order to formulate a case assessment that allows clinicians to reach an accurate diagnosis across the cultural boundaries of the patient as well as those of the mental health professional. This situation prompted the development of the Outline for Cultural Formulation in the fourth edition of the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). The Outline is a tool to help clinicians obtain cultural information from the client and the client's family in order to guide the provision of mental health care. This study undertakes a conversation analysis of the multidisciplinary case conference as a context for using the Outline for Cultural Formulation, which serves as the framework within which the discursive practices of mental health professionals unfold. Using the theoretical perspective of symbolic interactionism, the study examines how the various mental health disciplines interpret and conceptualize cultural elements, and the implications of this framework for interdisciplinary collaboration in assessment, treatment planning and care. Keywords: Outline for Cultural Formulation, mental health, transcultural psychiatry, conversation analysis, symbolic interactionism.

Relevance: 30.00%

Abstract:

The last decade has seen growing interest in the econometric literature in the problems posed by weak instrumental variables, that is, situations in which the instrumental variables are only weakly correlated with the variable to be instrumented. It is well known that when instruments are weak, the distributions of the Student, Wald, likelihood ratio and Lagrange multiplier statistics are no longer standard and often depend on nuisance parameters. Several empirical studies, notably on models of returns to education [Angrist and Krueger (1991, 1995), Angrist et al. (1999), Bound et al. (1995), Dufour and Taamouti (2007)] and on asset pricing (C-CAPM) [Hansen and Singleton (1982, 1983), Stock and Wright (2000)], where the instrumental variables are weakly correlated with the variable to be instrumented, have shown that the use of these statistics often leads to unreliable results. One remedy for this problem is the use of identification-robust tests [Anderson and Rubin (1949), Moreira (2002), Kleibergen (2003), Dufour and Taamouti (2007)]. However, there is no econometric literature on the quality of identification-robust procedures when the available instruments are endogenous, or both endogenous and weak. This raises the question of what happens to identification-robust inference procedures when some instrumental variables assumed to be exogenous are not in fact exogenous. More precisely, what happens if an invalid instrumental variable is added to a set of valid instruments? Do these procedures behave differently? And if the endogeneity of instrumental variables poses major difficulties for statistical inference, can one propose test procedures that select instruments when they are both strong and valid? Is it possible to propose instrument-selection procedures that remain valid even in the presence of weak identification? This thesis focuses on structural models (simultaneous equations models) and answers these questions through four essays. The first essay is published in the Journal of Statistical Planning and Inference 138 (2008) 2649-2661. In this essay, we analyse the effects of instrument endogeneity on two identification-robust test statistics, the Anderson and Rubin statistic (AR, 1949) and the Kleibergen statistic (K, 2003), with or without weak instruments. First, when the parameter controlling instrument endogeneity is fixed (does not depend on the sample size), we show that all these procedures are in general consistent against the presence of invalid instruments (that is, they detect the presence of invalid instruments) regardless of instrument quality (strong or weak). We also describe cases in which this consistency may fail to hold, but where the asymptotic distribution is modified in a way that could lead to size distortions even in large samples. This includes, in particular, cases in which the two-stage least squares estimator remains consistent but the tests are asymptotically invalid.
Second, when the instruments are locally exogenous (that is, the endogeneity parameter converges to zero as the sample size increases), we show that these tests converge to noncentral chi-square distributions, whether the instruments are strong or weak. We also characterize the situations in which the noncentrality parameter is zero and the asymptotic distribution of the statistics remains the same as in the case of valid instruments (despite the presence of invalid instruments). The second essay studies the impact of weak instruments on specification tests of the Durbin-Wu-Hausman (DWH) type as well as on the Revankar and Hartley (1973) test. We provide a finite-sample and large-sample analysis of the distribution of these tests under the null hypothesis (size) and under the alternative (power), including cases where identification is deficient or weak (weak instruments). Our finite-sample analysis provides several insights as well as extensions of earlier procedures. In particular, the characterization of the finite-sample distribution of these statistics allows the construction of exact Monte Carlo tests of exogeneity even with non-Gaussian errors. We show that these tests are typically robust to weak instruments (the size is controlled). Moreover, we provide a characterization of the power of the tests that clearly exhibits the factors that determine power. We show that the tests have no power when all instruments are weak [similar to Guggenberger (2008)]. However, power exists as long as at least one instrument is strong. The conclusion of Guggenberger (2008) concerns the case where all instruments are weak (a case of minor practical interest). Our asymptotic theory under weakened assumptions confirms the finite-sample theory. Furthermore, we present a Monte Carlo analysis indicating that: (1) the ordinary least squares estimator is more efficient than two-stage least squares when the instruments are weak and the endogeneity moderate [a conclusion similar to that of Kiviet and Niemczyk (2007)]; (2) pre-test estimators based on exogeneity tests perform very well relative to two-stage least squares. This suggests that the instrumental variables method should be applied only when one is confident of having strong instruments. The conclusions of Guggenberger (2008) are therefore mixed and could be misleading. We illustrate our theoretical results through simulation experiments and two empirical applications: the relation between trade openness and economic growth, and the well-known problem of returns to education. The third essay extends the Wald-type exogeneity test proposed by Dufour (1987) to cases where the regression errors have a non-normal distribution. We propose a new version of the earlier test that is valid even in the presence of non-Gaussian errors. Unlike the usual exogeneity test procedures (the Durbin-Wu-Hausman and Revankar-Hartley tests), the Wald test makes it possible to address a common problem in empirical work, namely testing the partial exogeneity of a subset of variables.
We propose two new pre-test estimators based on the Wald test that perform better (in terms of mean squared error) than the usual IV estimator when the instrumental variables are weak and the endogeneity moderate. We also show that this test can serve as an instrument-selection procedure. We illustrate the theoretical results with two empirical applications: the well-known wage equation model [Angrist and Krueger (1991, 1999)] and returns to scale [Nerlove (1963)]. Our results suggest that the mother's education explains her son's dropping out of school, that output is an endogenous variable in the estimation of the firm's cost function, and that the price of fuel is a valid instrument for output. The fourth essay resolves two very important problems in the econometric literature. First, although the initial or extended Wald test allows the construction of confidence regions and tests of linear restrictions on covariances, it assumes that the model parameters are identified. When identification is weak (instruments weakly correlated with the variable to be instrumented), this test is in general no longer valid. This essay develops an identification-robust (weak-instrument-robust) inference procedure for constructing confidence regions for the matrix of covariances between the regression errors and the (possibly endogenous) explanatory variables. We provide analytical expressions for the confidence regions and characterize the necessary and sufficient conditions under which they are bounded. The proposed procedure remains valid even in small samples and is also asymptotically robust to heteroskedasticity and autocorrelation of the errors. The results are then used to develop identification-robust tests of partial exogeneity. Monte Carlo simulations indicate that these tests control the size and have power even when the instruments are weak. This allows us to propose an instrument-selection procedure that is valid even when there is an identification problem. The instrument-selection procedure is based on two new pre-test estimators that combine the usual IV estimator and partial IV estimators. Our simulations show that: (1) just like the ordinary least squares estimator, the partial IV estimators are more efficient than the usual IV estimator when the instruments are weak and the endogeneity moderate; (2) the pre-test estimators perform very well overall compared with the usual IV estimator. We illustrate our theoretical results with two empirical applications: the relation between trade openness and economic growth, and the returns-to-education model. In the first application, earlier studies concluded that the instruments were not too weak [Dufour and Taamouti (2007)], whereas they are very weak in the second [Bound (1995), Doko and Dufour (2009)]. Consistent with our theoretical results, we find unbounded confidence regions for the covariance in the case where the instruments are quite weak.
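As a concrete reference point for the identification-robust procedures discussed in these essays, here is a minimal Python sketch of the Anderson-Rubin test for a single endogenous regressor with no included exogenous variables; the simulated design (weak first stage, moderate endogeneity) and all variable names are illustrative, not taken from the thesis.

    import numpy as np
    from scipy import stats

    def anderson_rubin(y, Y, Z, beta0):
        """AR test of H0: beta = beta0 in y = Y*beta + u, with instruments Z
        (single endogenous regressor, no included exogenous variables).
        Under H0 with exogenous instruments, AR ~ F(k, n - k) regardless of
        instrument strength."""
        n, k = Z.shape
        u0 = y - Y * beta0                              # residuals under H0
        proj = Z @ np.linalg.solve(Z.T @ Z, Z.T @ u0)   # projection of u0 onto span(Z)
        ar = (u0 @ proj / k) / ((u0 @ u0 - u0 @ proj) / (n - k))
        return ar, stats.f.sf(ar, k, n - k)

    # Simulated design: weak first stage, moderately endogenous regressor.
    rng = np.random.default_rng(0)
    n, k, beta = 500, 3, 1.0
    Z = rng.standard_normal((n, k))
    v = rng.standard_normal(n)
    u = 0.8 * v + rng.standard_normal(n)        # structural error correlated with Y
    Y = Z @ np.full(k, 0.05) + v                # instruments only weakly relevant
    y = Y * beta + u
    print(anderson_rubin(y, Y, Z, beta0=1.0))   # size remains controlled under H0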

Relevance: 30.00%

Abstract:

Diabetes mellitus is a heterogeneous metabolic disorder characterized by hyperglycemia with disturbances in carbohydrate, protein and lipid metabolism resulting from defects in insulin secretion, insulin action or both. There are currently 387 million people with diabetes worldwide, and the disease is expected to affect 592 million people by 2035. Insulin resistance in peripheral tissues and pancreatic beta cell dysfunction are the major challenges in the pathophysiology of diabetes. Diabetic secondary complications (such as liver cirrhosis, retinopathy, and microvascular and macrovascular complications), which arise from persistent hyperglycemia and dyslipidemia, can be disabling or even life threatening. Current medications are effective for the control and management of hyperglycemia, but undesirable effects, inefficacy against secondary complications and high cost remain serious issues in the present prognosis of this disorder. Hence the search for more effective and safer therapeutic agents of natural origin is highly demanding and attracts attention in current drug discovery research. The data available from Ayurveda on various medicinal plants for the treatment of diabetes can efficiently yield potential new leads as antidiabetic agents. For wider acceptability and popularity of the herbal remedies available in Ayurveda, scientific validation through elucidation of the mechanism of action is essential. Modern biological techniques are now available to elucidate the biochemical basis of the effectiveness of these medicinal plants. With this in mind, the research programme of this thesis was planned to evaluate the molecular mechanism responsible for the antidiabetic property of Symplocos cochinchinensis, the main ingredient of Nishakathakadi Kashayam, a well-known Ayurvedic antidiabetic preparation. A general introduction to diabetes, its pathophysiology, secondary complications and current treatment options, together with innovative solutions based on phytomedicine, is given in Chapter 1. The effect of Symplocos cochinchinensis (SC) on various in vitro biochemical targets relevant to diabetes is described in Chapter 2, including the preparation of the plant extract. Since diabetes is a multifactorial disease, the ethanolic extract of the bark of SC (SCE) and its fractions (hexane, dichloromethane, ethyl acetate and 90% ethanol) were evaluated by in vitro methods against multiple targets such as control of postprandial hyperglycemia, insulin resistance, oxidative stress, pancreatic beta cell proliferation, inhibition of protein glycation, protein tyrosine phosphatase-1B (PTP-1B) and dipeptidyl peptidase-IV (DPP-IV). Among the extracts, SCE exhibited comparatively better activity, including alpha glucosidase inhibition, insulin dependent glucose uptake (3-fold increase) in L6 myotubes, pancreatic beta cell regeneration in RIN-m5F cells, reduced triglyceride accumulation in 3T3-L1 cells, and protection from hyperglycemia-induced generation of reactive oxygen species in HepG2 cells, with moderate antiglycation and PTP-1B inhibition. Chemical characterization by HPLC revealed the superiority of SCE over the other extracts due to the presence of bioactives (beta-sitosterol, phloretin 2'-glucoside, oleanolic acid) in addition to minerals such as magnesium, calcium, potassium, sodium, zinc and manganese. SCE was therefore subjected to an oral sucrose tolerance test (OGTT) to evaluate its antihyperglycemic property in mild diabetic and diabetic animal models.
SCE showed significant antihyperglycemic activity in the in vivo diabetic models. Chapter 3 highlights the beneficial effects of the hydroethanol extract of Symplocos cochinchinensis (SCE) against hyperglycemia-associated secondary complications in a streptozotocin (60 mg/kg body weight) induced diabetic rat model. Approval for all the animal experiments was obtained from the CSIR-CDRI institutional animal ethics committee. The experimental groups consisted of normal control (NC), N + SCE 500 mg/kg bwd, diabetic control (DC), D + metformin 100 mg/kg bwd, D + SCE 250 and D + SCE 500. SCEs and metformin were administered daily for 21 days and the animals were sacrificed on day 22. Oral glucose tolerance, plasma insulin, % HbA1c, urea, creatinine, aspartate aminotransferase (AST), alanine aminotransferase (ALT), albumin, total protein, etc. were analysed. Aldose reductase (AR) activity in the eye lens was also checked. On day 21, DC rats showed a significantly abnormal glucose response, HOMA-IR and % HbA1c, decreased activity of antioxidant enzymes and GSH, elevated AR activity, and increased hepatic and renal oxidative stress markers compared to NC. DC rats also exhibited increased levels of plasma urea and creatinine. Treatment with SCE protected against the deleterious alterations of biochemical parameters in a dose dependent manner, including histopathological alterations in the pancreas. SCE 500 exhibited a significant glucose lowering effect and decreased HOMA-IR, % HbA1c, lens AR activity, and hepatic and renal oxidative stress and function markers compared to the DC group. A considerable amount of liver and muscle glycogen was replenished by SCE treatment in diabetic animals. Although metformin showed a better effect, the activity of SCE was very much comparable to that of this drug. The possible molecular mechanism behind the protective property of S. cochinchinensis against insulin resistance in peripheral tissues as well as dyslipidemia in an in vivo high fructose saturated fat diet model is described in Chapter 4. Initially, animals were fed a high fructose saturated fat (HFS) diet for a period of 8 weeks to develop insulin resistance and dyslipidemia. The experimental groups were normal diet control (ND), ND + SCE 500 mg/kg bwd, high fructose saturated fat diet control (HFS), HFS + metformin 100 mg/kg bwd, HFS + SCE 250 and HFS + SCE 500. SCEs and metformin were administered daily for the next 3 weeks and the animals were sacrificed at the end of the 11th week. At the end of week 11, HFS rats showed significantly abnormal glucose and insulin tolerance, HOMA-IR, % HbA1c, adiponectin, lipid profile, liver glycolytic and gluconeogenic enzyme activities, and liver and muscle triglyceride accumulation compared to ND. HFS rats also exhibited increased levels of plasma inflammatory cytokines and upregulated mRNA levels of gluconeogenic and lipogenic genes in the liver. HFS rats showed increased expression of GLUT-2 in the liver and decreased expression of GLUT-4 in muscle and adipose tissue. SCE treatment preserved the architecture of the pancreas, liver and kidney tissues. Treatment with SCE reversed the alterations of biochemical parameters and improved insulin sensitivity by modifying gene expression in liver, muscle and adipose tissues. Overall, the results suggest that SC mediates its antidiabetic activity mainly via alpha glucosidase inhibition and improved insulin sensitivity, together with antiglycation and antioxidant activities.