Abstract:
The detection of Parkinson's disease (PD) in its preclinical stages, prior to outright neurodegeneration, is essential to the development of neuroprotective therapies and could reduce the number of misdiagnosed patients. However, early diagnosis is currently hampered by the lack of reliable biomarkers. Proton ((1)H) magnetic resonance spectroscopy (MRS) offers a noninvasive measure of brain metabolite levels that allows the identification of such potential biomarkers. This study aimed at using MRS on an ultrahigh-field 14.1 T magnet to explore the striatal metabolic changes occurring in two different rat models of the disease. Rats lesioned by the injection of 6-hydroxydopamine (6-OHDA) in the medial forebrain bundle were used to model a complete nigrostriatal lesion, while a genetic model based on the nigral injection of an adeno-associated viral (AAV) vector encoding human α-synuclein was used to model progressive neurodegeneration and dopaminergic neuron dysfunction, thereby replicating conditions closer to the early pathological stages of PD. MRS measurements in the striatum of the 6-OHDA rats revealed significant decreases in glutamate and N-acetyl-aspartate levels and a significant increase in the GABA level in the ipsilateral hemisphere compared with the contralateral one, while the α-synuclein-overexpressing rats showed a significant increase in the striatal GABA level only. We therefore conclude that MRS measurement of striatal GABA levels could allow the detection of early nigrostriatal defects prior to outright neurodegeneration and, as such, offers great potential as a sensitive biomarker of presymptomatic PD.
Abstract:
This note describes how the Kalman filter can be modified to allow for the vector of observables to be a function of lagged variables without increasing the dimension of the state vector in the filter. This is useful in applications where it is desirable to keep the dimension of the state vector low. The modified filter and accompanying code (which nests the standard filter) can be used to compute (i) the steady-state Kalman filter, (ii) the log likelihood of a parameterized state space model conditional on a history of observables, (iii) a smoothed estimate of latent state variables, and (iv) a draw from the distribution of latent states conditional on a history of observables.
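The recursion that the note's modified filter nests can be sketched in a few lines. The function below is a generic linear-Gaussian Kalman filter computing item (ii), the log likelihood; it is only an illustration of the standard filter, not the note's modified version, and all names and the numpy implementation are mine:

```python
import numpy as np

def kalman_loglik(y, A, C, Q, R, x0, P0):
    """Log likelihood of the linear-Gaussian state space model
    x_t = A x_{t-1} + w_t,  y_t = C x_t + v_t,
    with w ~ N(0, Q) and v ~ N(0, R). Standard filter only; the note's
    modification (lagged variables in the observation equation) is not shown.
    """
    x, P = x0, P0
    ll = 0.0
    for yt in y:
        # Prediction step
        x = A @ x
        P = A @ P @ A.T + Q
        # Innovation and its covariance
        e = yt - C @ x
        S = C @ P @ C.T + R
        ll += -0.5 * (len(yt) * np.log(2 * np.pi)
                      + np.log(np.linalg.det(S))
                      + e @ np.linalg.solve(S, e))
        # Update step
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ e
        P = (np.eye(len(x)) - K @ C) @ P
    return ll
```

The same recursion, iterated to convergence of P, yields the steady-state filter of item (i).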
Abstract:
We present the first results of a comparative research programme on the three fishing fleets dedicated to anchoveta extraction in Peruvian waters (industrial steel, industrial wooden, and artisanal), together with their supply chains through to consumer provisioning. The aim of this work is to study the sustainability of the activities involved in protein supply, considering both environmental impacts and socio-economic aspects. A simple scheme was drawn of a pelagic upwelling ecosystem and of the main flows of matter and energy resulting from human exploitation. The scheme represents the Peruvian situation and shows the high level of anthropization of the system, owing to the use of fossil fuels and to the exploitation and technological transformation of terrestrial natural resources (minerals, timber, etc.). It also shows that the exploitation of the Peruvian marine ecosystem has repercussions for the rest of the planet, through the export of fishmeal and fish oil destined mainly for aquaculture. The Peruvian anchoveta fleet is characterized by a wide range of vessel sizes (from 2 to 600 t of hold capacity); intermediate-sized vessels (30-100 t) are the most numerous, but the largest (>300 t) concentrate the greatest fishing power. Analyses of prices and of the distribution of income between crews and shipowners show that, although most of the anchoveta catch is taken by the industrial steel fleet, which is dedicated to fishmeal and fish-oil production and has the highest catch efficiency per crew member, the contribution of the industrial wooden fleet is significant, since it generates more employment per tonne caught and possibly entails no greater energy use.
Artisanal anchoveta fishing is the least efficient in energy terms and per crew member, but it generates far more employment per tonne caught; it accounts for less than 3% of total production, of which only a fraction goes to direct human consumption (DHC). Since 2000, international fishmeal and fish-oil prices have risen, driven by growing Asian demand and by fuel prices. The extent to which this increase discourages domestic consumption of these products, and the use of anchoveta for DHC, remains to be studied. This analysis will need to be validated and complemented with environmental impact information; it could then contribute to participatory decision-making aimed at an optimal balance between the three fleet segments and their associated production chains.
Abstract:
This paper examines whether the introduction of government consumption expenditure in a standard one-good model of the international real business cycle is sufficient to reconcile the theory with the existing pattern of international consumption and output correlations. I calibrate the model to two different pairs of countries and generate the simulated distribution of consumption and output correlations implied by several specifications of the model. It is shown that the model can account for existing international consumption correlations only under very specific assumptions about the size of the effect of government expenditure on agents' utility or the variability of government expenditure shocks. Crucial parameters are identified and the sensitivity of the results discussed.
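The exercise described above — generating a simulated distribution of cross-country correlations — can be illustrated with a deliberately reduced stand-in: a pair of AR(1) processes with cross-correlated innovations rather than the paper's calibrated model. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def simulated_corr_distribution(rho=0.9, shock_corr=0.3, T=100,
                                n_sims=500, seed=0):
    """Distribution of sample cross-country correlations when each
    country's (log) output follows an AR(1) with persistence rho and
    innovations correlated across countries with coefficient shock_corr.
    Illustrative stand-in for a model-implied simulated distribution."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, shock_corr], [shock_corr, 1.0]])
    corrs = np.empty(n_sims)
    for s in range(n_sims):
        eps = rng.multivariate_normal(np.zeros(2), cov, size=T)
        y = np.zeros((T, 2))
        for t in range(1, T):
            y[t] = rho * y[t - 1] + eps[t]
        corrs[s] = np.corrcoef(y[:, 0], y[:, 1])[0, 1]
    return corrs
```

Comparing an observed correlation against such a simulated distribution is the kind of check the abstract's method formalizes within a fully specified model.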
Abstract:
AIMS: While successful termination by pacing of organized atrial tachycardias has been observed in patients, single site rapid pacing has not yet led to conclusive results for the termination of atrial fibrillation (AF). The purpose of this study was to evaluate a novel atrial septal pacing algorithm for the termination of AF in a biophysical model of the human atria. METHODS AND RESULTS: Sustained AF was generated in a model based on human magnetic resonance images and membrane kinetics. Rapid pacing was applied from the septal area following a dual-stage scheme: (i) rapid pacing for 10-30 s at pacing intervals 62-70% of AF cycle length (AFCL), (ii) slow pacing for 1.5 s at 180% AFCL, initiated by a single stimulus at 130% AFCL. Atrial fibrillation termination success rates were computed. A mean success rate for AF termination of 10.2% was obtained for rapid septal pacing only. The addition of the slow pacing phase increased this rate to 20.2%. At an optimal pacing cycle length (64% AFCL) up to 29% of AF termination was observed. CONCLUSION: The proposed septal pacing algorithm could suppress AF reentries in a more robust way than classical single site rapid pacing. Experimental studies are now needed to determine whether similar termination mechanisms and rates can be observed in animals or humans, and in which types of AF this pacing strategy might be most effective.
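The dual-stage scheme lends itself to a small sketch that lays out the stimulus schedule. The AFCL and stage duration below are illustrative placeholders, not values from the study:

```python
def dual_stage_pacing_times(afcl_ms=180.0, rapid_frac=0.64,
                            rapid_duration_ms=20_000.0):
    """Stimulus times (ms) for the two-stage septal pacing protocol
    sketched in the abstract: (i) rapid pacing at a fraction of the AF
    cycle length (AFCL), then a single stimulus at 130% AFCL, then
    (ii) slow pacing at 180% AFCL for 1.5 s. Parameter defaults are
    illustrative only."""
    times = []
    # Stage (i): rapid pacing at rapid_frac * AFCL
    interval = rapid_frac * afcl_ms
    t = 0.0
    while t < rapid_duration_ms:
        times.append(t)
        t += interval
    # Transition: single stimulus at 130% AFCL after the last rapid beat
    t = times[-1] + 1.30 * afcl_ms
    times.append(t)
    # Stage (ii): slow pacing at 180% AFCL for 1.5 s
    end = t + 1_500.0
    t += 1.80 * afcl_ms
    while t < end:
        times.append(t)
        t += 1.80 * afcl_ms
    return times
```

With the default 180 ms AFCL, the optimal 64% rapid-pacing interval quoted in the abstract corresponds to stimuli every 115.2 ms.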
Abstract:
BACKGROUND: Along the chromosome of the obligate intracellular bacterium Protochlamydia amoebophila UWE25, we recently described a genomic island, Pam100G. It contains a tra unit likely involved in conjugative DNA transfer and lgrE, a 5.6-kb gene similar to five others of P. amoebophila: lgrA to lgrD and lgrF. We describe here the structure, regulation and evolution of these proteins, termed LGRs since they are encoded by "Large G+C-Rich" genes. RESULTS: No homologs to the whole protein sequence of the LGRs were found in other organisms. Phylogenetic analyses suggest that the serial duplications producing the six LGRs occurred relatively recently, and nucleotide usage analyses show that lgrB, lgrE and lgrF were relocated on the chromosome. The C-terminal part of the LGRs is homologous to Leucine-Rich Repeat (LRR) domains. Defined by a cumulative alignment score, the 5 to 18 concatenated octacosapeptidic (28-meric) LRRs of the LGRs all present a predicted alpha-helix conformation. Their closest homologs are the 28-residue RI-like LRRs of mammalian NODs and the 24-mers of some Ralstonia and Legionella proteins. Interestingly, lgrE, which is present on Pam100G like the tra operon, exhibits Pfam domains related to DNA metabolism. CONCLUSION: Comparison of the LRRs enables us to propose a parsimonious evolutionary scenario for these domains, driven by adjacent concatenations of LRRs. Our model, established on bacterial LRRs, can be challenged in eukaryotic proteins carrying less conserved LRRs, such as NOD proteins and Toll-like receptors.
Abstract:
The development of statistical models for forensic fingerprint identification purposes has been the subject of increasing research attention in recent years. This can be seen partly as a response to a number of commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. In addition, key forensic identification bodies such as ENFSI [1] and IAI [2] have recently endorsed and acknowledged the potential benefits of using statistical models as an important tool in support of the fingerprint identification process within the ACE-V framework. In this paper, we introduce a new Likelihood Ratio (LR) model based on Support Vector Machines (SVMs) trained with features discovered via morphometric and spatial analyses of corresponding minutiae configurations for both match and close non-match populations often found in AFIS candidate lists. Computed LR values are derived from a probabilistic framework based on SVMs that discover the intrinsic spatial differences between match and close non-match populations. Lastly, experimentation performed on a set of over 120,000 publicly available fingerprint images (mostly sourced from the National Institute of Standards and Technology (NIST) datasets) and a distortion set of approximately 40,000 images is presented, illustrating that the proposed LR model reliably guides toward the correct proposition in the identification assessment of match and close non-match populations. Results further indicate that the proposed model is a promising tool for fingerprint practitioners to use for analysing the spatial consistency of corresponding minutiae configurations.
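The step from a calibrated classifier score to a likelihood ratio can be illustrated generically. The paper's SVM training and feature discovery are not reproduced here — only the standard posterior-odds/prior-odds conversion that any calibrated probabilistic output admits, with the calibration step assumed upstream:

```python
def score_to_lr(p_match, prior_match=0.5):
    """Convert a calibrated posterior probability that a pair of minutiae
    configurations is a true match into a likelihood ratio, by dividing
    posterior odds by prior odds. Generic textbook conversion, not the
    paper's SVM-specific derivation."""
    posterior_odds = p_match / (1.0 - p_match)
    prior_odds = prior_match / (1.0 - prior_match)
    return posterior_odds / prior_odds
```

With equal priors, a calibrated posterior of 0.9 corresponds to an LR of 9 in favour of the match proposition.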
Abstract:
This paper focuses on the switching behaviour of enrolees in the Swiss basic health insurance system. Even though the new Federal Law on Social Health Insurance (LAMal) was implemented in 1996 to promote competition among health insurers in basic insurance, there is limited evidence of premium convergence within cantons. This indicates that competition has not been effective so far, and reveals some inertia among consumers, who seem reluctant to switch to less expensive funds. We investigate one possible barrier to switching behaviour, namely the influence of supplementary insurance. We use survey data on health plan choice (a sample of 1943 individuals whose switching behaviours were observed between 1997 and 2000) as well as administrative data on all insurance companies that operated in the 26 Swiss cantons between 1996 and 2005. The decision to switch and the decision to subscribe to a supplementary contract are jointly estimated. Our findings show that holding a supplementary insurance contract substantially decreases the propensity to switch. However, there is no negative impact of supplementary insurance on switching when the individual assesses his or her health as 'very good'. Our results give empirical support to one possible mechanism through which supplementary insurance might influence switching decisions: given that subscribing to basic and supplementary contracts with two different insurers may induce administrative costs for the subscriber, holding supplementary insurance acts as a barrier to switching if customers who consider themselves 'bad risks' also believe that insurers reject applications for supplementary insurance on these grounds. In comparison with previous research, our main contribution is to offer a possible explanation for consumer inertia. Our analysis illustrates how a consumer's choice of basic health plan interacts with the decision to subscribe to supplementary insurance.
Abstract:
We present a non-equilibrium theory for a system with heat and radiative fluxes. The obtained expression for the entropy production is applied to a simple one-dimensional climate model based on the first law of thermodynamics. In the model, the dissipative fluxes are assumed to be independent variables, following the criteria of Extended Irreversible Thermodynamics (EIT), which enlarges, relative to the classical expression, the applicability of a macroscopic thermodynamic theory to systems far from equilibrium. We analyze the second differential of the classical and the generalized entropy as a criterion for the stability of the steady states. Finally, the extreme state is obtained using variational techniques, observing that the system is close to the maximum dissipation rate.
Abstract:
The processes of approving and implementing the Law on Personal Autonomy and Care for People in Situations of Dependency (LAPAD) have given rise to an intense political and social debate which, coinciding with improvements in service provision and with medical advances, has contributed to a process of classification and labelling based on the deficits of the people in these circumstances. This view erases the subject and his or her singular experience, and conditions the approach taken by models of care. The study seeks to approach older people experiencing loss of functional autonomy by bringing out their voices, which express how they perceive, interpret, cope with and readjust to their new situation. Starting from a constructivist approach grounded in subjectivity, it reviews the models of disability that have prevailed in recent scientific work, the mechanisms for regulating losses put forward by life-span theories, and the contributions made on the resilience model as applied to ageing people. The results of the study show how the representations and meanings that older people attribute to their experience are inscribed in their life trajectories, giving a unique and singular sense to the way they live with and respond to the loss of functional autonomy and its consequences. Those who express a sense of integrity with respect to the life they have lived, with a predominance of positive affect towards themselves and others, and who retain hope and the desire to keep living, adjust to their losses more satisfactorily than those who express distrust and a certain bitterness about their own lives. It follows that spaces for listening and accompaniment can be a valid and necessary resource in which, through speech and narrated testimony, the subject can rethink and re-signify his or her experiences.
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation on a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, enabling it to contribute actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are typical examples of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported with classical FMECA approaches.
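The classical criticality core that the enriched FMECA builds on can be sketched as a small data structure. The severity/occurrence/detection scales are the conventional 1-10 ones, and everything here is a generic illustration rather than the paper's enriched functional model:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1-10 scale
    occurrence: int  # 1-10 scale
    detection: int   # 1-10 scale (10 = hardest to detect)

    @property
    def rpn(self):
        # Classical risk priority number; the enriched FMECA of the
        # abstract layers operational, environmental and human factors
        # on top of this basic inductive ranking.
        return self.severity * self.occurrence * self.detection

def rank_by_criticality(modes):
    """Order failure modes from most to least critical by RPN."""
    return sorted(modes, key=lambda m: m.rpn, reverse=True)
```

Ranking by RPN is what lets a large system's vulnerabilities surface first, which is the role the abstract assigns to the enriched analysis at the global level.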
Abstract:
The liberalization of air transport carried out in the European Union in the early 1990s has had positive effects on traveller welfare. However, there is a consensus in the academic literature that these effects depend on the existence of effective competition at the route level. In this regard, a potential problem arises from the hub advantages enjoyed by the dominant carrier in each domestic market. In addition, the analysis seeks to capture product differentiation as an essential feature of the air transport industry. These questions are examined as follows. First, the main economic aspects that shape competition in air transport are reviewed. Second, an empirical model based on a system of three equations is implemented and estimated using instrumental variables. The sample refers to the year 2001 and covers most routes of the Spanish domestic scheduled-flight market where there is competition. The estimation results show that competitive conditions differ across the market segments that airlines target: price (quality) competition appears to be predominant in the personal-travel (business) segment. Additionally, the dominant carrier's hold over most routes appears to rest on the competitive advantages, in terms of both costs and demand, conferred by its control of the national airport network. From all this it can be inferred that maintaining and/or increasing the benefits of liberalizing air transport services requires extending liberalization to airport use and decentralizing airport management.
Abstract:
In many practical applications the state of field soils is monitored by recording the evolution of temperature and soil moisture at discrete depths. We theoretically investigate the systematic errors that arise when mass and energy balances are computed directly from these measurements. We show that, even with no measurement or model errors, large residuals can result when finite difference approximations are used to compute the fluxes and the storage term. To calculate the limits that spatially discrete measurements set on the accuracy of balance closure, we derive an analytical solution estimating the residual on the basis of two key parameters: the penetration depth and the distance between the measurements. When the thickness of the control layer for which the balance is computed is comparable to the penetration depth of the forcing (which depends on the thermal diffusivity and on the forcing period), large residuals arise. The residual is also very sensitive to the distance between the measurements, which requires accurately controlling the position of the sensors in field experiments. We also demonstrate that, for the same experimental setup, mass residuals are considerably larger than energy residuals owing to the nonlinearity of the moisture transport equation. Our analysis suggests that a careful assessment of the systematic error introduced by the use of spatially discrete data is required before using fluxes and residuals computed directly from field measurements.
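The two key parameters named above admit a compact sketch. The damping-depth formula and the sinusoidal conduction solution below are the standard ones for a homogeneous medium under periodic surface forcing; they are not the abstract's residual estimate itself, and the numerical values used are illustrative:

```python
import math

def penetration_depth(diffusivity, period_s):
    """Damping depth d = sqrt(2*kappa/omega) of a sinusoidal surface
    forcing with period period_s (s), in a homogeneous medium of
    thermal diffusivity kappa (m^2/s)."""
    omega = 2.0 * math.pi / period_s
    return math.sqrt(2.0 * diffusivity / omega)

def temperature(z, t, diffusivity, period_s, mean=0.0, amplitude=1.0):
    """Classical analytical solution for heat conduction under a
    sinusoidal surface forcing:
    T(z, t) = mean + A * exp(-z/d) * sin(omega*t - z/d)."""
    d = penetration_depth(diffusivity, period_s)
    omega = 2.0 * math.pi / period_s
    return mean + amplitude * math.exp(-z / d) * math.sin(omega * t - z / d)
```

For a typical soil thermal diffusivity of 1e-6 m^2/s and a diurnal forcing, the damping depth is on the order of 0.1-0.2 m, which is why control layers of comparable thickness are the problematic case the abstract identifies.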
Abstract:
Toxicokinetic modeling is a useful tool to describe or predict the behavior of a chemical agent in the human or animal organism. A general model based on four compartments was developed in a previous study in order to quantify the effect of human variability on a wide range of biological exposure indicators. The aim of this study was to adapt this existing general toxicokinetic model to three organic solvents — methyl ethyl ketone, 1-methoxy-2-propanol and 1,1,1-trichloroethane — and to take sex differences into account. In a previous human volunteer study we assessed the impact of sex on different biomarkers of exposure corresponding to the three organic solvents mentioned above. Results from that study suggested that not only physiological differences between men and women but also differences in sex hormone levels could influence the toxicokinetics of the solvents. In fact, the use of hormonal contraceptives had an effect on the urinary levels of several biomarkers, suggesting that exogenous sex hormones could influence CYP2E1 enzyme activity. These experimental data were used to calibrate the toxicokinetic models developed in this study. Our results showed that it was possible to use an existing general toxicokinetic model for other compounds. In fact, most of the simulation results showed good agreement with the experimental data obtained for the studied solvents, with the percentage of model predictions lying within the 95% confidence interval varying from 44.4% to 90%. The results pointed out that, for the same exposure conditions, men and women can show important differences in urinary levels of biological indicators of exposure. Moreover, when running the models under simulated industrial working conditions, these differences could be even more pronounced.
In conclusion, a general and simple toxicokinetic model, adapted for three well-known organic solvents, allowed us to show that metabolic parameters can have an important impact on the urinary levels of the corresponding biomarkers. These observations give evidence of an interindividual variability, an aspect that should have its place in approaches for setting occupational exposure limits.
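As a minimal illustration of the compartmental approach — deliberately reduced from the study's four-compartment model to a single compartment with first-order elimination — the concentration time course can be simulated by forward Euler integration. All rate constants and doses below are arbitrary:

```python
def simulate_one_compartment(dose_rate, k_elim, volume, t_end, dt=0.01):
    """Forward-Euler simulation of dC/dt = dose_rate/volume - k_elim*C:
    a one-compartment model with zero-order input (dose_rate, mg/h),
    first-order elimination (k_elim, 1/h) and distribution volume
    (volume, L). Returns the concentration (mg/L) at time t_end (h).
    Illustrative stand-in for the study's four-compartment model."""
    c = 0.0
    n = int(t_end / dt)
    for _ in range(n):
        c += dt * (dose_rate / volume - k_elim * c)
    return c
```

The steady-state concentration is dose_rate / (volume * k_elim), which is the quantity that sex-dependent metabolic parameters such as CYP2E1 activity would shift through k_elim.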
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man-oriented), FMEA (system-oriented), or HAZOP (process-oriented), is not satisfactory. The use of a dynamic modeling approach that allows multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces) that describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the state spaces of the machines, and the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework.
The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.