987 results for Polyharmonic order of precision
Abstract:
Abstract Part I: Background: Isolated lung perfusion (ILP) was designed for the treatment of loco-regional malignancies of the lung. In contrast to intravenous (IV) drug application, ILP allows for a selective administration of cytostatic agents such as doxorubicin to the lung while sparing non-affected tissues. However, the clinical results with ILP were disappointing. Doxorubicin-based ILP on sarcoma-bearing rodent lungs suggested high overall doxorubicin concentrations within the perfused lung but poor penetration of the cytostatic agent into tumors. The same holds true for liposome-encapsulated macromolecular doxorubicin (Liporubicin™). In specific conditions, low-dose photodynamic therapy (PDT) can enhance the distribution of macromolecules across the endothelial barrier in solid tumors. It was recently postulated that tumor neovessels are more responsive to PDT than the normal vasculature. We therefore hypothesized that Visudyne®-mediated PDT could selectively increase liposomal doxorubicin (Liporubicin™) uptake in sarcoma tumors of rodent lungs during intravenous (IV) drug administration and isolated lung perfusion (ILP). Material and Methods: A sarcoma tumor was generated in the left lung of Fischer rats by subpleural injection of a sarcoma cell suspension via thoracotomy. Ten days later, Liporubicin™ was administered IV or by single-pass antegrade ILP, with or without Visudyne®-mediated low-dose PDT pre-treatment of the sarcoma-bearing lung. The drug concentration and distribution were assessed separately in tumors and lung tissues by high-pressure liquid chromatography (HPLC) and fluorescence microscopy (FM), respectively. Results: PDT pre-treatment before IV Liporubicin™ administration resulted in a significantly higher tumor drug uptake and tumor-to-lung drug ratio compared to IV drug injection alone, without affecting the blood flow and drug distribution in the lung. PDT pre-treatment before Liporubicin™-based ILP also resulted in a higher tumor drug uptake and a higher tumor-to-lung drug ratio compared to ILP alone; however, these differences were not significant due to a heterogeneous blood flow and drug distribution during ILP, which was further accentuated by PDT. Conclusions: Low-dose Visudyne®-mediated PDT pre-treatment has the potential to selectively enhance liposome-encapsulated doxorubicin uptake in tumors, but not in normal lung tissue, after IV drug application in a rat model of sarcoma tumors of the lung, which opens new perspectives for the treatment of superficially spreading chemoresistant tumors of the chest cavity such as mesothelioma or malignant effusion. However, the impact of PDT on macromolecular drug uptake during ILP is limited since its therapeutic advantage is circumvented by ILP-induced heterogeneity of blood flow and drug distribution. Abstract Part II: Background: Photodynamic therapy (PDT) with Visudyne® acts by direct cellular phototoxicity and/or by an indirect vascular-mediated effect. Here, we demonstrate that the interruption of vessel integrity by PDT can promote the extravasation of a macromolecular agent in normal tissue. To obtain extravasation in normal tissue, the PDT conditions were one order of magnitude more intense than those reported in the literature for tissue containing neovessels.
Material and Methods: Fluorescein isothiocyanate dextran (FITC-D, 2000 kDa), a macromolecular agent, was intravenously injected either 10 minutes before (LK0 group, n = 14) or 2 hours after (LK2 group, n = 16) Visudyne®-mediated PDT in nude mice bearing a dorsal skinfold chamber. Control animals had no PDT (CTRL group, n = 8). The extravasation of FITC-D from blood vessels in striated muscle tissue was observed in both groups in real time for up to 2500 seconds after injection. We also monitored PDT-induced leukocyte rolling in vivo and assessed, by histology, the corresponding inflammatory reaction score in the dorsal skinfold chambers. Results: In all animals, at the applied PDT conditions, FITC-D extravasation was significantly enhanced in the PDT-treated areas as compared to the surrounding non-treated areas (p < 0.0001). There was no FITC-D leakage in the control animals. Animals from the LK0 group had significantly less FITC-D extravasation than those from the LK2 group (p = 0.0002). In the LK0 group, FITC-D leakage correlated significantly with the inflammation (p < 0.001). Conclusions: At the selected conditions, Visudyne-mediated PDT promotes vascular leakage and FITC-D extravasation into the interstitial space of normal tissue. The intensity of vascular leakage depends on the time interval between PDT and FITC-D injection. This concept could be used to locally modulate the delivery of macromolecules in vivo. Résumé: Isolated cytostatic lung perfusion allows a selective administration of cytostatic agents, without involvement of the systemic circulation, with high accumulation in the lung but poor penetration into tumors. Photodynamic therapy (PDT), which consists of the application of a photosensitizer activated by non-thermal laser light of a defined wavelength, allows, under certain conditions, an increased penetration of macromolecular cytostatic agents across the tumor endothelial barrier. We explored this therapeutic advantage of PDT in an experimental model in order to selectively increase the tumor penetration of pegylated, liposome-encapsulated macromolecular doxorubicin (Liporubicin). A sarcoma tumor was generated in the rodent lung, followed by the administration of Liporubicin either intravenously or by isolated lung perfusion (ILP). Some of the animals received PDT pre-treatment of the tumor and the underlying lung, with Visudyne as photosensitizer. The results demonstrated that PDT allows, under certain conditions, a selective increase of Liporubicin in tumors but not in the underlying lung parenchyma. After intravenous administration of Liporubicin with PDT pre-treatment, the accumulation in tumors was significant compared to the lung and to tumors without PDT. The same phenomenon was observed after ILP of the lung; however, the differences with or without PDT were not significant, owing to a heterogeneous distribution of Liporubicin in the perfused lung after ILP. In a second part of the experiments, we used intravital microscopy to determine the extravasation of macromolecular substances (FITC-D) across the endothelial barrier, with or without Visudyne-PDT, in dorsal skinfold chambers of nude mice. The results show that after PDT, the extravasation of FITC-D was significantly increased compared to non-treated tissue.
The intensity of FITC-D extravasation also depended on the interval between PDT and FITC-D injection. In conclusion, the experiments show that PDT is capable, under certain conditions, of significantly increasing the extravasation of macromolecules across the endothelial barrier and their accumulation in tumors, but not in the lung parenchyma. These results open a new treatment perspective for superficial chemoresistant intrathoracic tumors such as malignant pleural effusion or pleural mesothelioma.
Abstract:
We test the real interest rate parity hypothesis using data for the G7 countries over the period 1970-2008. Our contribution is twofold. First, we utilize the ARDL bounds approach of Pesaran et al. (2001), which allows us to overcome uncertainty about the order of integration of real interest rates. Second, we test for structural breaks in the underlying relationship using the multiple structural breaks test of Bai and Perron (1998, 2003). Our results indicate significant parameter instability and suggest that, despite the advances in economic and financial integration, real interest rate parity has not fully recovered from a breakdown in the 1980s.
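As a rough illustration of the bounds-testing step, the sketch below fits an unrestricted error-correction model and runs a Pesaran-type bounds test with statsmodels. The data file, column names, and lag orders are assumptions for illustration, not the paper's exact specification.

```python
# Minimal sketch of an ARDL bounds test, assuming a CSV of real interest rates.
import pandas as pd
from statsmodels.tsa.ardl import UECM

df = pd.read_csv("real_rates.csv", index_col=0, parse_dates=True)  # hypothetical file

# Unrestricted error-correction form of an ARDL(2, 2): domestic rate on foreign rate.
uecm = UECM(df["r_domestic"], lags=2, exog=df[["r_foreign"]], order=2)
res = uecm.fit()

# Case 3: unrestricted constant, no trend. The F-statistic is compared against
# both the I(0) and I(1) critical-value bounds, so no pretest of the order of
# integration is needed -- the property the paper exploits.
bounds = res.bounds_test(case=3)
print(bounds.stat, bounds.crit_vals)
```

The Bai-Perron multiple-break test has no direct statsmodels equivalent; dedicated changepoint packages (for example, ruptures) are a common stand-in for locating the breaks before testing subsamples.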
Abstract:
In line with global changes, the UK regulatory regime for audit and corporate governance has changed significantly since the Enron scandal, with an increased role for audit committees and independent inspection of audit firms. UK listed company chief financial officers (CFOs), audit committee chairs (ACCs) and audit partners (APs) were surveyed in 2007 to obtain views on the impact of 36 economic and regulatory factors on audit quality. 498 usable responses were received, representing a response rate of 36%. All groups rated various audit committee interactions with auditors among the factors most enhancing audit quality. Exploratory factor analysis reduces the 36 factors to nine uncorrelated dimensions. In order of extraction, these are: economic risk; audit committee activities; risk of regulatory action; audit firm ethics; economic independence of auditor; audit partner rotation; risk of client loss; audit firm size; and, lastly, International Standards on Auditing (ISAs) and audit inspection. In addition to the activities of the audit committee, risk factors for the auditor (both economic and certain regulatory risks) are believed to most enhance audit quality. However, ISAs and the audit inspection regime, aspects of the ‘standards-surveillance compliance’ regulatory system, are viewed as less effective. Respondents commented that aspects of the changed regime are largely process and compliance driven, with high costs for limited benefits, supporting psychological bias regulation theory that claims there is overconfidence that a useful regulatory intervention exists.
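For readers who want to reproduce the flavour of the dimension-reduction step, here is a minimal sketch of an exploratory factor analysis with an orthogonal (varimax) rotation, which yields uncorrelated dimensions as in the paper. The file and column layout are hypothetical.

```python
# Sketch: reduce 36 survey ratings to nine orthogonal dimensions.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

ratings = pd.read_csv("audit_quality_survey.csv")  # hypothetical: 498 respondents x 36 items

fa = FactorAnalysis(n_components=9, rotation="varimax", random_state=0)
scores = fa.fit_transform(ratings.values)          # respondent scores on each dimension

# Loadings show which of the 36 items define each extracted dimension.
loadings = pd.DataFrame(fa.components_.T, index=ratings.columns)
print(loadings.round(2))
```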
Abstract:
We re-examine the dynamics of returns and dividend growth within the present-value framework of stock prices. We find that the finite-sample order of integration of returns is approximately equal to the order of integration of the first-differenced price-dividend ratio. As such, the traditional return-forecasting regressions based on the price-dividend ratio are invalid. Moreover, the nonstationary long-memory behaviour of the price-dividend ratio induces antipersistence in returns. This suggests that expected returns should be modelled as an ARFIMA process, and we show that this improves the forecasting ability of the present-value model both in-sample and out-of-sample.
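A standard way to estimate the order of integration discussed above is the Geweke-Porter-Hudak (GPH) log-periodogram regression. The sketch below is a minimal implementation, with the bandwidth exponent chosen by assumption.

```python
# Sketch: GPH semiparametric estimate of the fractional integration order d.
import numpy as np

def gph_d(x, bandwidth_exp=0.5):
    """Estimate d by regressing the log periodogram on log(4 sin^2(w/2))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** bandwidth_exp)              # number of low frequencies used (assumed rule)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the first m Fourier frequencies.
    fft = np.fft.fft(x - x.mean())
    I = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(I), 1)[0]
    return -slope                             # slope of the regression is -d

# Antipersistence in returns would show up as an estimated d < 0.
```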
Abstract:
Recently, Revil & Florsch proposed a novel mechanistic model based on the polarization of the Stern layer relating the permeability of granular media to their spectral induced polarization (SIP) characteristics based on the formation of polarized cells around individual grains. To explore the practical validity of this model, we compare it to pertinent laboratory measurements on samples of quartz sands with a wide range of granulometric characteristics. In particular, we measure the hydraulic and SIP characteristics of all samples both in their loose, non-compacted and compacted states, which might allow for the detection of polarization processes that are independent of the grain size. We first verify the underlying grain size/permeability relationship upon which the model of Revil & Florsch is based and then proceed to compare the observed and predicted permeability values for our samples by substituting the grain size characteristics by corresponding SIP parameters, notably the so-called Cole-Cole time constant. In doing so, we also asses the quantitative impact of an observed shift in the Cole-Cole time constant related to textural variations in the samples and observe that changes related to the compaction of the samples are not relevant for the corresponding permeability predictions. We find that the proposed model does indeed provide an adequate prediction of the overall trend of the observed permeability values, but underestimates their actual values by approximately one order-of-magnitude. This discrepancy in turn points to the potential importance of phenomena, which are currently not accounted for in the model and which tend to reduce the characteristic size of the prevailing polarization cells compared to the considered model, such as, for example, membrane polarization, contacts of double-layers of neighbouring grains, and incorrect estimation of the size of the polarized cells because of the irregularity of natural sand grains.
Abstract:
This dissertation focuses on the practice of regulatory governance, through the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetico-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible to make scientific inferences and general conclusions to a certain extent, according to a Bayesian conception of knowledge, in order to update the prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). Following an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration have hardly led to the crumbling of the state, but have instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996). Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated from political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up to regulate very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy.
On the other hand, some hard questions are still unaddressed about their role as political actors (together with regulatory competencies, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions) and about their performance.
Abstract:
Recent technological advances in remote sensing have enabled investigation of the morphodynamics and hydrodynamics of large rivers. However, measuring topography and flow in these very large rivers is time consuming and thus often constrains the spatial resolution and reach-length scales that can be monitored. Similar constraints exist for computational fluid dynamics (CFD) studies of large rivers, requiring maximization of mesh- or grid-cell dimensions and implying a reduction in the representation of bedform-roughness elements that are of the order of a model grid cell or less, even if they are represented in available topographic data. These "subgrid" elements must be parameterized, and this paper applies and considers the impact of roughness-length treatments that include the effect of bed roughness due to "unmeasured" topography. CFD predictions were found to be sensitive to the roughness-length specification. Model optimization was based on acoustic Doppler current profiler measurements and estimates of the water-surface slope for a variety of roughness lengths. This proved difficult, as the metrics used to assess optimal model performance diverged due to the effects of large bedforms that are not well parameterized in roughness-length treatments. However, the general spatial flow patterns are effectively predicted by the model. Changes in roughness length were shown to have a major impact upon flow routing at the channel scale. The results also indicate an absence of secondary flow circulation cells in the reach studied, and suggest that simpler two-dimensional models may have great utility in the investigation of flow within large rivers. Citation: Sandbach, S. D. et al. (2012), Application of a roughness-length representation to parameterize energy loss in 3-D numerical simulations of large rivers, Water Resour. Res., 48, W12501, doi:10.1029/2011WR011284.
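To make the roughness-length idea concrete, here is a minimal sketch of the log-law wall treatment that such parameterizations build on, with unresolved bedform topography folded into z0 through an assumed closure on the subgrid bed-elevation standard deviation. The constants are illustrative, not those used in the study.

```python
# Sketch: log-law velocity with a roughness length derived from "unmeasured"
# subgrid topography. The sigma_z/10 closure is an assumption for illustration.
import numpy as np

KAPPA = 0.41  # von Karman constant

def log_law_velocity(z, u_star, z0):
    """Velocity (m/s) at height z (m) above the bed, u(z) = (u*/kappa) ln(z/z0)."""
    return (u_star / KAPPA) * np.log(z / z0)

# Treat unresolved bedforms as roughness: z0 from the standard deviation of
# subgrid bed elevations (hypothetical value and closure).
sigma_z = 0.15                 # m, subgrid bed-elevation standard deviation
z0 = sigma_z / 10.0            # assumed closure
print(log_law_velocity(z=2.0, u_star=0.08, z0=z0))
```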
Abstract:
The epithelial amiloride-sensitive sodium channel (ENaC) controls transepithelial Na+ movement in Na(+)-transporting epithelia and is associated with Liddle syndrome, an autosomal dominant form of salt-sensitive hypertension. Detailed analysis of ENaC channel properties and of the functional consequences of mutations causing Liddle syndrome has so far been limited by the lack of a method allowing specific and quantitative detection of cell-surface-expressed ENaC. We have developed a quantitative assay based on the binding of 125I-labeled M2 anti-FLAG monoclonal antibody (M2Ab*) directed against a FLAG reporter epitope introduced in the extracellular loop of each of the alpha, beta, and gamma ENaC subunits. Insertion of the FLAG epitope into ENaC sequences did not change its functional and pharmacological properties. The binding specificity and affinity (Kd = 3 nM) allowed us to correlate, in individual Xenopus oocytes, the macroscopic amiloride-sensitive sodium current (INa) with the number of ENaC wild-type and mutant subunits expressed at the cell surface. These experiments demonstrate that: (i) only heteromultimeric channels made of alpha, beta, and gamma ENaC subunits are maximally and efficiently expressed at the cell surface; (ii) the overall ENaC open probability is one order of magnitude lower than previously observed in single-channel recordings; (iii) the mutation causing Liddle syndrome (beta R564stop) enhances channel activity by two mechanisms, i.e., by increasing ENaC cell-surface expression and by changing channel open probability. This quantitative approach provides new insights into the molecular mechanisms underlying one form of salt-sensitive hypertension.
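As an illustration of the quantitative binding assay, the sketch below fits a one-site saturation binding model to recover Kd and Bmax; the data points are hypothetical, chosen only to resemble a Kd in the low-nanomolar range reported above.

```python
# Sketch: Kd/Bmax estimation from a saturation binding curve (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit

def specific_binding(L, bmax, kd):
    """One-site binding: B = Bmax * [L] / (Kd + [L])."""
    return bmax * L / (kd + L)

conc = np.array([0.3, 1.0, 3.0, 10.0, 30.0])   # nM free antibody
bound = np.array([0.9, 2.4, 4.4, 6.2, 7.0])    # fmol/oocyte (made-up values)

(bmax, kd), _ = curve_fit(specific_binding, conc, bound, p0=(8.0, 3.0))
print(f"Bmax = {bmax:.2f} fmol/oocyte, Kd = {kd:.2f} nM")
```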
Abstract:
As pyrethroids are presently the favored group of insecticides for the control of triatomines, we performed a series of bioassays to determine the intrinsic activity of some of the main compounds used in the control campaigns against five of the main species of triatomines to be controlled. Comparing the insecticides, lambdacyhalothrin was more effective than the other three pyrethroids, considering both the LD50 and the LD99, for all three species with comparable results. On Triatoma infestans, the LD50 of lambdacyhalothrin was followed by those of alfacypermethrin, cyfluthrin, and deltamethrin. On Rhodnius prolixus the sequence, in decreasing order of activity, was lambdacyhalothrin, alfacypermethrin, deltamethrin, and cyfluthrin. Some modifications appear when we compare the LD99, which is more relevant to what happens in the field. T. brasiliensis proved to be as susceptible to lambdacyhalothrin as T. infestans, the species most susceptible to this product. On the other hand, T. sordida is the least susceptible considering the LD99 of this insecticide.
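LD50 and LD99 values of this kind come from probit dose-response analysis; the sketch below shows such a fit on hypothetical topical bioassay data.

```python
# Sketch: probit regression for LD50/LD99 on hypothetical bioassay counts.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0])   # ng insecticide per insect (made up)
n = np.array([50, 50, 50, 50, 50])            # insects treated per dose
dead = np.array([4, 12, 27, 41, 49])          # observed mortality

X = sm.add_constant(np.log10(dose))
model = sm.GLM(np.column_stack([dead, n - dead]), X,
               family=sm.families.Binomial(link=sm.families.links.Probit()))
fit = model.fit()
b0, b1 = fit.params

# Invert the fitted probit line to get the dose killing a fraction p.
ld = lambda p: 10 ** ((norm.ppf(p) - b0) / b1)
print(f"LD50 = {ld(0.50):.2f} ng, LD99 = {ld(0.99):.2f} ng")
```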
Abstract:
The impact of biocontrol strain Pseudomonas fluorescens CHA0 and of its genetically modified, antibiotic-overproducing derivative CHA0/pME3424 on a reconstructed population of the plant-beneficial Sinorhizobium meliloti bacteria was assessed in gnotobiotic systems. In sterile soil, the final density of the reconstructed S. meliloti population decreased by more than one order of magnitude in the presence of either of the Pseudomonas strains when compared to a control without addition of P. fluorescens. Moreover, there was a change in the proportion of each individual S. meliloti strain within the population. Plant tests also revealed changes in the nodulating S. meliloti population in the presence of strains CHA0 or CHA0/pME3424. In both treatments one S. meliloti strain, f43, was significantly reduced in its root nodule occupancy. Analysis of alfalfa yields showed a slight but statistically significant increase in shoot dry weight when strain CHA0 was added to the reconstructed S. meliloti population whereas no such effect was observed with CHA0/pME3424.
Abstract:
Although the sensitivity to light of thioridazine and its metabolites has been described, the problem does not seem to be widely acknowledged. Indeed, a survey of the literature shows that assays of these compounds under light-protected conditions have been performed in only a few of the numerous analytical studies on this drug. In the present study, thioridazine, its metabolites, and 18 other neuroleptics were tested for their sensitivity to light under conditions used for their analysis. The results show that light significantly affects the analysis of thioridazine and its metabolites. It readily causes the racemization of the isomeric pairs of thioridazine 5-sulphoxide and greatly decreases the concentration of thioridazine. This sensitivity to light varied with the medium used (most sensitive in acidic media) and also with the molecule (in order of decreasing sensitivity: thioridazine > mesoridazine > sulforidazine). Degradation in neutral or basic media was slow, with the exception of mesoridazine in a neutral medium. Twelve other phenothiazines tested, as well as chlorprothixene, a thioxanthene drug, were found to be sensitive to light in acidic media, whereas flupenthixol and zuclopenthixol (two thioxanthenes), clozapine, fluperlapine, and haloperidol (a butyrophenone) did not seem to be affected. In addition to being sensitive to light, some compounds may be readily oxidized by peroxide-containing solvents.
Abstract:
Abstract: In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, a fingermark to be compared to a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification consists of the growth of these databases, which makes it possible to find impressions that are very similar yet come from distinct fingers. However, and simultaneously, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be utilized in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. Thus, these data allow an evaluation which may be more detailed than that obtained by the application of rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true. Moreover, they should support this hypothesis more and more strongly with the addition of information in the form of additional minutiae. For the modelling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors which influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence has been shown is the orientation of the minutiae. The results show that the likelihood ratios resulting from the use of the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger, when the impressions in fact came from different fingers, was 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1000, and 10000 for 6, 7, and 8 minutiae when the two impressions come from the same finger.
These likelihood ratios can therefore be an important aid for decision making. Both positive trends linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study. Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found and showed satisfactory results. Résumé: In the field of fingerprints, the rise of computerized tools has made it possible to create powerful automatic search algorithms. These algorithms allow, among other things, a mark to be compared to a database of fingerprints of known source, so that a link between the mark and one of these sources can be established. With the growth of the capacity of these systems and of data-storage potential, as well as increased international collaboration between police services, the size of these databases is growing. The current challenge for the field of fingerprint identification lies in the growth of these databases, which can make it possible to find impressions that are very similar but come from distinct fingers. At the same time, however, these data and systems allow a description, based on large quantities of data, of the variability between different appositions of the same finger and between appositions of different fingers. This statistical description of the within- and between-variability, computed from the minutiae and their relative positions, feeds into a probabilistic approach to interpretation. The computation of a likelihood ratio, which simultaneously involves the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then rely on representative data sets. These data thus lead to a much finer evaluation than one obtained by applying rules established well before the advent of these large databases, or by the specialist's experience alone. The objective of the present thesis is to evaluate likelihood ratios computed from the scores of an automatic system when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true. Moreover, they should support this hypothesis more and more strongly as information is added in the form of additional minutiae. For the modelling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors influencing these distributions were then analysed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. Among all these factors, the orientation of the minutiae is the only one for which no influence was demonstrated in the present study.
The results show that likelihood ratios derived from the use of AFIS scores can be used for evaluative purposes. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that two impressions were left by the same finger when in reality the impressions came from different fingers was 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops first to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios are on average of the order of 100, 1000, and 10000 for 6, 7, and 8 minutiae when the two impressions come from the same finger. These likelihood ratios can therefore provide important support for decision making. The two positive trends linked to the addition of minutiae (a drop in the rates that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study. Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found and showed satisfactory results.
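A minimal sketch of the score-based likelihood-ratio computation evaluated in this thesis: within-finger and between-finger comparison scores feed two density estimates, and the likelihood ratio for a new score is their ratio. The score samples below are synthetic placeholders, not AFIS output.

```python
# Sketch: score-based LR from within- and between-source score distributions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
within_scores = rng.normal(320, 40, size=200)    # same-finger comparison scores (synthetic)
between_scores = rng.normal(120, 30, size=5000)  # different-finger scores (synthetic)

f_within = gaussian_kde(within_scores)
f_between = gaussian_kde(between_scores)

def likelihood_ratio(score):
    """LR = p(score | same finger) / p(score | different fingers)."""
    return float(f_within(score) / f_between(score))

print(likelihood_ratio(280.0))   # large values support the same-source hypothesis
```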
Abstract:
Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count-rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
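The core of such a correction is solving a dead-time model for the true count rate. The sketch below applies Newton's method to the paralyzable model m = n*exp(-n*tau), as a static, single-time-point illustration of the iteration; the dead-time constant is an assumed value, and the published algorithm additionally sweeps this correction over time and detector position.

```python
# Sketch: Newton's method for the paralyzable dead-time model m = n*exp(-n*tau).
import math

def true_rate(m, tau, tol=1e-9, max_iter=100):
    """Solve m = n*exp(-n*tau) for the lower (physical) root n, given observed m."""
    n = m                                       # observed rate as starting guess
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - m          # residual of the model
        fprime = math.exp(-n * tau) * (1.0 - n * tau)
        step = f / fprime
        n -= step
        if abs(step) < tol:
            return n
    raise RuntimeError("Newton iteration did not converge")

# Example: assumed 1 us dead time, observed 150 kcps -> true rate ~180 kcps.
print(true_rate(m=150e3, tau=1e-6))
```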
Abstract:
A recently developed technique, namely multiple-beam interference microscopy, has been applied to investigate the morphology of the parasite Toxoplasma gondii for the first time. The interference pattern obtained from the multiple internal reflections within a T. gondii, sandwiched between a glass plate and a cover plate, was focused on the objective of a conventional microscope. Because of the enhanced contrast, several details of subcellular structure and separating compartments are clearly visible. Details reveal the presence of a nucleus, lipid body, dense granule, rhoptry, and amylopectin. The wall thickness of the membrane of the lipid body and the amylopectin is of the order of 0.02 µm and can be clearly distinguished with the help of the present technique. The same parasite has also been examined with the help of atomic force microscopy, and because of its thick membrane, the inner structural details were not observed at all. Subcellular details of T. gondii observed with the present technique have been reported earlier only by low-magnification transmission electron microscopy and not by any optical microscopic technique.
Abstract:
Gim & Kim (1998) proposed a generalization of Jeong (1982, 1984) reinterpretation of the Hawkins-Simon condition for macroeconomic stability to off-diagonal matrix elements. This generalization is conceptually relevant for it offers a complementary view of interindustry linkages beyond final or net output influence. The extension is completely similar to the 'total flow' idea introduced by Szyrmer (1992) or the 'output-to-output' multiplier of Miller & Blair (2009). However the practical implementation of Gim & Kim is actually faulty since it confuses the appropriate order of output normalization. We provide a new and elementary solution for the correct formalization using standard interindustry accounting concepts.