973 results for sparse matrix technique


Relevance:

20.00%

Abstract:

In the economic literature, information deficiencies and computational complexities have traditionally been addressed through the aggregation of agents and institutions. In input-output modelling, researchers have been interested in the aggregation problem since the beginning of the 1950s. Extending the conventional input-output aggregation approach to social accounting matrix (SAM) models may help to identify the effects caused by the information problems and data deficiencies that usually appear in the SAM framework. This paper develops the theory of aggregation and applies it to the social accounting matrix model of multipliers. First, we define the concept of linear aggregation in a SAM database context. Second, we define the aggregated partitioned matrices of multipliers that are characteristic of the SAM approach. Third, we extend the analysis to other related concepts, such as aggregation bias and consistency in aggregation. Finally, we provide an illustrative example that shows the effects of aggregating a social accounting matrix model.
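The aggregation steps described in the abstract (define an aggregation matrix, aggregate the SAM, compare multipliers) can be sketched numerically. The accounts, transaction values, and injection vector below are hypothetical, and the bias is the standard comparison between aggregating the disaggregated multiplier effects and applying the aggregated model directly; this is an illustration of the concepts, not the paper's own example.

```python
import numpy as np

# Hypothetical 4-account SAM: average expenditure propensities A and totals x.
x = np.array([100.0, 200.0, 150.0, 120.0])
A = np.array([
    [0.10, 0.20, 0.05, 0.00],
    [0.15, 0.05, 0.10, 0.20],
    [0.05, 0.10, 0.00, 0.10],
    [0.10, 0.05, 0.15, 0.05],
])
T = A * x                          # transactions: T[i, j] = A[i, j] * x[j]

M = np.linalg.inv(np.eye(4) - A)   # disaggregated SAM multiplier matrix

# Linear aggregation matrix G: merge the last two accounts into one.
G = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 1],
], dtype=float)

T_a = G @ T @ G.T                  # aggregated transaction matrix
x_a = G @ x                        # aggregated account totals
A_a = T_a / x_a                    # aggregated propensities (column-wise)
M_a = np.linalg.inv(np.eye(3) - A_a)

# Aggregation bias for an exogenous injection f: aggregate the
# disaggregated multiplier effects vs. apply the aggregated model.
f = np.array([10.0, 0.0, 0.0, 0.0])
bias = G @ (M @ f) - M_a @ (G @ f)
```

Consistency in aggregation corresponds to this bias vanishing for every injection f.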

Relevance:

20.00%

Abstract:

Worm burdens recovered from inbred mouse strains, namely C57Bl/6, C57Bl/10, CBA, BALB/c, DBA/2 and C3H/He, conventionally maintained in two institutional animal houses in the State of Rio de Janeiro, RJ, Brazil, were analyzed and compared with regard to their prevalences and mean intensities. Three parasite species were observed: the nematodes Aspiculuris tetraptera and Syphacia obvelata, and the cestode Vampirolepis nana. A modification of the anal swab technique is also proposed, for the first time, as an auxiliary tool for the detection of oxyurid eggs in mice.

Relevance:

20.00%

Abstract:

Graph pebbling is a network model for studying whether a given supply of discrete pebbles can satisfy a given demand via pebbling moves. A pebbling move across an edge of a graph takes two pebbles from one endpoint and places one pebble at the other endpoint; the other pebble is lost in transit as a toll. It has been shown that deciding whether a supply can meet a demand on a graph is NP-complete. The pebbling number of a graph is the smallest t such that every supply of t pebbles can satisfy every demand of one pebble. Deciding whether the pebbling number is at most k is Π₂P-complete. In this paper we develop a tool, called the Weight Function Lemma, for computing upper bounds and sometimes exact values for pebbling numbers with the assistance of linear optimization. With this tool we are able to calculate the pebbling numbers of much larger graphs than was possible with previous algorithms, and much more quickly as well. We also obtain results for many families of graphs, in many cases by hand, with much simpler and remarkably shorter proofs than those in previously existing arguments (with certificates typically of size at most the number of vertices times the maximum degree), especially for highly symmetric graphs. Here we apply the Weight Function Lemma to several specific graphs, including the Petersen, Lemke, 4th weak Bruhat, Lemke squared, and two random graphs, as well as to a number of infinite families of graphs, such as trees, cycles, graph powers of cycles, cubes, and some generalized Petersen and Coxeter graphs. This partly answers a question of Pachter et al. by computing the pebbling exponent of cycles to within an asymptotically small range. It is conceivable that this method yields an approximation algorithm for graph pebbling.
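The definitions above (pebbling move, pebbling number) can be made concrete with a brute-force search over pebble distributions on tiny graphs. This is not the paper's Weight Function Lemma method, only a direct illustration of the definitions; the exponential state space limits it to very small examples.

```python
from itertools import combinations_with_replacement

def solvable(adj, dist, target):
    """Can some sequence of pebbling moves (remove 2 pebbles from a
    vertex u, add 1 to a neighbour v) put a pebble on `target`?"""
    seen = set()
    stack = [dist]
    while stack:
        d = stack.pop()
        if d[target] >= 1:
            return True
        if d in seen:
            continue
        seen.add(d)
        for u, cnt in enumerate(d):
            if cnt >= 2:
                for v in adj[u]:
                    nd = list(d)
                    nd[u] -= 2
                    nd[v] += 1
                    stack.append(tuple(nd))
    return False

def pebbling_number(adj):
    """Smallest t such that every placement of t pebbles can satisfy
    every demand of one pebble (brute force; tiny graphs only)."""
    n = len(adj)
    t = 1
    while True:
        if all(
            solvable(adj, tuple(place.count(v) for v in range(n)), r)
            for place in combinations_with_replacement(range(n), t)
            for r in range(n)
        ):
            return t
        t += 1

# Path on three vertices: pebbles halve per edge, so pi(P3) = 2^2 = 4.
path3 = [[1], [0, 2], [1]]
# Triangle K3: pi(K3) = 3.
triangle = [[1, 2], [0, 2], [0, 1]]
```

For example, `solvable(path3, (4, 0, 0), 2)` holds because two moves send 4 pebbles at one end to 1 pebble at the other, while 3 pebbles at that end are not enough.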

Relevance:

20.00%

Abstract:

It is becoming increasingly clear that the cell nucleus is a highly structured organelle. Because of its tight compartmentalization, it is generally believed that a framework must exist that is responsible for maintaining such spatial organization. Over the last twenty years many investigations have been devoted to identifying this nuclear framework. Structures isolated by different techniques have been obtained in vitro and are variously referred to as the nuclear matrix, nucleoskeleton or nuclear scaffold. Many different functions, such as DNA replication and repair, and mRNA transcription, processing and transport, have been described as occurring in close association with these structures. However, there is still much debate as to whether any of these preparations corresponds to a nuclear framework that exists in vivo. In this article we summarize the most commonly used methods for obtaining preparations of nuclear frameworks, and we also stress the possible artifacts that can be created in vitro during the isolation procedures. Emphasis is also placed on the protein composition of the frameworks, as well as on some possible signalling functions that have recently been described to occur in tight association with the nuclear matrix.

Relevance:

20.00%

Abstract:

Sixty-eight patients with localized cutaneous leishmaniasis from an area with Leishmania (Viannia) braziliensis transmission had cultures performed with a modified Marzochi's vacuum aspiratory puncture technique, in order to establish the sensitivity and contamination rate of this new method. The overall sensitivity of three aspirates was 47.1% (95% CI 39.4-59.4), significantly greater than the sensitivity of a single aspirate. Fungal contamination was observed in 6/204 (2.9%) inoculated culture tubes. We recommend that this useful technique be adopted as routine for primary isolation of L. (V.) braziliensis from localized cutaneous ulcers.
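The gain from culturing three aspirates rather than one can be illustrated under a simplifying independence assumption. The per-aspirate sensitivity below is hypothetical (the abstract reports only the combined figure), chosen so that three independent aspirates land near the reported 47.1%.

```python
def combined_sensitivity(p, n):
    """If each aspirate independently detects the parasite with
    probability p, at least one of n aspirates is positive with
    probability 1 - (1 - p)^n (independence is an assumption)."""
    return 1 - (1 - p) ** n

# Hypothetical single-aspirate sensitivity; three aspirates then give
# roughly the overall 47.1% reported for the series.
p_single = 0.19
three_aspirates = combined_sensitivity(p_single, 3)
```

Here `combined_sensitivity(0.19, 3)` is about 0.469, showing why pooling three aspirates substantially outperforms a single one.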

Relevance:

20.00%

Abstract:

Histological, ultrastructural, morphometric and immunohistochemical data obtained from the study of spleens removed by splenectomy from 34 patients with advanced hepatosplenic schistosomiasis revealed that the main alterations were congestive dilatation of the venous sinuses and diffuse thickening of the splenic cords. Splenic cord thickening was due to an increase in its matrix components, especially type IV collagen and laminin, with a conspicuous absence of interstitial collagens, of either type I or type III. Deposition of interstitial collagens (types I and III) occurred in scattered, small focal areas of the red pulp, as well as on the outside of the walls of the venous sinuses, in lymph follicles, in the marginal zone, in the vicinity of fibrous trabeculae, and in sidero-sclerotic nodules. However, fibrosis was not a prominent change in schistosomal splenomegaly, and thus the designation "fibro-congestive splenomegaly" seems inadequate. Lymph follicles exhibited variable degrees of atrophy, hyperplasia and fibrous replacement, sometimes all of them seen in different follicles of the same spleen and even in the same examined section. Changes in the white pulp did not seem to contribute greatly to the increase in spleen size and weight, when compared to the much more significant red pulp enlargement.

Relevance:

20.00%

Abstract:

Between 1985 and 1990 we treated 11 large segmental bone defects (average 6.7 cm) in ten patients with the Ilizarov technique. Open fractures, type III according to Gustilo, represented the largest group (8 of 11 cases). The average delay before Ilizarov treatment was initiated was 8.9 months. The external fixator was usually maintained for one year. Bone regeneration was obtained in every case, although consolidation was not achieved with this technique in three cases. The complications observed were one refracture, four leg-length discrepancies (average 1.5 cm), and five axial deformities exceeding 5 degrees. No pin-track infection was observed. In our limited series of four type IIIC open fractures treated by the Ilizarov technique, no patient required amputation. The Ilizarov technique is particularly useful in the treatment of large bone defects, without major complications, especially if there is adequate initial debridement.

Relevance:

20.00%

Abstract:

Biomphalaria tenagophila, one of the intermediate hosts of the trematode Schistosoma mansoni, is a simultaneously hermaphroditic snail species. In order to analyse the genetic structure of its populations, we performed a double-stringency PCR technique to obtain genetic markers with microsatellites and arbitrary primers in a single reaction.

Relevance:

20.00%

Abstract:

We report on a consanguineous Afghan family with two sisters affected by characteristic facial features, multiple contractures, progressive joint and skin laxity, hemorrhagic diathesis following minor trauma, and multisystem fragility-related manifestations suggestive of a diagnosis of musculocontractural Ehlers-Danlos syndrome (EDS). This novel form of connective tissue disorder was recently reported in patients of Japanese, Turkish, and Indian descent who were formerly classified as having EDS type VIB, and it is now recognized as part of a spectrum that includes patients previously classified as having adducted thumb-clubfoot syndrome. We identified a previously unreported mutation in the CHST14 gene, which codes for the enzyme dermatan 4-O-sulfotransferase. We discuss the prenatal presentation, detailed clinical manifestations, and neurological findings in the two sisters with this newly described musculocontractural EDS-CHST14 type. We demonstrate that fibroblasts from one of our patients produce more chondroitin sulfate than normal and show lower-than-normal deposition of collagens I and II and of fibrillin 1-containing microfibrils. These findings suggest that the imbalance in glycosaminoglycan content in developing tissues might interfere with the normal deposition of other extracellular matrix components and ultimately contribute to the development of the phenotype observed in these patients. Furthermore, we ruled out the contribution of intrinsic platelet factors to the bleeding diathesis observed in some affected individuals. © 2012 Wiley Periodicals, Inc.

Relevance:

20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
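The two-step logic (downscale the geophysical data, then simulate the hydraulic parameter conditional on it) can be sketched with a toy one-dimensional example. All values here are synthetic and hypothetical; simple Gaussian draws stand in for the full Bayesian sequential simulation of the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: collocated fine-scale log-electrical conductivity
# (sigma) and log-hydraulic conductivity (logK) "borehole" data, plus a
# coarse ERT-style estimate formed as a local block average of sigma.
n, block = 400, 10
sigma_fine = rng.normal(0.0, 1.0, n)
logK_fine = 0.8 * sigma_fine + rng.normal(0.0, 0.4, n)  # assumed petrophysical link
sigma_coarse = sigma_fine.reshape(-1, block).mean(axis=1)  # low-resolution analogue

# Step 1 (downscaling): simulate fine-scale sigma consistent with each
# coarse block value: coarse mean plus within-block residual variability.
resid_sd = np.std(sigma_fine.reshape(-1, block) - sigma_coarse[:, None])
sigma_down = np.repeat(sigma_coarse, block) + rng.normal(0.0, resid_sd, n)

# Step 2: simulate logK conditional on the downscaled sigma, using the
# relationship fitted on the collocated fine-scale data.
slope, intercept = np.polyfit(sigma_fine, logK_fine, 1)
cond_sd = np.std(logK_fine - (slope * sigma_fine + intercept))
logK_sim = slope * sigma_down + intercept + rng.normal(0.0, cond_sd, n)
```

The simulated field honours the large-scale structure carried by the coarse data while restoring fine-scale variability, which is the essence of the approach described above.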
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
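A minimal sketch of the gradual-deformation proposal described above, for an uncorrelated standard-normal prior (in the thesis the prior is a correlated Gaussian field and the proposal sits inside an MCMC accept/reject loop; the function name and parameterization here are illustrative):

```python
import numpy as np

def gradual_deformation(m_current, theta, rng):
    """Propose a new model by combining the current Gaussian realization
    with an independent one. Since cos^2 + sin^2 = 1, the proposal has
    the same Gaussian distribution as the prior; theta controls the
    perturbation strength (theta -> 0: tiny step; theta = pi/2: an
    entirely independent draw)."""
    m_indep = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_indep

rng = np.random.default_rng(1)
m = rng.standard_normal(1000)          # current model realization
m_new = gradual_deformation(m, theta=0.1, rng=rng)  # small perturbation
```

Because theta tunes the step size continuously, the chain can take many small, highly correlated steps or occasional near-independent jumps, which is what makes this proposal effective at reducing the iterations needed for independent posterior samples.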