726 results for "visualize"
Abstract:
The effects of the thyroid hormones on target cells are mediated through nuclear T3 receptors. In the peripheral nervous system, nuclear T3 receptors were previously detected with the monoclonal antibody 2B3 in all the primary sensory neurons throughout neuronal life, and in peripheral glia at the perinatal period only (Eur. J. Neurosci. 5, 319, 1993). To determine whether these nuclear T3 receptors correspond to functional receptors able to bind T3, cryostat sections and in vitro cell cultures of dorsal root ganglion (DRG) or sciatic nerve were incubated with 0.1 nM [125I]-labeled T3, either alone, to visualize the total T3-binding sites, or together with a 10³-fold excess of unlabeled T3, to estimate the contribution of non-specific T3 binding. After glutaraldehyde fixation, radioautography showed that the specific T3-binding sites were largely prevalent. The T3-binding capacity of peripheral glia in DRG and sciatic nerve was restricted to the perinatal period in vivo and to Schwann cells cultured in vitro. In all the primary sensory neurons, specific T3-binding sites were detected in foetal as well as adult rats. The detection of the T3-binding sites in the nucleus indicated that the nuclear T3 receptors are functional. Moreover, the concomitant presence of both T3-binding sites and T3 receptor alpha isoforms in the perikaryon of DRG neurons suggests that: 1) [125I]-labeled T3 can be retained on the T3-binding 'E' domain of nascent alpha 1 isoform molecules newly synthesized on the perikaryal ribosomes; 2) the alpha isoforms translocated to the nucleus are modified by posttranslational changes and finally recognized by the 2B3 mAb as nuclear T3 receptors. In conclusion, the radioautographic visualization of the T3-binding sites in peripheral neurons and glia confirms that the nuclear T3 receptors are functional and helps to clarify the discordant intracellular localizations provided by the immunocytochemical detection of nuclear T3 receptors and T3 receptor alpha isoforms.
Abstract:
General Summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, comprising the last three chapters, deals with economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and the fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but about how to measure it, and it was just a great pleasure to work on. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH, comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE, comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, 29 Southern and 19 Northern, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects that go in the expected, opposite directions and are of similar magnitude. However, when looking at world trade, the effects become very small because of the high share of North-North trade, for which we have no a priori expectations about the signs of these effects. Therefore, popular fears about the trade effects of differences in environmental regulations might be exaggerated. The second chapter is entitled "Is trade bad for the Environment? Decomposing worldwide SO2 emissions, 1990-2000". First, we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose the worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects).
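To make the terminology concrete, the standard identity behind such a decomposition can be sketched as follows (a generic textbook formulation; the exact weighting scheme used in the chapter may differ). With sector (or country-sector) output $x_i$, emission intensity $e_i$, total output $X=\sum_i x_i$ and output shares $\theta_i = x_i/X$, total emissions are

\[ E \;=\; \sum_i x_i e_i \;=\; X \sum_i \theta_i e_i , \]

and the change between two periods splits approximately into

\[ \frac{\Delta E}{E} \;\approx\; \underbrace{\frac{\Delta X}{X}}_{\text{scale}} \;+\; \underbrace{\sum_i \frac{x_i e_i}{E}\,\frac{\Delta \theta_i}{\theta_i}}_{\text{composition}} \;+\; \underbrace{\sum_i \frac{x_i e_i}{E}\,\frac{\Delta e_i}{e_i}}_{\text{technique}} . \]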
We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, the effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing actual emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (subject to country-employment and world industry-production constraints). Using linear programming techniques, we show that actual emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework by Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to write present productivity formally as a function of past productivity and of other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new estimates of the order of magnitude of the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
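To make the "center of mass" idea from the fifth chapter concrete, the sketch below computes a weighted centroid of city locations on the sphere and writes it to a KML file that Google Earth can open. It is a minimal illustration, not the thesis code: the city list, coordinates and economic weights are invented placeholders, and the thesis works with a much larger set of cities and economic data.

```python
# Minimal sketch of a "center of gravity" computation on the sphere.
# The city list and weights below are hypothetical placeholders.
import math

cities = [
    # (name, latitude, longitude, economic weight) -- illustrative only
    ("New York", 40.7, -74.0, 1.00),
    ("London",   51.5,  -0.1, 0.60),
    ("Tokyo",    35.7, 139.7, 0.90),
    ("Shanghai", 31.2, 121.5, 0.50),
    ("Mumbai",   19.1,  72.9, 0.25),
]

def to_xyz(lat, lon):
    """Unit vector on the Earth's surface for a lat/lon pair in degrees."""
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))

# Weighted 3-D center of mass (it lies inside the Earth) ...
vecs = [(w, to_xyz(lat, lon)) for _, lat, lon, w in cities]
wsum = sum(w for w, _ in vecs)
cx, cy, cz = (sum(w * v[i] for w, v in vecs) / wsum for i in range(3))

# ... projected back to the surface to obtain a plottable location.
norm = math.sqrt(cx * cx + cy * cy + cz * cz)
lat_c = math.degrees(math.asin(cz / norm))
lon_c = math.degrees(math.atan2(cy, cx))

# Minimal KML so the point can be opened directly in Google Earth.
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Economic center of gravity (illustrative)</name>
    <Point><coordinates>{lon_c:.4f},{lat_c:.4f},0</coordinates></Point>
  </Placemark>
</kml>"""
with open("center_of_gravity.kml", "w") as f:
    f.write(kml)
print(f"center of gravity: lat={lat_c:.2f}, lon={lon_c:.2f}")
```

Projecting the interior center of mass back to the surface is one common convention; the distance of the unprojected point from the Earth's centre can also be read as a rough measure of how spatially concentrated economic activity is.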
Abstract:
The graphical representation of spatial soil properties in a digital environment is complex because it requires the conversion of data collected in a discrete form onto a continuous surface. The objective of this study was to apply three-dimensional interpolation and visualization techniques to soil texture and fertility properties and to establish relationships with pedogenetic factors and processes in a slope area. The GRASS Geographic Information System was used to generate the three-dimensional models and the ParaView software to visualize soil volumes. Samples of the A, AB, BA, and B horizons were collected in a regular 122-point grid in an area of 13 ha in Pinhais, PR, in southern Brazil. Geoprocessing and graphic computing techniques were effective in identifying and delimiting soil volumes of distinct ranges of fertility properties confined within the soil matrix. Both the three-dimensional interpolation and the visualization tool facilitated the interpretation, in a continuous space (volumes), of the cause-effect relationships between soil texture and fertility properties and pedological factors and processes, such as higher clay contents following the drainage lines of the area. The flattest part, with more weathered soils (Oxisols), had the highest pH values and the lowest Al3+ concentrations. These techniques of data interpolation and visualization have great potential for use in diverse areas of soil science, such as the identification of soil volumes occurring side by side but exhibiting different physical, chemical, and mineralogical conditions for plant root growth, and the monitoring of plumes of organic and inorganic pollutants in soils and sediments, among other applications. The methodological details for the interpolation and three-dimensional viewing of soil data are presented here.
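To illustrate the kind of workflow described above, the sketch below interpolates scattered soil-property samples onto a regular three-dimensional grid and writes a legacy-format VTK file that ParaView can open as a volume. It is an illustrative stand-in for the GRASS GIS + ParaView toolchain actually used in the study: the grid dimensions, the "clay" property and the randomly generated sample points are placeholders.

```python
# Sketch: scattered 3-D soil samples -> regular grid -> VTK volume for ParaView.
# All sample data below are synthetic placeholders.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
# x, y in metres across the plot; z = depth (m); value = e.g. clay content (%)
samples_xyz = rng.uniform([0, 0, 0.0], [400, 330, 1.2], size=(122 * 4, 3))
clay = 20 + 30 * samples_xyz[:, 2] + rng.normal(0, 3, len(samples_xyz))

# Regular 3-D grid covering the sampled volume.
nx, ny, nz = 40, 33, 12
gx, gy, gz = np.meshgrid(
    np.linspace(0, 400, nx),
    np.linspace(0, 330, ny),
    np.linspace(0, 1.2, nz),
    indexing="ij",
)
grid = griddata(samples_xyz, clay, (gx, gy, gz), method="nearest")

# Write a legacy-format VTK structured-points file (readable by ParaView).
with open("clay_volume.vtk", "w") as f:
    f.write("# vtk DataFile Version 3.0\n")
    f.write("interpolated soil property (illustrative)\nASCII\n")
    f.write("DATASET STRUCTURED_POINTS\n")
    f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
    f.write("ORIGIN 0 0 0\n")
    f.write(f"SPACING {400/(nx-1):.3f} {330/(ny-1):.3f} {1.2/(nz-1):.4f}\n")
    f.write(f"POINT_DATA {nx * ny * nz}\n")
    f.write("SCALARS clay_percent float 1\nLOOKUP_TABLE default\n")
    # VTK expects the x index to vary fastest, then y, then z.
    np.savetxt(f, grid.transpose(2, 1, 0).ravel(), fmt="%.3f")
```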
Abstract:
Performing a post-mortem multidetector CT (MDCT) scan has already become routine in some institutes of forensic medicine. To better visualize the vascular system, different techniques of post-mortem CT-angiography (PMCTA) have been explored, which can essentially be divided into partial- and whole-body angiography techniques. Probably the most frequently applied technique today is the so-called multi-phase post-mortem CT-angiography (MPMCTA), a standardized method for investigating the vessels of the head, thorax and abdomen. Several studies describe its use for medicolegal investigations, its advantages, and its artefacts and pitfalls. With the aim of investigating the performance of PMCTA and of developing and validating techniques, an international working group, the "Technical Working Group Post-mortem Angiography Methods" (TWGPAM), was created in 2012. Beyond this primary purpose, the goals of the group include creating recommendations for the indications for the investigation and for the interpretation of the images, and distributing knowledge about PMCTA. This article provides an overview of the different approaches that have been developed and tested in recent years and an update on ongoing research in this field. It explains the technique of MPMCTA in detail and gives an outline of its indications, application, advantages and limitations.
Abstract:
Investigation of violent death, especially cases of sharp trauma and gunshot, is an important part of medico-legal work. Besides conventional autopsy, post-mortem Multi-Detector Computed Tomography (MDCT) scanning has become a highly appreciated tool. In order to also investigate the vascular system, post-mortem CT-angiography has been introduced. The most studied and widespread technique is multi-phase post-mortem CT-angiography (MPMCTA). Its sensitivity for detecting vascular lesions is even superior to that of conventional autopsy. The application of MPMCTA to cases of gunshot and sharp trauma is therefore an obvious choice, as vascular lesions are common in such victims. In most cases of sharp trauma and in several cases of gunshot, death can be attributed to exsanguination. MPMCTA is able to detect the exact source of bleeding and also to visualize trajectories, which are of utmost importance in these cases. The reconstructed images allow the trajectory to be visualized clearly, in a way that is easily comprehensible for legal professionals without medical training. The sensitivity of MPMCTA for soft tissue and organ lesions approximately matches that of conventional autopsy. However, special care, experience and effective use of the imaging software are necessary for performing the trajectory reconstructions. Large, space-occupying haemorrhages and displacement of internal organs are sources of error and misinterpretation. This presentation gives an overview of the advantages and limitations of the use of MPMCTA for investigating cases of gunshot and sharp trauma.
Abstract:
A higher risk of long-term behavioral and emotional sequelae, with attentional problems (with or without hyperactivity), is now becoming one of the hallmarks of extremely preterm (EP) birth and of birth after pregnancy conditions leading to intrauterine growth restriction (IUGR) [1,2]. However, little is known so far about the neurostructural basis of these complex functional brain abnormalities, which seem to have their origins in early critical periods of brain development. The development of cortical axonal pathways happens in a series of sequential events. The preterm phase (24-36 post-conceptional weeks, PCW) is known to be crucial for the growth of the thalamocortical fiber bundles as well as for the development of long association, commissural and projection fibers [3]. It is therefore logical to expect that exposure to an altered intrauterine environment (altered nutrition), or to the extrauterine environment earlier than expected, leads to alterations in structural organization and, consequently, alters the underlying white matter (WM) structure. Understanding the rate and variability of normal brain development, and detecting differences from typical development, may offer insight into the neurodevelopmental anomalies that can be imaged at later stages. Owing to its unique ability to non-invasively visualize and quantify white matter tracts in the brain in vivo, we used diffusion MRI (dMRI) tractography in this study to derive brain graphs [4,5,6]. This relatively simple way of modeling the brain enables us to use graph theory to study the topological properties of brain graphs and thereby the effects of EP birth and IUGR on children's brain connectivity at the age of 6 years.
Abstract:
Current limitations of coronary magnetic resonance angiography (MRA) include a suboptimal signal-to-noise ratio (SNR), which limits spatial resolution and the ability to visualize distal and branch vessel coronary segments. Improved SNR is expected at higher field strengths, which may provide improved spatial resolution. However, a number of potential adverse effects on image quality have been reported at higher field strengths. The limited availability of high-field systems equipped with cardiac-specific hardware and software has previously precluded successful in vivo human high-field coronary MRA data acquisition. In the present study we investigated the feasibility of human coronary MRA at 3.0 T in vivo. The first results obtained in nine healthy adult subjects are presented.
Abstract:
Introduction: Accurate and reproducible tibial tunnel placement minimizing the risk of neurovascular damage is a crucial condition for successful arthroscopic reconstruction of the posterior cruciate ligament (PCL). This step is commonly performed under fluoroscopic control. Hypothesis: Drilling the tibial tunnel under exclusively arthroscopic control allows accurate and reliable tunnel placement according to the recommendations in the literature. Materials and Methods: Between February 2007 and December 2009, 108 arthroscopic single-bundle PCL reconstructions using the tibial tunnel technique were performed. The routine postoperative radiographs were screened according to previously defined quality criteria. After critical analysis, the radiographs of 48 patients (48 knees) were enrolled in the study. Ten patients had simultaneous ACL reconstruction and 7 had PCL revision surgery. The tibial tunnel was placed under direct arthroscopic control through a posteromedial portal using a standard tibial aiming device. Key anatomical landmarks were the exposed tibial insertion of the PCL and the posterior horn of the medial meniscus. First, the centre of the posterior tibial tunnel outlet on the a-p view was determined by digital analysis of the postoperative radiographs. Its distance to the medial tibial spine was measured parallel to the tibial plateau. The mediolateral position was expressed as the ratio between the distance of the tunnel outlet to the medial border and the total width of the tibial plateau. On the lateral view, the vertical tunnel position was measured perpendicular to a tangent of the medial tibial plateau. All measurements were repeated at least twice and carried out by two examiners. Results: The mean mediolateral tunnel position was 49.3 ± 4.6% (ratio), 6.7 ± 3.6 mm lateral to the medial tibial spine. On the lateral view, the tunnel centre was 10.1 ± 4.5 mm distal to the bony surface of the medial tibial plateau. Neurovascular damage was observed in none of the patients. Conclusion: The results of this radiological study confirm that exclusively arthroscopic control of tibial tunnel placement in PCL reconstruction yields reproducible and accurate results in accordance with the literature. Our technique avoids radiation, simplifies the operating room setup and enables the surgeon to visualize the key anatomical landmarks for tibial tunnel placement.
Abstract:
Although physiological and pharmacological evidence suggests a role for angiotensin II (Ang II) in the mammalian heart, the source and precise location of Ang II are unknown. To visualize and quantitate Ang II in the atria, ventricular walls and interventricular septum of the rat and human heart, and to explore the feasibility of local Ang II production and function, we investigated by different methods the expression of proteins involved in the generation and function of Ang II. We found mRNA of angiotensinogen (Ang-N), of angiotensin-converting enzyme, of the angiotensin receptors AT(1A) and AT(2) (AT(1B) was not detected), as well as of cathepsin D, in all parts of the hearts. No renin mRNA was detectable. Ang-N mRNA was visualized by in situ hybridization in atrial ganglionic neurons. Ang II and dopamine-β-hydroxylase (DβH) were either colocalized inside the same neuronal cell, or the neurons were specialized for Ang II or DβH. Within these neurons, the vesicular acetylcholine transporter (VAChT) was colocalized with neither Ang II nor DβH, but VAChT staining was found in synapses en passant encircling these neuronal cells. The fibers containing Ang II formed presumably angiotensinergic synapses en passant with blood vessels and with cardiomyocytes. In the rat heart, the median right atrial Ang II concentration appeared higher than the septal and ventricular Ang II concentrations. The distinct colocalization of neuronal Ang II with DβH in the heart may indicate that Ang II participates together with norepinephrine in the regulation of cardiac functions: produced as a cardiac neurotransmitter, Ang II may have inotropic, chronotropic or dromotropic effects in atria and ventricles and may contribute to blood pressure regulation.
Abstract:
This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on the quantitative assessment of changes of field topographies across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference independent and thus render statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarity, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in CARTOOL, a freely available academic software package. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using three-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians for interpreting multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
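As an illustration of two of the reference-independent global measures mentioned above, the sketch below computes global field power (GFP) and global map dissimilarity as they are conventionally defined in the topographic-analysis literature. This is not CARTOOL code; the 64-channel "EEG" array is a random placeholder.

```python
# Global field power (GFP) and global map dissimilarity on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
eeg = rng.normal(size=(64, 500))          # 64 channels x 500 time points (placeholder)

# Average-reference the data: topographic measures assume zero-mean maps.
maps = eeg - eeg.mean(axis=0, keepdims=True)

# GFP: spatial standard deviation of the scalp map at each time point.
gfp = maps.std(axis=0)

def dissimilarity(map_a, map_b):
    """Global map dissimilarity between two maps (0 = identical topography,
    2 = inverted topography), computed on GFP-normalized, average-referenced maps."""
    a = (map_a - map_a.mean()) / map_a.std()
    b = (map_b - map_b.mean()) / map_b.std()
    return np.sqrt(np.mean((a - b) ** 2))

# Example: topographic change between two successive time points.
print("GFP at t=100:", gfp[100])
print("dissimilarity t=100 vs t=101:", dissimilarity(maps[:, 100], maps[:, 101]))
```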
Abstract:
PURPOSE: Atherosclerosis has a considerable medical and socioeconomic impact on society. We sought to evaluate novel magnetic resonance imaging (MRI) angiography and vessel wall sequences to visualize and quantify different morphologic stages of atherosclerosis in a Watanabe heritable hyperlipidemic (WHHL) rabbit model. MATERIAL AND METHODS: Aortic 3D steady-state free precession angiography and subrenal aortic 3D black-blood fast spin-echo vessel wall imaging before and after gadolinium (Gd) administration were performed in 14 WHHL rabbits (group 1: 3 normal; group 2: 6 on a high-cholesterol diet; group 3: 5 on a high-cholesterol diet plus endothelial denudation) on a commercial 1.5 T MR system. Angiographic lumen diameter, vessel wall thickness, signal- and contrast-to-noise ratios, total vessel area, lumen area, and vessel wall area were analyzed semiautomatically. RESULTS: Pre-Gd, both lumen and wall dimensions (total vessel area, lumen area, vessel wall area) of groups 2 and 3 were significantly increased compared with those of group 1 (all P < 0.01). Group 3 animals had significantly thicker vessel walls than groups 1 and 2 (P < 0.01), whereas angiographic lumen diameter was comparable among all groups. Post-Gd, only the diseased animals of groups 2 and 3 showed a significant (>100%) increase in signal-to-noise and contrast-to-noise ratios. CONCLUSIONS: The combination of novel 3D magnetic resonance angiography and high-resolution 3D vessel wall MRI enabled quantitative characterization of various atherosclerotic stages, including positive arterial remodeling and Gd uptake, in a WHHL rabbit model using a commercially available 1.5 T MRI system.
Abstract:
PURPOSE: To describe the use of anterior segment optical coherence tomography (AS-OCT) to clarify the position and patency of aqueous shunt devices in the anterior chamber of eyes in which corneal edema or tube position does not permit a satisfactory view. DESIGN: Noncomparative observational case series. METHODS: Four cases are reported in which aqueous shunt malposition or obstruction was suspected but the shunt could not be seen on clinical examination. The patients underwent AS-OCT to identify the position and patency of the shunt tip. RESULTS: In each case, AS-OCT provided data regarding tube position and/or patency, not obtainable by slit-lamp examination or gonioscopy, that influenced management. CONCLUSIONS: AS-OCT can be used to visualize anterior chamber tubes in the presence of corneal edema that precludes an adequate view, or in cases where the tube is retracted into the cornea. In such cases, AS-OCT is useful in identifying shunt patency and position, which helps guide clinical decision making.
Abstract:
BACKGROUND: The goal of this study was to characterize the performance of fluorine-19 ((19)F) cardiac magnetic resonance (CMR) for the specific detection of inflammatory cells in a mouse model of myocarditis. Intravenously administered perfluorocarbons are taken up by infiltrating inflammatory cells and can be detected by (19)F-CMR. (19)F-labeled cells should, therefore, generate an exclusive signal at the inflamed regions within the myocardium. METHODS AND RESULTS: Experimental autoimmune myocarditis was induced in BALB/c mice. After intravenous injection of 2×200 µL of a perfluorocarbon on days 19 and 20 after immunization (n=9), in vivo (19)F-CMR was performed at the peak of myocardial inflammation (day 21). In 5 additional animals, perfluorocarbon combined with FITC (fluorescein isothiocyanate) was administered for postmortem immunofluorescence and flow-cytometry analyses. Control experiments were performed in 9 animals. In vivo (19)F-CMR detected myocardial inflammation in all experimental autoimmune myocarditis-positive animals. Its resolution was sufficient to identify even small inflammatory foci, such as those at the surface of the right ventricle. Postmortem immunohistochemistry and flow cytometry confirmed the presence of perfluorocarbon in macrophages, dendritic cells, and granulocytes, but not in lymphocytes. The myocardial volume of elevated (19)F signal (rs=0.96; P<0.001), the (19)F signal-to-noise ratio (rs=0.92; P<0.001), and the (19)F signal integral (rs=0.96; P<0.001) at day 21 correlated with the histological myocarditis severity score. CONCLUSIONS: In vivo (19)F-CMR was successfully used to visualize inflammation specifically and robustly in experimental autoimmune myocarditis, thus allowing an unprecedented insight into the involvement of inflammatory cells in the disease process.
Abstract:
Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008), yielding so-called connectomes (Hagmann 2005, Sporns et al, 2005). They exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports investigations of such connectome data by both clinical researchers and neuroscientists. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, which specifies networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to have access to a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, it is easy to add functionality for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib); more brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve more node information (ConnectomeWiki) and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Storage of arbitrary metadata for networks, thereby allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data are exposed and can be modified or used for further post-processing.
* Visualization pipelines composed of filters and modules built with Mayavi (Ramachandran et al, 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant data types and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
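As an illustration of the kind of NetworkX-based analysis that the plugin architecture makes possible, the sketch below thresholds a (randomly generated, placeholder) connectivity matrix and computes a few standard topological measures. It is not the ConnectomeViewer API, just the sort of script one might run in its Python shell.

```python
# Threshold a synthetic connectivity matrix and compute basic graph measures.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
n_rois = 66                                   # e.g. number of cortical parcellation ROIs
w = rng.random((n_rois, n_rois))
w = np.triu(w, 1) + np.triu(w, 1).T           # symmetric weights, zero diagonal

# Keep only the strongest edges, as an interactive viewer would when the user
# filters edges by a property such as fiber count or density.
threshold = np.percentile(w[w > 0], 80)
g = nx.Graph()
g.add_nodes_from(range(n_rois))
rows, cols = np.where(np.triu(w, 1) >= threshold)
g.add_weighted_edges_from((int(i), int(j), float(w[i, j])) for i, j in zip(rows, cols))

# A few standard topological measures used in connectomic studies.
print("edges kept:        ", g.number_of_edges())
print("average degree:    ", 2 * g.number_of_edges() / g.number_of_nodes())
print("average clustering:", nx.average_clustering(g))
if nx.is_connected(g):
    print("char. path length: ", nx.average_shortest_path_length(g))
```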
Abstract:
A headspace solid-phase microextraction (HS-SPME) procedure was developed for the profiling of traces present in 3,4-methylenedioxymethamphetamine (MDMA). Traces were first extracted by HS-SPME and then analyzed by gas chromatography-mass spectrometry (GC-MS). The HS-SPME conditions were optimized by varying the extraction parameters. Optimal results were obtained when 40 mg of crushed MDMA sample was heated at 80 °C for 15 min, followed by extraction at 80 °C for 15 min with a polydimethylsiloxane/divinylbenzene-coated fibre. A total of 31 compounds were identified as traces related to MDMA synthesis, namely precursors, intermediates or by-products. In addition, some fatty acids used as tabletting materials, and caffeine used as an adulterant, were also detected. The use of a restricted set of 10 target compounds was also proposed for developing a screening tool for clustering samples with similar profiles. A total of 114 seizures were analyzed using an SPME autosampler (MultiPurpose Sampler MPS2, purchased from Gerstel GmbH & Co., Germany) coupled to GC-MS. The data were handled using various pre-treatment methods, followed by the study of similarities between sample pairs based on the Pearson correlation. The results show that HS-SPME, coupled with a suitable statistical method, is a powerful tool for distinguishing between specimens coming from the same seizure and specimens coming from different seizures. This information can be used by law enforcement personnel to visualize the ecstasy distribution network as well as the clandestine tablet manufacturing.
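As a rough illustration of the profile-comparison step, the sketch below normalizes hypothetical peak areas of the 10 target compounds, computes pairwise Pearson correlations between seizures, and groups them by hierarchical clustering. The simulated data, the unit-sum pre-treatment and the average-linkage choice are assumptions for illustration, not the exact method of the study.

```python
# Pairwise Pearson comparison and clustering of synthetic impurity profiles.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# 114 seizures x 10 target-compound peak areas (simulated: two synthesis "batches")
profiles = np.vstack([
    rng.normal(loc=np.linspace(1, 10, 10), scale=0.5, size=(60, 10)),
    rng.normal(loc=np.linspace(10, 1, 10), scale=0.5, size=(54, 10)),
])

# Pre-treatment: normalize each profile to unit sum so only relative
# composition (not tablet mass) matters.
profiles = profiles / profiles.sum(axis=1, keepdims=True)

# Pairwise Pearson correlation between samples, converted to a distance.
corr = np.corrcoef(profiles)
dist = np.clip(1.0 - corr, 0.0, None)   # clip guards against tiny negative fp errors

# Hierarchical clustering on the condensed (upper-triangle) distance matrix.
condensed = dist[np.triu_indices_from(dist, k=1)]
labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```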