881 results for Large-scale Analysis
Abstract:
Stylolites are rough paired surfaces, indicative of localized stress-induced dissolution under a non-hydrostatic state of stress, separated by a clay parting which is believed to be the residuum of the dissolved rock. These structures are the most frequent deformation pattern in monomineralic rocks and thus provide important information about low-temperature deformation and mass transfer. The intriguing roughness of stylolites can be used to assess the amount of volume loss and paleo-stress directions, and to infer the destabilizing processes during pressure solution. But there is little agreement on how stylolites form and why these localized pressure solution patterns develop their characteristic roughness.

Natural bedding-parallel and vertical stylolites were studied in this work to obtain a quantitative description of the stylolite roughness and to understand the governing processes during their formation. By adapting scaling approaches based on fractal principles, it is demonstrated that stylolites show two self-affine scaling regimes with roughness exponents of 1.1 and 0.5 for small and large length scales, separated by a crossover length at the millimeter scale. Analysis of stylolites from various depths proved that this crossover length is a function of the stress field during formation, as analytically predicted. For bedding-parallel stylolites the crossover length is a function of the normal stress on the interface, but vertical stylolites show a clear in-plane anisotropy of the crossover length owing to the fact that the in-plane stresses (σ2 and σ3) are dissimilar. Stylolite roughness therefore contains a signature of the stress field during formation.

To address the origin of stylolite roughness, a combined microstructural (SEM/EBSD) and numerical approach is employed. Microstructural investigations of natural stylolites in limestones reveal that heterogeneities initially present in the host rock (clay particles, quartz grains) are responsible for the formation of the distinctive stylolite roughness. A two-dimensional numerical model, a discrete linear elastic lattice spring model, is used to investigate the roughness evolving from an initially flat, fluid-filled interface induced by heterogeneities in the matrix. This model generates rough interfaces with the same scaling properties as natural stylolites. Furthermore, two coinciding crossover phenomena in space and in time exist that separate the length scales and timescales for which the roughening is balanced by either surface or elastic energies. The roughness and growth exponents are independent of the size, amount, and dissolution rate of the heterogeneities. This leads to the conclusion that the location of asperities is determined by a polymict multi-scale quenched noise, while the roughening process is governed by inherent processes, i.e., the transition from a surface-energy to an elastic-energy dominated regime.
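The two-regime scaling described above can be illustrated with a short computation. The following sketch (Python; the input file, sampling interval, and the 1 mm crossover are placeholders, not values taken from the study) estimates roughness exponents from a digitized 1-D stylolite trace via the height-difference correlation function C(L) = sqrt(<(h(x+L) - h(x))^2>), which scales as L^H for a self-affine profile, so the log-log slope below and above the crossover length recovers the two exponents.

```python
import numpy as np

def height_correlation(h, dx=1.0, max_lag=None):
    """Height-difference correlation C(L) = sqrt(mean((h(x+L) - h(x))**2)).
    For a self-affine profile C(L) ~ L**H, so the slope of log C versus
    log L estimates the roughness (Hurst) exponent H."""
    h = np.asarray(h, dtype=float)
    max_lag = max_lag or len(h) // 4
    lags = np.arange(1, max_lag)
    C = np.array([np.sqrt(np.mean((h[lag:] - h[:-lag]) ** 2)) for lag in lags])
    return lags * dx, C

def roughness_exponent(L, C):
    """Least-squares slope of log C versus log L."""
    slope, _intercept = np.polyfit(np.log(L), np.log(C), 1)
    return slope

# Hypothetical usage on a trace sampled every 0.01 mm:
# h = np.loadtxt("stylolite_trace.txt")          # placeholder file name
# L, C = height_correlation(h, dx=0.01)          # L in mm
# small, large = L < 1.0, L >= 1.0               # assumed 1 mm crossover
# print(roughness_exponent(L[small], C[small]))  # near 1.1 at small scales
# print(roughness_exponent(L[large], C[large]))  # near 0.5 at large scales
```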
Abstract:
Salt marshes are coastal ecosystems in the upper intertidal zone between inland waters and the sea, and they are widespread throughout Italy, from Friuli Venezia Giulia in the north to Sicily in the south. These delicate environments are threatened by eutrophication, habitat conversion (for land reclamation or agriculture) and climate change impacts such as sea level rise. The objectives of my thesis were: 1) to analyse the distribution and biomass of the perennial native cordgrass Spartina maritima (one of the most relevant foundation species in the low intertidal saltmarsh vegetation of the study region) at 7 sites along the Northern Adriatic coast and relate them to critical environmental parameters, and 2) to carry out a nutrient manipulation experiment to detect nutrient enrichment effects on S. maritima biomass and vegetation characteristics. The survey showed significant differences among sites in biological response variables (live belowground biomass, live aboveground biomass, the above:belowground (R:S) biomass ratio, % cover, average height, and stem density), which were mainly related to differences in nitrate, nitrite and phosphate contents in surface water. Preliminary results from the experiment (which is still ongoing) have so far shown no significant effects of nutrient enrichment on live aboveground and belowground biomass, R:S ratio, leaf %Carbon, average height, stem density or random shoot height; however, a significantly higher (P=0.018) increase in leaf %Nitrogen content in treated plots indicated that nutrient uptake had occurred.
Abstract:
OBJECTIVE: To determine the effect of glucosamine, chondroitin, or the two in combination on joint pain and on radiological progression of disease in osteoarthritis of the hip or knee. DESIGN: Network meta-analysis. Direct comparisons within trials were combined with indirect evidence from other trials by using a Bayesian model that allowed the synthesis of multiple time points. MAIN OUTCOME MEASURE: Pain intensity. The secondary outcome was change in minimal width of joint space. The minimal clinically important difference between preparations and placebo was prespecified at -0.9 cm on a 10 cm visual analogue scale. DATA SOURCES: Electronic databases and conference proceedings from inception to June 2009, expert contact, and relevant websites. ELIGIBILITY CRITERIA FOR SELECTING STUDIES: Large-scale randomised controlled trials in more than 200 patients with osteoarthritis of the knee or hip that compared glucosamine, chondroitin, or their combination with placebo or head to head. RESULTS: 10 trials in 3803 patients were included. On a 10 cm visual analogue scale the overall difference in pain intensity compared with placebo was -0.4 cm (95% credible interval -0.7 to -0.1 cm) for glucosamine, -0.3 cm (-0.7 to 0.0 cm) for chondroitin, and -0.5 cm (-0.9 to 0.0 cm) for the combination. For none of the estimates did the 95% credible intervals cross the boundary of the minimal clinically important difference. Industry-independent trials showed smaller effects than commercially funded trials (P=0.02 for interaction). The differences in changes in minimal width of joint space were all minute, with 95% credible intervals overlapping zero. CONCLUSIONS: Compared with placebo, glucosamine, chondroitin, and their combination do not reduce joint pain or have an impact on narrowing of joint space. Health authorities and health insurers should not cover the costs of these preparations, and new prescriptions to patients who have not received treatment should be discouraged.
Abstract:
OBJECTIVE: To analyse the available evidence on cardiovascular safety of non-steroidal anti-inflammatory drugs. DESIGN: Network meta-analysis. DATA SOURCES: Bibliographic databases, conference proceedings, study registers, the Food and Drug Administration website, reference lists of relevant articles, and reports citing relevant articles through the Science Citation Index (last update July 2009). Manufacturers of celecoxib and lumiracoxib provided additional data. STUDY SELECTION: All large-scale randomised controlled trials comparing any non-steroidal anti-inflammatory drug with other non-steroidal anti-inflammatory drugs or placebo. Two investigators independently assessed eligibility. DATA EXTRACTION: The primary outcome was myocardial infarction. Secondary outcomes included stroke, death from cardiovascular disease, and death from any cause. Two investigators independently extracted data. DATA SYNTHESIS: 31 trials in 116 429 patients with more than 115 000 patient years of follow-up were included. Patients were allocated to naproxen, ibuprofen, diclofenac, celecoxib, etoricoxib, rofecoxib, lumiracoxib, or placebo. Compared with placebo, rofecoxib was associated with the highest risk of myocardial infarction (rate ratio 2.12, 95% credibility interval 1.26 to 3.56), followed by lumiracoxib (2.00, 0.71 to 6.21). Ibuprofen was associated with the highest risk of stroke (3.36, 1.00 to 11.6), followed by diclofenac (2.86, 1.09 to 8.36). Etoricoxib (4.07, 1.23 to 15.7) and diclofenac (3.98, 1.48 to 12.7) were associated with the highest risk of cardiovascular death. CONCLUSIONS: Although uncertainty remains, little evidence exists to suggest that any of the investigated drugs are safe in cardiovascular terms. Naproxen seemed least harmful. Cardiovascular risk needs to be taken into account when prescribing any non-steroidal anti-inflammatory drug.
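A full Bayesian network meta-analysis is beyond a short example, but the basic quantity reported above, a rate ratio for an outcome such as myocardial infarction, can be sketched from event counts and patient-years of follow-up. The sketch below (Python) uses a classical Wald-type interval on the log scale as a frequentist stand-in for the study's Bayesian credibility intervals; all counts are hypothetical.

```python
import math

def rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Rate ratio of group A versus group B with an approximate 95% CI
    computed on the log scale (Poisson assumption)."""
    rr = (events_a / pyears_a) / (events_b / pyears_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    return rr, (rr * math.exp(-1.96 * se_log), rr * math.exp(1.96 * se_log))

# Hypothetical counts: 40 infarctions over 10,000 patient years on a drug
# versus 20 over 10,000 patient years on placebo.
rr, (lo, hi) = rate_ratio(40, 10_000, 20, 10_000)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```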
Abstract:
BACKGROUND: Previous meta-analyses described moderate to large benefits of chondroitin in patients with osteoarthritis. However, recent large-scale trials did not find evidence of an effect. PURPOSE: To determine the effects of chondroitin on pain in patients with osteoarthritis. DATA SOURCES: The authors searched the Cochrane Central Register of Controlled Trials (1970 to 2006), MEDLINE (1966 to 2006), EMBASE (1980 to 2006), CINAHL (1970 to 2006), and conference proceedings; checked reference lists; and contacted authors. The last update of searches was performed on 30 November 2006. STUDY SELECTION: Studies were included if they were randomized or quasi-randomized, controlled trials that compared chondroitin with placebo or with no treatment in patients with osteoarthritis of the knee or hip. There were no language restrictions. DATA EXTRACTION: The authors extracted data in duplicate. Effect sizes were calculated from the differences in means of pain-related outcomes between treatment and control groups at the end of the trial, divided by the pooled SD. Trials were combined by using random-effects meta-analysis. DATA SYNTHESIS: 20 trials (3846 patients) contributed to the meta-analysis, which revealed a high degree of heterogeneity among the trials (I² = 92%). Small trials, trials with unclear concealment of allocation, and trials that were not analyzed according to the intention-to-treat principle showed larger effects in favor of chondroitin than did the remaining trials. When the authors restricted the analysis to the 3 trials with large sample sizes and an intention-to-treat analysis, 40% of patients were included. This resulted in an effect size of -0.03 (95% CI, -0.13 to 0.07; I² = 0%) and corresponded to a difference of 0.6 mm on a 10-cm visual analogue scale. A meta-analysis of 12 trials showed a pooled relative risk of 0.99 (CI, 0.76 to 1.31) for any adverse event. LIMITATIONS: For 9 trials, the authors had to use approximations to calculate effect sizes. Trial quality was generally low, heterogeneity among the trials made initial interpretation of results difficult, and exploring sources of heterogeneity in meta-regression and stratified analyses may be unreliable. CONCLUSIONS: Large-scale, methodologically sound trials indicate that the symptomatic benefit of chondroitin is minimal or nonexistent. Use of chondroitin in routine clinical practice should therefore be discouraged.
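The abstract spells out the computation: each effect size is the difference in means between treatment and control divided by the pooled SD, and trials are combined by random-effects meta-analysis. A minimal sketch of both steps follows (Python); it uses DerSimonian-Laird pooling with an I² heterogeneity estimate, a standard choice that the paper does not necessarily match in every detail.

```python
import numpy as np

def effect_size(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (difference in means over the pooled SD)
    and its approximate sampling variance."""
    sd_pooled = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                        / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    var = (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))
    return d, var

def random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling with an I^2 estimate."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)
    df = len(effects) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```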
Abstract:
A protein in a biological sample is usually quantified by immunological techniques based on antibodies. Mass spectrometry offers alternative approaches that are not dependent on antibody affinity and avidity, protein isoforms, quaternary structures, or steric hindrance of antibody-antigen recognition in the case of multiprotein complexes. One approach is the use of stable isotope-labeled internal standards; another is the direct exploitation of mass spectrometric signals recorded by LC-MS/MS analysis of protein digests. Here we assessed the peptide match score summation index, based on probabilistic peptide scores calculated by the PHENYX protein identification engine, for absolute protein quantification in accordance with the protein abundance index as proposed by Mann and co-workers (Rappsilber, J., Ryder, U., Lamond, A. I., and Mann, M. (2002) Large-scale proteomic analysis of the human spliceosome. Genome Res. 12, 1231-1245). Using synthetic protein mixtures, we demonstrated that this approach works well, although proteins can have different response factors. Applied to high density lipoproteins (HDLs), this new approach compared favorably to alternative protein quantitation methods such as UV detection of protein peaks separated by capillary electrophoresis or quantitation of protein spots on SDS-PAGE. We compared the protein composition of a well-defined HDL density class isolated from plasma of seven hypercholesterolemia subjects having low or high HDL cholesterol with HDL from nine normolipidemia subjects. The quantitative protein patterns distinguished individuals according to the corresponding concentration and distribution of cholesterol from serum lipid measurements of the same samples and revealed that hypercholesterolemia in unrelated individuals is the result of different deficiencies. The presented approach is complementary to HDL lipid analysis; does not rely on complicated sample treatment, e.g. chemical reactions, or antibodies; and can be used for prospective clinical studies of larger patient groups.
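As a rough illustration of the two quantities named above, the peptide match score summation index and Mann and co-workers' protein abundance index, the following sketch (Python) aggregates per-peptide scores by protein and computes the observed-to-observable peptide ratio. The protein names and scores are hypothetical; a real pipeline would read search-engine exports (e.g. from PHENYX) rather than literal tuples.

```python
from collections import defaultdict

def peptide_match_score_summation(peptide_hits):
    """Sum probabilistic peptide scores per protein, a minimal sketch of
    the score-summation index; `peptide_hits` is an iterable of
    (protein_id, peptide_score) pairs."""
    totals = defaultdict(float)
    for protein, score in peptide_hits:
        totals[protein] += score
    return dict(totals)

def protein_abundance_index(observed_peptides, observable_peptides):
    """Protein abundance index: observed peptides divided by the number of
    theoretically observable peptides for that protein."""
    return observed_peptides / observable_peptides

# Hypothetical hits from one LC-MS/MS run:
hits = [("APOA1", 42.7), ("APOA1", 31.2), ("APOA2", 18.5)]
print(peptide_match_score_summation(hits))  # {'APOA1': 73.9, 'APOA2': 18.5}
print(protein_abundance_index(12, 30))      # 0.4
```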
Abstract:
BACKGROUND: The RUNX1 transcription factor gene is frequently mutated in sporadic myeloid and lymphoid leukemia through translocation, point mutation or amplification. It is also responsible for a familial platelet disorder with predisposition to acute myeloid leukemia (FPD-AML). The disruption of the largely unknown biological pathways controlled by RUNX1 is likely to be responsible for the development of leukemia. We have used multiple microarray platforms and bioinformatic techniques to help identify these biological pathways and thereby aid in understanding why RUNX1 mutations lead to leukemia. RESULTS: Here we report genes regulated either directly or indirectly by RUNX1, based on the study of gene expression profiles generated from 3 different human and mouse platforms. The platforms used were global gene expression profiling of: 1) cell lines with RUNX1 mutations from FPD-AML patients, 2) over-expression of RUNX1 and CBFbeta, and 3) Runx1 knockout mouse embryos, using either cDNA or Affymetrix microarrays. We observe that our datasets (lists of differentially expressed genes) significantly correlate with published microarray data from sporadic AML patients with mutations in either RUNX1 or its cofactor, CBFbeta. A number of biological processes were identified among the differentially expressed genes, and functional assays suggest that heterozygous RUNX1 point mutations in patients with FPD-AML impair cell proliferation, microtubule dynamics and possibly genetic stability. In addition, analysis of the regulatory regions of the differentially expressed genes has for the first time systematically identified numerous potential novel RUNX1 target genes. CONCLUSION: This work is the first large-scale study attempting to identify the genetic networks regulated by RUNX1, a master regulator in the development of the hematopoietic system and in leukemia. The biological pathways and target genes controlled by RUNX1 will be of considerable importance for understanding disease progression in both familial and sporadic leukemia, and carry therapeutic implications.
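One common way to quantify the reported agreement between lists of differentially expressed genes (not necessarily the statistic used in this study) is a one-sided hypergeometric overlap test, sketched below in Python with hypothetical list and universe sizes.

```python
from scipy.stats import hypergeom

def list_overlap_p(universe_size, list_a, list_b):
    """P-value for observing at least the given overlap between two gene
    lists drawn from a common universe of `universe_size` genes."""
    a, b = set(list_a), set(list_b)
    overlap = len(a & b)
    p = hypergeom.sf(overlap - 1, universe_size, len(a), len(b))
    return p, overlap

# Hypothetical usage: 15,000 genes on the array, two lists of a few hundred
# differentially expressed genes each; how surprising is their overlap?
# p, k = list_overlap_p(15_000, runx1_list, published_aml_list)
```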
Abstract:
This thesis comprises three life-cycle analysis (LCA) studies of manufacturing to determine cumulative energy demand (CED) and greenhouse gas (GHG) emissions. The methods proposed could reduce the environmental impact of three manufacturing processes by reducing their CED. First, industrial symbiosis is proposed and an LCA is performed on conventional 1 GW-scale hydrogenated amorphous silicon (a-Si:H)-based single junction and a-Si:H/microcrystalline-Si:H tandem cell solar PV manufacturing plants, and on such plants coupled to silane recycling plants. Using a recycling process that reduces silane loss from 85 to only 17 percent results in CED savings of 81,700 GJ and 290,000 GJ per year for single and tandem junction plants, respectively. This recycling process reduces the cost of raw silane by 68 percent, or approximately $22.6 and $79 million per year for a single and tandem 1 GW PV production facility, respectively. The results show the environmental benefits of silane recycling centered around a-Si:H-based PV manufacturing plants. Second, an open-source self-replicating rapid prototyper, or 3-D printer, the RepRap, has the potential to reduce the environmental impact of manufacturing polymer-based products using a distributed manufacturing paradigm; this impact is further minimized by the use of PV and improvements in PV manufacturing. Using 3-D printers for manufacturing provides the ability to ultra-customize products and to change fill composition, which increases material efficiency. An LCA was performed on three polymer-based products to determine the CED and GHG emissions of conventional large-scale production, which were compared with experimental measurements on a RepRap producing identical products in ABS and PLA. The results of this LCA study indicate that the CED of manufacturing polymer products can possibly be reduced using distributed manufacturing with existing 3-D printers at fill factors below 89 percent, and reduced even further with a solar photovoltaic system. The results indicate that the ability of RepRaps to vary fill has the potential to diminish the environmental impact of many products. Third, one additional way to improve the environmental performance of this distributed manufacturing system is to create the polymer filament feedstock for 3-D printers from post-consumer plastic bottles. An LCA was performed on the recycling of high density polyethylene (HDPE) using the RecycleBot. The results of the LCA showed that distributed recycling has a lower CED than even the best-case scenario for centralized recycling. If this process were applied to the HDPE currently recycled in the U.S., more than 100 million MJ of energy could be conserved per annum, along with significant reductions in GHG emissions. This presents a novel path to a future of distributed manufacturing suited for both the developed and developing world, with reduced environmental impact. From improving manufacturing in the photovoltaic industry with the use of recycling, to recycling and manufacturing plastic products within our own homes, each step reduces the impact on the environment. The three coupled projects presented here show a clear potential to reduce the environmental impact of manufacturing and other processes by implementing complementary systems, which have environmental benefits of their own, in order to achieve a compounding effect of reduced CED and GHG emissions.
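The silane figures quoted above follow from a simple steady-state mass balance, sketched below in Python. The loss fractions are the only inputs taken from the abstract; the quoted 68 percent cost reduction additionally reflects prices and recycling-plant burdens that this toy calculation does not model.

```python
def virgin_silane_demand(deposited_kg, loss_fraction):
    """Virgin silane purchased per `deposited_kg` of silane ending up in
    film when a fraction `loss_fraction` of the input gas is lost
    (simple mass balance; the full LCA includes much more detail)."""
    return deposited_kg / (1.0 - loss_fraction)

# Loss fractions from the abstract: 85% conventional, 17% with recycling.
conventional = virgin_silane_demand(1.0, 0.85)  # ~6.7 kg purchased per kg deposited
recycled = virgin_silane_demand(1.0, 0.17)      # ~1.2 kg purchased per kg deposited
print(f"virgin silane demand reduced by {1 - recycled / conventional:.0%}")
```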
Abstract:
ONTOLOGIES AND METHODS FOR INTEROPERABILITY OF ENGINEERING ANALYSIS MODELS (EAMS) IN AN E-DESIGN ENVIRONMENT. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Science, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test-bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational costs, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-base tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently shift between the beam and the shell element models during an optimization process to select the best analysis model for a given optimization design state (see the sketch below). In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second method is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained control of knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. Together these methods play a very efficient role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
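The model-switching idea at the heart of the knowledge-base tool can be caricatured in a few lines. The sketch below (Python) is purely illustrative: the function names, the convergence criterion, and the threshold are hypothetical stand-ins for the ontological reasoning implemented in FiPER.

```python
def select_analysis_model(relative_design_change, tolerance=0.05):
    """Toy stand-in for the knowledge-base reasoning: use the cheap
    beam-element model while the optimizer is still making large design
    changes, and switch to the high-fidelity shell-element model once
    successive changes fall below `tolerance` (hypothetical threshold)."""
    if relative_design_change > tolerance:
        return "beam_element_model"
    return "shell_element_model"

# Hypothetical optimization loop:
# for step in optimizer:
#     model = select_analysis_model(step.relative_design_change)
#     result = run_fea(model, step.design)  # hypothetical ANSYS wrapper
```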
Abstract:
OBJECTIVES: This study sought to assess the efficacy and safety of newer-generation drug-eluting stents (DES) compared with bare-metal stents (BMS) in an appropriately powered population of patients with ST-segment elevation myocardial infarction (STEMI). BACKGROUND: Among patients with STEMI, early-generation DES improved efficacy but not safety compared with BMS. Newer-generation DES, everolimus-eluting stents and biolimus A9-eluting stents, have been shown to improve clinical outcomes compared with early-generation DES. METHODS: Individual patient data for 2,665 STEMI patients enrolled in 2 large-scale randomized clinical trials comparing newer-generation DES with BMS were pooled: 1,326 patients received a newer-generation DES (everolimus-eluting stent or biolimus A9-eluting stent), whereas the remaining 1,329 patients received a BMS. Random-effects models were used to assess differences between the 2 groups for the device-oriented composite endpoint of cardiac death, target-vessel reinfarction, and target-lesion revascularization and for the patient-oriented composite endpoint of all-cause death, any infarction, and any revascularization at 1 year. RESULTS: Newer-generation DES substantially reduced the risk of the device-oriented composite endpoint compared with BMS at 1 year (relative risk [RR]: 0.58; 95% confidence interval [CI]: 0.43 to 0.79; p = 0.0004). Similarly, the risk of the patient-oriented composite endpoint was lower with newer-generation DES than with BMS (RR: 0.78; 95% CI: 0.63 to 0.96; p = 0.02). Differences in favor of newer-generation DES were driven by both a lower risk of repeat revascularization of the target lesion (RR: 0.33; 95% CI: 0.20 to 0.52; p < 0.0001) and a lower risk of target-vessel infarction (RR: 0.36; 95% CI: 0.14 to 0.92; p = 0.03). Newer-generation DES also reduced the risk of definite stent thrombosis (RR: 0.35; 95% CI: 0.16 to 0.75; p = 0.006) compared with BMS. CONCLUSIONS: Among patients with STEMI, newer-generation DES improve safety and efficacy compared with BMS throughout 1 year. It remains to be determined whether the differences in favor of newer-generation DES are sustained during long-term follow-up.
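For orientation, a relative risk of the kind reported above can be reproduced in form (not in substance: the study pooled individual patient data with random-effects models) from simple event counts. In the sketch below the group sizes come from the abstract, while the event counts are invented, chosen merely to land near the reported device-oriented point estimate.

```python
import math

def risk_ratio(events_t, n_t, events_c, n_c):
    """Relative risk with an approximate 95% CI on the log scale."""
    rr = (events_t / n_t) / (events_c / n_c)
    se_log = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    return rr, (rr * math.exp(-1.96 * se_log), rr * math.exp(1.96 * se_log))

# Group sizes from the abstract; event counts are illustrative only.
rr, (lo, hi) = risk_ratio(58, 1_326, 100, 1_329)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```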
Abstract:
The combination of scaled analogue experiments, material mechanics, X-ray computed tomography (XRCT) and digital volume correlation (DVC) techniques is a powerful new tool not only to examine the three-dimensional structure and kinematic evolution of complex deformation structures in scaled analogue experiments, but also to fully quantify their spatial strain distribution and complete strain history. Digital image correlation (DIC) is an important advance in quantitative physical modelling and helps to understand non-linear deformation processes. Optical, non-intrusive DIC techniques enable the quantification of localised and distributed deformation in analogue experiments based either on images taken through transparent sidewalls (2D DIC) or on surface views (3D DIC). XRCT analysis permits the non-destructive visualisation of the internal structure and kinematic evolution of scaled analogue experiments simulating the tectonic evolution of complex geological structures. The combination of XRCT sectional image data of analogue experiments with 2D DIC only allows quantification of 2D displacement and strain components in the section direction. This completely omits the potential of CT experiments for full 3D strain analysis of complex, non-cylindrical deformation structures. In this study, we apply digital volume correlation (DVC) techniques to XRCT scan data of “solid” analogue experiments to fully quantify the internal displacement and strain in three dimensions over time. Our first results indicate that the application of DVC techniques to XRCT volume data can successfully be used to quantify the 3D spatial and temporal strain patterns inside analogue experiments. We demonstrate the potential of combining DVC techniques and XRCT volume imaging for 3D strain analysis of a contractional experiment simulating the development of a non-cylindrical pop-up structure. Furthermore, we discuss various options for the optimisation of granular materials, pattern generation, and data acquisition for increased resolution and accuracy of the strain results. Three-dimensional strain analysis of analogue models is of particular interest for geological and seismic interpretations of complex, non-cylindrical geological structures. The volume strain data enable the analysis of the large-scale and small-scale strain history of geological structures.
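The core of DVC, finding for each correlation window the displacement that best maps a reference subvolume onto the deformed scan, can be sketched as a brute-force integer-voxel search that maximizes zero-normalized cross-correlation. Production DVC codes add subvoxel interpolation and derive strain tensors from the resulting displacement field; none of that is attempted in this sketch, and the window and search sizes are arbitrary.

```python
import numpy as np
from itertools import product

def znc(a, b):
    """Zero-normalized cross-correlation of two equally shaped subvolumes."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def dvc_displacement(ref, deformed, corner, size, search=5):
    """Integer-voxel displacement (dz, dy, dx) of one correlation window:
    exhaustively shift the window within +/- `search` voxels and keep the
    offset with the highest correlation."""
    z, y, x = corner
    sz, sy, sx = size
    template = ref[z:z + sz, y:y + sy, x:x + sx]
    best, best_off = -np.inf, (0, 0, 0)
    for dz, dy, dx in product(range(-search, search + 1), repeat=3):
        if min(z + dz, y + dy, x + dx) < 0:
            continue  # window would leave the volume
        cand = deformed[z + dz:z + dz + sz, y + dy:y + dy + sy, x + dx:x + dx + sx]
        if cand.shape != template.shape:
            continue  # window would leave the volume on the far side
        c = znc(template, cand)
        if c > best:
            best, best_off = c, (dz, dy, dx)
    return best_off, best
```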
Abstract:
The brain is a complex neural network with a hierarchical organization, and the mapping of its elements and connections is an important step towards the understanding of its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo on a large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while in the second paper the validity of further metrics based on the concept of communicability was evaluated. Communicability is a broader measure of connectivity which also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected when considering local network metrics or when using different thresholding strategies. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to investigate the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) as well as diffusion-weighted imaging data. The multimodal approach revealed a positive relationship between individual fluctuations of the EEG alpha-frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies alternative approaches were presented. The common discussion of the four studies enabled us to highlight the benefits and possibilities of connectome analysis as well as some current limitations.
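Communicability, the connectivity measure evaluated in the second study, has a compact matrix definition: the communicability between nodes i and j is entry (i, j) of exp(A), which sums walks of every length between the two nodes while down-weighting walks of length k by 1/k!, so parallel and indirect connections contribute alongside direct ones. A minimal sketch on a toy graph:

```python
import numpy as np
from scipy.linalg import expm

def communicability(adjacency):
    """Communicability matrix G = exp(A); entry (i, j) counts walks of all
    lengths between i and j, weighting walks of length k by 1/k!."""
    return expm(np.asarray(adjacency, dtype=float))

# Toy network: a chain 0-1-2 plus a direct shortcut 0-2.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
G = communicability(A)
print(G[0, 2])  # exceeds the bare-chain value, reflecting the parallel path
```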
Abstract:
The north-eastern escarpment of Madagascar harbours the island’s last remaining large-scale humid forest massifs, surrounded by a small-scale agricultural mosaic. Deforestation is high and is commonly thought to be caused by shifting cultivation practiced by local land users to produce upland rice. However, little is known about the dynamics between forest and shifting cultivation systems at a regional level. Our study presents a first attempt to quantify changes in the extent of forest and different agricultural land cover classes, and to identify the main dynamics of land cover change for two intervals, 1995–2005 and 2005–2011. Over the 16-year study period, the pace of forest loss increased, the total area of upland rice production remained almost stable, and the area of irrigated rice fields slightly increased. While our findings seem to confirm a general trend of land use intensification, deforestation through shifting cultivation is still on the rise. Deforestation mostly affects the small forest fragments interspersed in the agricultural mosaic and is slowly leading to a homogenization of the landscape. These findings have important implications for future interventions to slow forest loss in the region, as the processes of agricultural expansion through shifting cultivation versus intensified land use cannot per se be considered mutually exclusive.
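Land cover change analyses of the kind described above typically rest on a from-to transition (cross-tabulation) matrix between two co-registered classified maps. A minimal sketch in Python with hypothetical class codes:

```python
import numpy as np

def transition_matrix(map_t0, map_t1, classes):
    """Cross-tabulate two co-registered land cover maps (2-D arrays of
    class codes) into a from-to pixel-count matrix, the standard way to
    quantify dynamics such as forest-to-shifting-cultivation conversion."""
    idx = {c: i for i, c in enumerate(classes)}
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for a, b in zip(map_t0.ravel(), map_t1.ravel()):
        m[idx[a], idx[b]] += 1
    return m

# Hypothetical codes: 0 forest, 1 shifting cultivation, 2 irrigated rice.
t0 = np.array([[0, 0, 1], [0, 1, 2]])
t1 = np.array([[0, 1, 1], [1, 1, 2]])
print(transition_matrix(t0, t1, classes=[0, 1, 2]))
```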
Abstract:
Millennial- to orbital-scale rainfall changes in the Mediterranean region and corresponding variations in vegetation patterns were the result of large-scale atmospheric reorganizations. In spite of recent efforts to reconstruct this variability using a range of proxy archives, the underlying physical mechanisms have remained elusive. Through the analysis of a new high-resolution sedimentary section from Lake Van (Turkey) along with climate modeling experiments, we identify massive droughts in the Eastern Mediterranean for the past four glacial cycles, which have a pervasive link with known intervals of enhanced North Atlantic glacial iceberg calving, weaker Atlantic Meridional Overturning Circulation and Dansgaard-Oeschger cold conditions. On orbital timescales, the topographic effect of large Northern Hemisphere ice sheets and periods with minimum insolation seasonality further exacerbated drought intensities by suppressing both summer and winter precipitation.
Abstract:
Since the tragic events of September 11, 2001, the United States has engaged in building the infrastructure and developing the expertise necessary to protect its borders and its citizens from further attacks against its homeland. One approach has been the development of academic courses to educate individuals on the nature and dangers of subversive attacks and to prepare them to respond to attacks and other large-scale emergencies in their roles as working professionals, participating members of their communities, and collaborators with first responders. An initial review of the literature failed to reveal any university-based emergency management courses or programs with a disaster medical component, despite the public health significance of and need for such programs. In the Fall of 2003, the School of Management at The University of Texas at Dallas introduced a continuing education Certificate in Emergency Management and Preparedness Program. This thesis will (1) describe the development and implementation of a new Disaster Medical Track as a component of this Certificate in Emergency Management and Preparedness Program, (2) analyze the need for and effectiveness of this Disaster Medical Track, and (3) propose improvements to the track based on this analysis.