931 results for change-point detection
Abstract:
To analyse and compare the changes in the degradation parameters of the everolimus-eluting bioresorbable scaffold (ABSORB), as assessed by optical coherence tomography (OCT), echogenicity, and intravascular ultrasound virtual histology (VH), during the first 12 months after ABSORB implantation. In the ABSORB study, changes in the appearance of the ABSORB scaffold were monitored over time using various intracoronary imaging modalities. The scaffold struts exhibited a progressive change in their black core area by OCT, in their ultrasound-derived grey-level intensity quantified by echogenicity, and in their backscattering ultrasound signal, identified as "pseudo dense-calcium" (DC) by VH.
Abstract:
PURPOSE To investigate whether Chlamydia pneumoniae and complement factors were present in surgically removed choroidal neovascular membranes (CNV) of patients with age-related macular degeneration (AMD). METHODS Paraffin sections of 26 CNV were stained for C. pneumoniae or the complement factors H (CFH) and C5, whereas macrophages were identified by positive CD68 staining. Clinical characteristics were correlated with the immunohistochemical findings. RESULTS C. pneumoniae was found in 68% of the investigated membranes, and 88% of these membranes were also positive for CD68. Staining for CFH and C5 gave a positive reaction in 68% and 41% of the membranes, respectively. Patients with C5-positive membranes had a significantly larger mean CNV area and were younger at the time of surgery than patients with CFH-positive membranes. CONCLUSIONS Correlations between clinical symptoms and complement factor C5 could be shown. The results strengthen the hypothesis of an involvement of the complement system in AMD.
Abstract:
Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech, which could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards the investigation of the properties of a given language. The work involved in syntactically parsing a whole corpus in order to get a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi)automatic parser. The need for the automatic natural language parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text which is going to be parsed. Practical experience shows that apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in a whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information, and the development of a formal grammar able to robustly parse Czech sentences from the test suite; 3. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 4. the development of a set of sample sentences containing a reasonable amount of grammatical and ungrammatical phenomena covering some of the most typical syntactic constructions used in Czech. Building the formal grammar (task 2) was the main task of the project. The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language may ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also the structure of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method. This method uses an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localisation and identification of syntactic errors. Without precise knowledge of the nature and location of syntactic errors it is not possible to build a reliable estimation of a "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological issue. Experience from previous projects showed that building a grammar by creating a huge block of metarules is more complicated than the incremental method, which begins with the metarules covering the most common syntactic phenomena and adds less important ones later; this is especially valuable from the point of view of testing and debugging the grammar. The sample of the syntactic dictionary containing lexico-syntactic information (task 3) now has slightly more than 1000 lexical items representing all classes of words.
During the creation of the dictionary it turned out that the task of assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during the process of its development. The problem of the consistency of new and modified rules of the formal grammar with the rules already existing is one of the crucial problems of every project aiming at the development of a large-scale formal grammar of a natural language. This method allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages display a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems in other languages. To transfer the system into any other language it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of primary lexico-syntactic information). The formalism and methods used in this project can be used in other Slavic languages without substantial changes.
Abstract:
Instrumental temperature series are often affected by artificial breaks (“break points”) due to, for example, changes in station location, land use, or instrumentation. The Swiss climate observation network offers a high number and density of stations, many long and relatively complete daily to sub-daily temperature series, and well-documented station histories (i.e., metadata). However, for many climate observation networks outside of Switzerland, detailed station histories are missing, incomplete, or inaccessible. To correct these records, the use of reliable statistical break detection methods is necessary. Here, we apply three statistical break detection methods to high-quality Swiss temperature series and use the available metadata to assess the methods. Due to the complex terrain in Switzerland, we are able to assess these methods under specific local conditions such as Foehn or crest situations. We find that the temperature series of all stations are affected by artificial breaks (on average, one break point per 48 years), with discrepancies in the abilities of the methods to detect breaks. However, by combining the three statistical methods, almost all of the detected break points are confirmed by metadata. In most cases, these break points can be ascribed to a combination of factors in the station history.
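As an illustration of the kind of statistical break detection discussed above (a generic sketch, not the three methods actually compared in the study), the following Python snippet locates a single mean shift in a temperature anomaly series by minimizing the residual sum of squares of a two-segment mean model; real homogenization would typically work on difference series against neighboring reference stations.

    import numpy as np

    def detect_single_break(series):
        """Return the index of the most likely single mean shift (break point),
        chosen by minimizing the residual sum of squares of a two-segment model."""
        x = np.asarray(series, dtype=float)
        best_k, best_rss = None, np.inf
        for k in range(2, len(x) - 1):                     # candidate break positions
            left, right = x[:k], x[k:]
            rss = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
            if rss < best_rss:
                best_k, best_rss = k, rss
        return best_k

    # Synthetic example: 100 years of anomalies with an artificial 0.5 K jump at year 60
    rng = np.random.default_rng(0)
    anomalies = np.concatenate([rng.normal(0.0, 0.3, 60), rng.normal(0.5, 0.3, 40)])
    print("estimated break at index", detect_single_break(anomalies))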
Abstract:
BACKGROUND: Cystic fibrosis (CF) is associated with at least 1 pathogenic point sequence variant on each CFTR allele. Some symptomatic patients, however, have only 1 detectable pathogenic sequence variant and carry, on the other allele, a large deletion that is not detected by conventional screening methods. METHODS: For relative quantitative real-time PCR detection of large deletions in the CFTR gene, we designed DNA-specific primers for each exon of the gene and primers for a reference gene (beta2-microglobulin). For PCR we used a LightCycler system (Roche) and calculated the gene-dosage ratio of CFTR to beta2-microglobulin. We tested the method by screening all 27 exons in 3 healthy individuals and 2 patients with only 1 pathogenic sequence variant. We then performed specific deletion screenings in 10 CF patients with known large deletions and a blinded analysis in which we screened 24 individuals for large deletions by testing 8 of the 27 exons. RESULTS: None of the ratios for control samples were false positive (for deletions or duplications); moreover, for all samples from patients with known large deletions, the calculated ratios for deleted exons were close to 0.5. In addition, the results from the blinded analysis demonstrated that our method can also be used for the screening of single individuals. CONCLUSIONS: The LightCycler assay allows reliable and rapid screening for large deletions in the CFTR gene and detects the copy number of all 27 exons.
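The abstract does not give the exact dosage calculation, but a common way to obtain such a gene-dosage ratio from real-time PCR crossing-point (Ct) values is the delta-delta-Ct method under the simplifying assumption of 100% amplification efficiency; a hedged sketch (the function name and example Ct values are hypothetical):

    def gene_dosage_ratio(ct_exon_sample, ct_ref_sample, ct_exon_control, ct_ref_control):
        """Relative dosage of a CFTR exon versus the beta2-microglobulin reference,
        normalized to a two-copy control sample (assumes ideal PCR efficiency)."""
        delta_sample = ct_exon_sample - ct_ref_sample
        delta_control = ct_exon_control - ct_ref_control
        return 2.0 ** -(delta_sample - delta_control)

    # A heterozygous exon deletion delays the target Ct by about one cycle,
    # which yields a ratio near 0.5; two intact copies give a ratio near 1.0.
    print(gene_dosage_ratio(27.0, 22.0, 26.0, 22.0))   # ~0.5, consistent with a deleted exon
    print(gene_dosage_ratio(26.0, 22.0, 26.0, 22.0))   # ~1.0, normal copy number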
Abstract:
The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific gastrointestinal infections in the UK, using the Southampton area in southern England as a test-case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response, spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food-poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
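The paper itself fits a spatio-temporal stochastic model and reports predictive exceedance probabilities; purely as a simplified illustration of the surveillance idea (not the authors' model), the sketch below compares a kernel-smoothed map of recent case locations against a smoothed historical baseline and flags grid cells with a large excess ratio. All names and parameters are illustrative.

    import numpy as np

    def kernel_intensity(points, grid_x, grid_y, bandwidth):
        """Gaussian kernel estimate of spatial case intensity at grid nodes."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        dens = np.zeros_like(gx, dtype=float)
        for x, y in points:
            dens += np.exp(-((gx - x) ** 2 + (gy - y) ** 2) / (2 * bandwidth ** 2))
        return dens / (2 * np.pi * bandwidth ** 2)

    def excess_ratio(recent_cases, baseline_cases, grid_x, grid_y, bandwidth, eps=1e-9):
        """Ratio of recent to baseline smoothed incidence; values well above 1
        point to localized excess worth investigating."""
        recent = kernel_intensity(recent_cases, grid_x, grid_y, bandwidth)
        baseline = kernel_intensity(baseline_cases, grid_x, grid_y, bandwidth)
        return recent / (baseline + eps)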
Abstract:
With recent advances in mass spectrometry techniques, it is now possible to investigate proteins over a wide range of molecular weights in small biological specimens. This advance has generated data-analytic challenges in proteomics, similar to those created by microarray technologies in genetics, namely, discovery of "signature" protein profiles specific to each pathologic state (e.g., normal vs. cancer) or differential profiles between experimental conditions (e.g., treated by a drug of interest vs. untreated) from high-dimensional data. We propose a data analytic strategy for discovering protein biomarkers based on such high-dimensional mass-spectrometry data. A real biomarker-discovery project on prostate cancer is taken as a concrete example throughout the paper: the project aims to identify proteins in serum that distinguish cancer, benign hyperplasia, and normal states of prostate using the Surface Enhanced Laser Desorption/Ionization (SELDI) technology, a recently developed mass spectrometry technique. Our data analytic strategy takes properties of the SELDI mass spectrometer into account: the SELDI output of a specimen contains about 48,000 (x, y) points, where x is the protein mass divided by the number of charges introduced by ionization and y is the protein intensity of the corresponding mass per charge value, x, in that specimen. Given high coefficients of variation and other characteristics of protein intensity measures (y values), we reduce the measures of protein intensities to a set of binary variables that indicate peaks in the y-axis direction in the nearest neighborhoods of each mass per charge point in the x-axis direction. We then account for a shifting (measurement error) problem of the x-axis in SELDI output. After this pre-analysis processing of the data, we combine the binary predictors to generate classification rules for cancer, benign hyperplasia, and normal states of prostate. Our approach is to apply the boosting algorithm to select binary predictors and construct a summary classifier. We empirically evaluate the sensitivity and specificity of the resulting summary classifiers with a test dataset that is independent of the training dataset used to construct the summary classifiers. The proposed method performed nearly perfectly in distinguishing cancer and benign hyperplasia from normal. In the classification of cancer vs. benign hyperplasia, however, an appreciable proportion of the benign specimens were classified incorrectly as cancer. We discuss practical issues associated with our proposed approach to the analysis of SELDI output and its application in cancer biomarker discovery.
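As a rough sketch of the peak-binarization and boosting steps described above (a simplified stand-in that omits the x-axis alignment step and the authors' specific boosting variant; variable names are hypothetical):

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    def binary_peak_features(intensities, half_window=5):
        """Turn a spectrum of intensities (ordered along mass/charge) into binary
        indicators: 1 where a point is the maximum of its local neighborhood."""
        y = np.asarray(intensities, dtype=float)
        feats = np.zeros(len(y), dtype=int)
        for i in range(len(y)):
            lo, hi = max(0, i - half_window), min(len(y), i + half_window + 1)
            feats[i] = int(y[i] == y[lo:hi].max())
        return feats

    # spectra: (n_specimens, n_points) intensity matrix; labels: disease states
    # X = np.vstack([binary_peak_features(s) for s in spectra])
    # clf = AdaBoostClassifier(n_estimators=200).fit(X, labels)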
Abstract:
Edges are important cues defining coherent auditory objects. As a model of auditory edges, sound onset and offset are particularly suitable for studying their neural underpinnings because they contrast a specific physical input against no physical input. The change from silence to sound, that is, onset, has been studied extensively and elicits transient neural responses bilaterally in auditory cortex. However, neural activity associated with sound onset is related not only to edge detection but also to novel afferent input. Edges at the change from sound to silence, that is, offset, are not confounded by novel physical input and thus allow us to examine neural activity associated with sound edges per se. In the first experiment, we used silent-acquisition functional magnetic resonance imaging and found that the offset of pulsed sound activates the planum temporale, superior temporal sulcus and planum polare of the right hemisphere. In the planum temporale and the superior temporal sulcus, offset response amplitudes were related to the pulse repetition rate of the preceding stimulation. In the second experiment, we found that these offset-responsive regions were also activated by single sound pulses, by the onset of sound pulse sequences and by single sound pulse omissions within sound pulse sequences. However, they were not active during sustained sound presentation. Thus, our data show that circumscribed areas in right temporal cortex are specifically involved in identifying auditory edges. This operation is crucial for translating acoustic signal time series into coherent auditory objects.
Abstract:
The purpose of the study was to evaluate observer performance in the detection of pneumothorax with cesium iodide and amorphous silicon flat-panel detector radiography (CsI/a-Si FDR) presented as 1K and 3K soft-copy images. Forty patients with and 40 patients without pneumothorax diagnosed on previous and subsequent digital storage phosphor radiography (SPR, gold standard) had follow-up chest radiographs with CsI/a-Si FDR. Four observers confirmed or excluded the diagnosis of pneumothorax according to a five-point scale, first on the 1K soft-copy image and then with the help of a 3K zoom function (on the 1K monitor). Receiver operating characteristic (ROC) analysis was performed for each modality (1K and 3K). The area under the curve (AUC) values for the four observers were 0.7815, 0.7779, 0.7946 and 0.7066 with 1K-matrix soft copies and 0.8123, 0.7997, 0.8078 and 0.7522 with 3K zoom. Overall detection of pneumothorax was better with 3K zoom. Differences between the two display methods were not statistically significant for 3 of 4 observers (p-values between 0.13 and 0.44; observer 4: p = 0.02). The detection of pneumothorax with 3K zoom was better than with the 1K soft copy, but not at a statistically significant level, and differences between the two display methods may be subtle. Still, our results indicate that 3K zoom should be employed in clinical practice.
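For reference, the empirical AUC for one observer can be computed directly from the five-point confidence ratings and the SPR gold standard; the ratings below are made up for illustration, and a full ROC study would use a fitted (e.g., binormal) model and a multi-reader significance test rather than this bare calculation.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    truth = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])        # 1 = pneumothorax on SPR
    ratings = np.array([5, 4, 3, 2, 1, 5, 3, 1, 2, 2])      # one observer's 5-point scores
    print("empirical AUC:", roc_auc_score(truth, ratings))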
Abstract:
The objective of this study is to gain a quantitative understanding of the land use and land cover change (LULCC) that has occurred in a rural Nicaraguan municipality by analyzing Landsat 5 Thematic Mapper (TM) images. By comparing the potential extent of tropical dry forest (TDF) with Landsat 5 TM images, this study analyzes the loss of this forest type at the local level for the municipality of San Juan de Cinco Pinos (63.5 km2) in the Department of Chinandega. Change detection analysis shows where and how land use has changed from 1985 to the present. From 1985 to 2011, nearly 15% of the TDF in San Juan de Cinco Pinos was converted to other land uses. Of the 1434.2 ha of TDF present in 1985, 1223.64 ha remained in 2011. The deforestation is primarily a result of agricultural expansion and fuelwood extraction. If current rates of TDF deforestation continue, the municipality faces the prospect of losing its forest cover within the next few decades.
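The quoted figures are internally consistent; a quick check of the loss and the implied (simple, non-compounded) annual rate:

    tdf_1985 = 1434.2      # ha of tropical dry forest mapped in 1985
    tdf_2011 = 1223.64     # ha remaining in 2011

    loss_ha = tdf_1985 - tdf_2011
    loss_pct = 100 * loss_ha / tdf_1985
    annual_pct = loss_pct / (2011 - 1985)
    print(f"loss: {loss_ha:.2f} ha ({loss_pct:.1f}%), about {annual_pct:.2f}% per year")
    # loss: 210.56 ha (14.7%), about 0.56% per year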
Abstract:
Routine bridge inspections require labor-intensive and highly subjective visual interpretation to determine bridge deck surface condition. Light Detection and Ranging (LiDAR), a relatively new class of survey instrument, has become a popular and increasingly used technology for providing as-built and inventory data in civil applications. While an increasing number of private and governmental agencies possess terrestrial and mobile LiDAR systems, an understanding of the technology’s capabilities and potential applications continues to evolve. LiDAR is a line-of-sight instrument and, as such, care must be taken when establishing scan locations and resolution to allow the capture of data at an adequate resolution for defining features that contribute to the analysis of bridge deck surface condition. Information such as the location, area, and volume of spalling on deck surfaces, undersides, and support columns can be derived from properly collected LiDAR point clouds. The LiDAR point clouds contain information that can provide quantitative surface condition information, resulting in more accurate structural health monitoring. LiDAR scans were collected at three study bridges, each of which displayed a varying degree of degradation. A variety of commercially available analysis tools and an independently developed algorithm written in ArcGIS Python (ArcPy) were used to locate and quantify surface defects such as the location, volume, and area of spalls. The results were displayed visually and numerically in a user-friendly web-based decision support tool integrating prior bridge condition metrics for comparison. LiDAR data processing procedures, along with strengths and limitations of point clouds for defining features useful for assessing bridge deck condition, are discussed. Point cloud density and incidence angle are two attributes that must be managed carefully to ensure the data collected are of high quality and useful for bridge condition evaluation. When collected properly to ensure effective evaluation of bridge surface condition, LiDAR data can be analyzed to provide a useful data set from which to derive bridge deck condition information.
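The abstract does not spell out the spall-detection algorithm, so the following is only a generic geometric sketch of how spalls might be flagged in a deck point cloud: fit a reference plane to the deck by least squares and flag points lying more than a chosen depth below it (the threshold and units are hypothetical); area and volume then follow from gridding the flagged points.

    import numpy as np

    def fit_plane(xyz):
        """Least-squares plane z = a*x + b*y + c through an (N, 3) point cloud."""
        A = np.c_[xyz[:, 0], xyz[:, 1], np.ones(len(xyz))]
        coeffs, *_ = np.linalg.lstsq(A, xyz[:, 2], rcond=None)
        return coeffs                          # a, b, c

    def spall_points(xyz, depth_threshold=0.01):
        """Return points more than depth_threshold (m) below the fitted deck plane,
        together with their depths; clusters of such points approximate spalls."""
        a, b, c = fit_plane(xyz)
        depth = (a * xyz[:, 0] + b * xyz[:, 1] + c) - xyz[:, 2]   # positive = below plane
        mask = depth > depth_threshold
        return xyz[mask], depth[mask]

    # Spall area ~ number of occupied grid cells * cell area;
    # spall volume ~ sum over cells of (cell area * mean depth in the cell).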
Abstract:
A post-classification change detection technique based on a hybrid classification approach (unsupervised and supervised) was applied to Landsat Thematic Mapper (TM), Landsat Enhanced Thematic Mapper Plus (ETM+), and ASTER images acquired in 1987, 2000 and 2004, respectively, to map land use/cover changes in the Pic Macaya National Park in the southern region of Haiti. Each image was classified individually into six land use/cover classes: built-up, agriculture, herbaceous, open pine forest, mixed forest, and barren land, using the unsupervised ISODATA and maximum likelihood supervised classifiers with the aid of ground truth data collected in the field. Ground truth information, collected in the field in December 2007 and including equalized stratified random points that were visually interpreted, was used to assess the accuracy of the classification results. The overall accuracy of the land classification for each image was, respectively: 1987 (82%), 2000 (82%), 2004 (87%). A post-classification change detection technique was used to produce change images for 1987 to 2000, 1987 to 2004, and 2000 to 2004. It was found that significant changes in land use/cover occurred over the 17-year period. The results showed increases in built-up (from 10% to 17%) and herbaceous (from 5% to 14%) areas between 1987 and 2004. The increase in herbaceous cover was mostly caused by the abandonment of exhausted agricultural lands. At the same time, open pine forest and mixed forest areas lost 75% and 83% of their area to other land use/cover types. Open pine forest (from 20% to 14%) and mixed forest (from 18% to 12%) were transformed into agricultural area or barren land. This study illustrates the continuing deforestation, land degradation and soil erosion in the region, which in turn is leading to a decrease in vegetative cover. The study also shows the importance of Remote Sensing (RS) and Geographic Information System (GIS) technologies for estimating timely changes in land use/cover and for evaluating their causes in order to design an ecologically based management plan for the park.
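The core of a post-classification comparison is a from-to cross-tabulation of the two classified maps; a minimal sketch with hypothetical class codes:

    import numpy as np

    def change_matrix(map_t1, map_t2, n_classes):
        """Cross-tabulate two co-registered classified maps (integer codes
        0..n_classes-1) into a from-to matrix of pixel counts."""
        m = np.zeros((n_classes, n_classes), dtype=int)
        np.add.at(m, (map_t1.ravel(), map_t2.ravel()), 1)
        return m        # m[i, j] = pixels of class i at time 1 and class j at time 2

    # Tiny example with 3 classes (0 = forest, 1 = agriculture, 2 = built-up)
    t1 = np.array([[0, 0, 1], [0, 2, 1]])
    t2 = np.array([[1, 0, 1], [2, 2, 1]])
    print(change_matrix(t1, t2, 3))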
Abstract:
The acoustic emission (AE) technique, a non-intrusive and nondestructive evaluation technique, acquires and analyzes the signals emitted by deformation or fracture of materials/structures under service loading. The AE technique has been successfully applied to damage detection in various materials such as metals, alloys, concrete, polymers and other composite materials. In this study, the AE technique was used for detecting crack behavior within concrete specimens under mechanical and environmental frost loadings. The instrumentation of the AE system used in this study includes a low-frequency AE sensor, a computer-based data acquisition device and a preamplifier linking the AE sensor and the data acquisition device. The AE system purchased from Mistras Group was used in this study. The AE technique was applied to detect damage in the following laboratory tests: the pencil lead test, the mechanical three-point single-edge notched beam bending (SEB) test, and the freeze-thaw damage test. Firstly, the pencil lead test was conducted to verify the attenuation of AE signals through concrete materials, and the value of attenuation was quantified. The obtained signals also indicated that the AE system was properly set up to detect damage in concrete. Secondly, the SEB test with a lab-prepared concrete beam was conducted by employing the Mechanical Testing System (MTS) and the AE system. The cumulative AE events and the measured loading curves, both using the crack-tip opening displacement (CTOD) as the horizontal coordinate, were plotted. It was found that the detected AE events were qualitatively correlated with the global force-displacement behavior of the specimen. The Weibull distribution was proposed to quantitatively describe the rupture probability density function. A linear regression analysis was conducted to calibrate the Weibull distribution parameters with the detected AE signals and to predict the rupture probability as a function of CTOD for the specimen. Finally, controlled concrete freeze-thaw cyclic tests were designed, and the AE technique was planned to be used to investigate the internal frost damage process of concrete specimens.
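The thesis abstract does not give the exact regression, but a standard way to calibrate a two-parameter Weibull rupture-probability curve by linear regression is to linearize the CDF, ln(-ln(1 - F)) = k ln(x) - k ln(lambda), and fit a straight line; a hedged sketch in which the normalization of cumulative AE event counts to a probability F is an assumption:

    import numpy as np

    def fit_weibull_by_regression(ctod, cumulative_events):
        """Estimate Weibull shape k and scale lam from CTOD values (x > 0) and the
        cumulative AE event count, using the linearized CDF and least squares.
        Assumes counts > 0 at every CTOD value used in the fit."""
        x = np.asarray(ctod, dtype=float)
        counts = np.asarray(cumulative_events, dtype=float)
        F = counts / (counts[-1] + 1.0)                       # keep 0 < F < 1
        k, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
        lam = np.exp(-intercept / k)
        return k, lam

    def rupture_probability(x, k, lam):
        """Weibull rupture probability as a function of CTOD."""
        return 1.0 - np.exp(-(np.asarray(x, dtype=float) / lam) ** k)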
Abstract:
One hundred years ago, in 1914, male voters in Montana (MT) extended suffrage (voting rights) to women six years before the 19th Amendment to the US Constitution was ratified and provided that right to women in all states. The long struggle for women's suffrage was energized in the progressive era and Jeannette Rankin of Missoula emerged as a leader of the campaign; in 1912 both major MT political party platforms supported woman suffrage. In the 1914 election, 41,000 male voters supported woman suffrage while nearly 38,000 opposed it. MT was not only ahead of the curve on woman suffrage, but just two years later, in 1916, elected Jeannette Rankin as the first woman ever elected to the United States Congress. Rankin became a national leader for women's equality. In her commitment to equality, she opposed US entry into World War I, partially because she said she could not support men being made to go to war if women were not allowed to serve alongside them. During MT’s initial progressive era, women in MT not only pursued equality for themselves (the MT Legislature passed an equal pay act in 1919), but pursued other social improvements, such as temperance/prohibition. Well-known national women leaders such as Carrie Nation and others found a welcome in MT during the period. Women's role in the trade union movement was evidenced in MT by the creation of the Women's Protective Union in Butte, the first union in America dedicated solely to women workers. But Rankin’s defeat following her vote against World War I was used by opponents to advocate a conservative, traditionalist perspective on women's rights in MT. Just as we then entered a period in MT where the “copper collar” was tightened around MT economically and politically by the Anaconda Company and its allies, we also found a different kind of conservative, traditionalist collar tightened around the necks of MT women. The recognition of women's role during World War II, represented by “Rosie the Riveter,” made it more difficult for that conservative, traditionalist approach to be forever maintained. In addition, women's role in MT agriculture – family farms and ranches -- spoke strongly to the concept of equality, as farm wives were clearly active partners in the agricultural enterprises. But rural MT was, by and large, the bastion of conservative values relative to the position of women in society. As the period of “In the Crucible of Change” began, the 1965 MT Legislature included only three women. In 1967 and 1969 only one woman legislator served. In 1971 the number went up to two, including one of our guests, Dorothy Bradley. It was only after the Constitutional Convention, which featured 19 women delegates, that the barrier was broken. The 1973 Legislature saw 9 women elected. The 1975 and 1977 sessions had 14 women legislators; 15 were elected for the 1979 session. At that time progressive women and men in the Legislature helped implement the equality provisions of the new MT Constitution, ratified the federal Equal Rights Amendment in 1974, and held back national and local conservative forces which sought in later Legislatures to repeal that ratification. As with the national movement at the time, MT women sought and often succeeded in adopting legal mechanisms that protected women’s equality, while full equality in the external world remained (and remains) a treasured objective.
The story of the re-emergence of Montana’s women’s movement in the 1970s is discussed in this chapter by three very successful and prominent women who were directly involved in the effort: Dorothy Bradley, Marilyn Wessel, and Jane Jelinski. Their recollections of the political, sociological and cultural path Montana women pursued in the 1970s and the challenges and opposition they faced provide an insider’s perspective of the battle for equality for women under the Big Sky “In the Crucible of Change.” Dorothy Bradley grew up in Bozeman, Montana; received her Bachelor of Arts Phi Beta Kappa from Colorado College, Colorado Springs, in 1969 with a Distinction in Anthropology; and her Juris Doctor from American University in Washington, D.C., in 1983. In 1970, at the age of 22, following the first Earth Day and running on an environmental platform, Ms. Bradley won a seat in the 1971 Montana House of Representatives where she served as the youngest member and only woman. Bradley established a record of achievement on environmental & progressive legislation for four terms, before giving up the seat to run a strong second to Pat Williams for the Democratic nomination for an open seat in Montana’s Western Congressional District. After becoming an attorney and an expert on water law, she returned to the Legislature for 4 more terms in the mid-to-late 1980s. Serving a total of eight terms, Dorothy was known for her leadership on natural resources, tax reform, economic development, and other difficult issues during which time she gained recognition for her consensus-building approach. Campaigning by riding her horse across the state, Dorothy was the Democratic nominee for Governor in 1992, losing the race by less than a percentage point. In 1993 she briefly taught at a small rural school next to the Northern Cheyenne Indian Reservation. She was then hired as the Director of the Montana University System Water Center, an education and research arm of Montana State University. From 2000 - 2008 she served as the first Gallatin County Court Administrator with the task of collaboratively redesigning the criminal justice system. She currently serves on One Montana’s Board, is a National Advisor for the American Prairie Foundation, and is on NorthWestern Energy’s Board of Directors. Dorothy was recognized with an Honorary Doctorate from her alma mater, Colorado College, was named Business Woman of the Year by the Bozeman Chamber of Commerce and MSU Alumni Association, and was Montana Business and Professional Women’s Montana Woman of Achievement. Marilyn Wessel was born in Iowa, lived and worked in Los Angeles, California, and Washington, D.C. before moving to Bozeman in 1972. She has an undergraduate degree in journalism from Iowa State University, graduate degree in public administration from Montana State University, certification from the Harvard University Institute for Education Management, and served a senior internship with the U.S. Congress, Montana delegation. In Montana Marilyn has served in a number of professional positions, including part-time editor for the Montana Cooperative Extension Service, News Director for KBMN Radio, Special Assistant to the President and Director of Communications at Montana State University, Director of University Relations at Montana State University and Dean and Director of the Museum of the Rockies at MSU. Marilyn retired from MSU as Dean Emeritus in 2003. 
Her past board service includes the Montana State Merit System Council, Montana Ambassadors, Vigilante Theater Company, Montana State Commission on Practice, Museum of the Rockies, Helena Branch of the Ninth District Federal Reserve Bank, Burton K. Wheeler Center for Public Policy, Bozeman Chamber of Commerce, and Friends of KUSM Public Television. Marilyn’s past publications and productions include several articles on communications and public administration issues as well as research, script preparation and presentation of several radio documentaries and several public television programs. She is co-author of one book, 4-H An American Idea: A History of 4-H. Marilyn’s other past volunteer activities and organizations include Business and Professional Women, Women's Political Caucus, League of Women Voters, and numerous political campaigns. She is currently engaged professionally in museum-related consulting and part-time teaching at Montana State University, as well as serving on the Editorial Board of the Bozeman Daily Chronicle and being a member of Pilgrim Congregational Church and Family Promise. Marilyn and her husband Tom, a retired MSU professor, live in Bozeman. She enjoys time with her children and grandchildren, hiking, golf, Italian studies, cooking, gardening and travel. Jane Jelinski is a Wisconsin native with a BA from Fontbonne College in St. Louis, MO, who taught fifth and seventh grades prior to moving to Bozeman in 1973. A stay-at-home mom with a five-year-old daughter and an infant son, she was promptly recruited by the Gallatin Women’s Political Caucus to conduct a study of Sex-Role Stereotyping in K Through 6 Reading Text Books in the Bozeman School District. Sociologist Dr. Louise Hale designed the study and did the statistical analysis, and Jane read all the texts, entered the data and wrote the report. It was widely disseminated across Montana and received the attention of the press. Her next venture into community activism was to lead the successful effort to downzone her neighborhood, which was under threat of encroaching business development. Today the neighborhood enjoys the protections of a Historic Preservation District. During this time she earned her MPA from Montana State University. Subsequently Jane founded the Gallatin Advocacy Program for Developmentally Disabled Adults in 1978 and served as its Executive Director until her appointment to the Gallatin County Commission in 1984, a controversial appointment which she chronicled in the Fall issue of the Gallatin History Museum Quarterly. Copies of the issue can be ordered through: http://gallatinhistorymuseum.org/the-museum-bookstore/shop/. Jane was re-elected three times as County Commissioner, serving fourteen years. She was active in the Montana Association of Counties (MACO) and was elected its President in 1994. She was also active in the National Association of Counties, serving on numerous policy committees. In 1998 Jane resigned from the County Commission 6 months before the end of her final term to accept the position of Assistant Director of MACO, from where she lobbied for counties, provided training and research for county officials, and published a monthly newsletter. In 2001 she became Director of the MSU Local Government Center, where she continued to provide training and research for county and municipal officials across MT. There she initiated the Montana Mayors Academy in partnership with MMIA.
She taught State and Local Government, Montana Politics and Public Administration in the MSU Political Science Department before retiring in 2008. Jane has been married to Jack for 46 years, has two grown children and three grandchildren.
Abstract:
Colonization with more than one distinct strain of the same species, also termed cocolonization, is a prerequisite for horizontal gene transfer between pneumococcal strains that may lead to change of the capsular serotype. Capsule switch has become an important issue since the introduction of conjugated pneumococcal polysaccharide vaccines. There is, however, a lack of techniques to detect multiple colonization by S. pneumoniae strains directly in nasopharyngeal samples. Two hundred eighty-seven nasopharyngeal swabs collected during the prevaccine era within a nationwide surveillance program were analyzed by a novel technique for the detection of cocolonization, based on PCR amplification of a noncoding region adjacent to the pneumolysin gene (plyNCR) and restriction fragment length polymorphism (RFLP) analysis. The numbers of strains and their relative abundance in cocolonized samples were determined by terminal RFLP. The pneumococcal carriage rate found by PCR was 51.6%, compared to 40.0% found by culture. Cocolonization was present in 9.5% (10/105) of samples, most (9/10) of which contained two strains in a ratio of between 1:1 and 17:1. Five of the 10 cocolonized samples showed combinations of vaccine types only (n = 2) or combinations of nonvaccine types only (n = 3). Carriers of multiple pneumococcal strains had received recent antibiotic treatment more often than those colonized with a single strain (33% versus 9%, P = 0.025). This new technique allows for the rapid and economical study of pneumococcal cocolonization in nasopharyngeal swabs. It will be valuable for the surveillance of S. pneumoniae epidemiology under vaccine selection pressure.
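The antibiotic-exposure comparison (33% versus 9%, P = 0.025) is the kind of 2x2 association typically tested with Fisher's exact test; the counts below are illustrative reconstructions consistent with the reported proportions, not the paper's raw data:

    from scipy.stats import fisher_exact

    # rows: carriers of multiple strains, carriers of a single strain
    # cols: recent antibiotic treatment (yes, no) -- counts are illustrative only
    table = [[3, 6],      # cocolonized: 3/9 treated (~33%)
             [9, 86]]     # single strain: 9/95 treated (~9%)
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")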