887 results for path sampling
Abstract:
Background: The goal of this study was to determine whether site-specific differences in the subgingival microbiota could be detected by the checkerboard method in subjects with periodontitis. Methods: Subjects with at least six periodontal pockets with a probing depth (PD) between 5 and 7 mm were enrolled in the study. Subgingival plaque samples were collected with sterile curets by a single-stroke procedure at six selected periodontal sites from 161 subjects (966 subgingival sites). Subgingival bacterial samples were assayed with the checkerboard DNA-DNA hybridization method identifying 37 species. Results: Probing depths of 5, 6, and 7 mm were found at 50% (n = 483), 34% (n = 328), and 16% (n = 155) of sites, respectively. Statistical analysis failed to demonstrate differences in the sum of bacterial counts by tooth type (P = 0.18) or specific location of the sample (P = 0.78). With the exceptions of Campylobacter gracilis (P < 0.001) and Actinomyces naeslundii (P < 0.001), analysis by general linear model multivariate regression failed to identify subject or sample location factors as explanatory of the microbiologic results. A trend of difference in bacterial load by tooth type was found for Prevotella nigrescens (P < 0.01). At a cutoff level of ≥1.0 × 10^5, Porphyromonas gingivalis and Tannerella forsythia (previously T. forsythensis) were present at 48.0% to 56.3% and 46.0% to 51.2% of sampled sites, respectively. Conclusions: Given the similarities in the clinical evidence of periodontitis, the presence and levels of 37 species commonly studied in periodontitis are similar, with no differences between molar, premolar, and incisor/cuspid subgingival sites. This may facilitate microbiologic sampling strategies in subjects during periodontal therapy.
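As a rough illustration of the kind of site-level analysis described in this abstract, the sketch below computes prevalence at the ≥1.0 × 10^5 detection cutoff and fits a linear model of log counts against tooth type and sample location. The synthetic data, column names, and the use of statsmodels are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format checkerboard data: one row per sampled site,
# counts for a single species. All names and values are illustrative.
rng = np.random.default_rng(0)
n_sites = 966
df = pd.DataFrame({
    "tooth_type": rng.choice(["molar", "premolar", "incisor_cuspid"], n_sites),
    "site": rng.choice(["mesial", "buccal", "distal"], n_sites),
    "count": rng.lognormal(mean=11.0, sigma=1.5, size=n_sites),
})

# Prevalence at the >= 1.0e5 detection cutoff used in the abstract.
cutoff = 1.0e5
print("prevalence at cutoff:", (df["count"] >= cutoff).mean())

# Linear model of log-transformed counts against tooth type and sample
# location, roughly analogous to the general linear model analysis above.
fit = smf.ols("np.log10(count + 1) ~ C(tooth_type) + C(site)", data=df).fit()
print(fit.summary())
```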
Abstract:
Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function respecting the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or higher-order polynomial interpolations, which do not fulfill all the features mentioned above, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter controls the behavior of the interpolation function. Second, the interpolated data have to be re-distributed onto the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
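The abstract does not spell out the parameterized Hermitian form, so the sketch below only illustrates the general idea of integral-conserving re-sampling with a shape-preserving Hermite (PCHIP) interpolant, which avoids the overshoot and negative values of high-order polynomials. The function name and the choice of SciPy's PchipInterpolator are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def resample_conservative(x_edges, values, new_edges):
    """Re-sample bin-averaged data onto a new grid while conserving the integral.

    x_edges   : edges of the original bins (length n + 1)
    values    : bin-averaged data (length n), assumed non-negative
    new_edges : edges of the target bins (same overall range)
    """
    widths = np.diff(x_edges)
    # Cumulative integral at the original bin edges.
    cum = np.concatenate(([0.0], np.cumsum(values * widths)))
    # Shape-preserving Hermite interpolant of the cumulative integral:
    # a monotone input gives a monotone interpolant, so re-binned values
    # stay non-negative and the total integral is conserved exactly.
    F = PchipInterpolator(x_edges, cum)
    return np.diff(F(new_edges)) / np.diff(new_edges)

# Example: re-grid a coarse, positive profile onto a finer grid.
old_edges = np.linspace(0.0, 10.0, 11)
data = np.array([0, 1, 5, 9, 10, 9, 5, 1, 0, 0], dtype=float)
new_edges = np.linspace(0.0, 10.0, 41)
fine = resample_conservative(old_edges, data, new_edges)
assert np.isclose(fine @ np.diff(new_edges), data @ np.diff(old_edges))
```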
Abstract:
Proteins are linear chain molecules made of amino acids. Only when they fold into their native states do they become functional. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable study of the protein folding problem in silico. We have developed an enhanced solvation model based on the solution to the Poisson-Boltzmann equation in order to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion, and cavitation. All the terms were implemented, analyzed, and parametrized individually to obtain a high level of accuracy. In order to describe the thermodynamics of proteins, their conformational space needs to be sampled thoroughly. Simulations of proteins are hampered by slow relaxation due to their rugged free-energy landscape, with the barriers between minima being higher than the thermal energy at physiological temperatures. To overcome this problem a number of approaches have been proposed, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is its easily tunable, high acceptance rate for replica exchange. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its actual implementation, and present its application to the Trp-cage mini-protein in implicit solvent. We have been able to correctly predict the folding thermodynamics of this protein using our approach.
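For context, the sketch below shows the standard canonical (temperature) replica-exchange acceptance test that the dissertation's MREMD variant builds on; the microcanonical criterion itself is not reproduced here. The unit convention, energies, and temperatures are illustrative assumptions.

```python
import numpy as np

k_B = 0.0019872  # Boltzmann constant in kcal/(mol*K), MD-style units

def swap_accepted(E_i, T_i, E_j, T_j, rng):
    """Metropolis test for exchanging configurations between two canonical
    replicas at temperatures T_i and T_j (standard temperature REM)."""
    beta_i = 1.0 / (k_B * T_i)
    beta_j = 1.0 / (k_B * T_j)
    log_ratio = (beta_i - beta_j) * (E_i - E_j)  # log of the acceptance ratio
    return rng.random() < np.exp(min(0.0, log_ratio))

# Illustrative values only: two neighbouring replicas in a temperature ladder.
rng = np.random.default_rng(0)
print(swap_accepted(E_i=-320.0, T_i=300.0, E_j=-305.0, T_j=330.0, rng=rng))
```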
Abstract:
Range estimation is the core of many positioning systems such as radar and Wireless Local Positioning Systems (WLPS). Range is estimated by estimating the Time-of-Arrival (TOA), which represents the signal propagation delay between a transmitter and a receiver. Thus, error in TOA estimation degrades range estimation performance. In wireless environments, noise, multipath, and limited bandwidth reduce TOA estimation performance. TOA estimation algorithms designed for wireless environments aim to improve performance by mitigating the effect of closely spaced paths in practical (positive) signal-to-noise ratio (SNR) regions. Limited bandwidth prevents the discrimination of closely spaced paths, which reduces TOA estimation performance. TOA estimation methods are evaluated as a function of SNR, bandwidth, and the number of reflections in multipath wireless environments, as well as their complexity. In this research, a TOA estimation technique based on Blind signal Separation (BSS) is proposed. This frequency-domain method estimates TOA in wireless multipath environments for a given signal bandwidth. The structure of the proposed technique is presented and its complexity and performance are theoretically evaluated. It is shown that the proposed method is not sensitive to SNR, number of reflections, or bandwidth. In general, as bandwidth increases, TOA estimation performance improves. However, spectrum is the most valuable resource in wireless systems, and a large portion of spectrum to support high-performance TOA estimation is usually not available. In addition, the radio frequency (RF) components of wideband systems suffer from high cost and complexity. Thus, a novel multiband positioning structure is proposed. The proposed technique uses the available (non-contiguous) bands to support high-performance TOA estimation. This system incorporates the capabilities of cognitive radio (CR) systems to sense the available spectrum (also called white spaces) and to exploit those white spaces for high-performance localization. First, contiguous bands divided into several unequal, narrow sub-bands with the same SNR are concatenated to attain an accuracy corresponding to the equivalent full band. Two radio architectures are proposed and investigated: the signal is transmitted over the available spectrum either simultaneously (parallel concatenation) or sequentially (serial concatenation). Low-complexity radio designs that handle the concatenation process sequentially and in parallel are introduced. Different TOA estimation algorithms applicable to multiband scenarios are studied and their performance is theoretically evaluated and compared to simulations. Next, the results are extended to non-contiguous, unequal sub-bands with the same SNR, which are more realistic assumptions in practical systems. The performance and complexity of the proposed technique are investigated as well. This study's results show that positioning accuracy can be adapted by selecting the bandwidth, center frequency, and SNR level of each sub-band.
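The BSS-based estimator is not described in enough detail in the abstract to reproduce; as a conventional baseline, the sketch below estimates TOA by matched filtering (cross-correlation) of a known transmitted waveform against a simulated two-path received signal and picks the first strong correlation peak. The waveform, sample rate, delays, and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100e6                      # sample rate (Hz), illustrative
t = np.arange(0, 2e-6, 1 / fs)  # 2 microseconds of signal (200 samples)

# Known transmitted pulse; a simple chirp stands in for the real waveform.
tx = np.sin(2 * np.pi * (1e6 * t + 5e12 * t ** 2))

# Received signal: direct path plus one later reflection, plus noise.
rx = np.zeros_like(t)
for delay, gain in [(0.40e-6, 1.0), (0.80e-6, 0.6)]:
    n = int(round(delay * fs))
    rx[n:] += gain * tx[: len(t) - n]
rx += 0.05 * np.random.default_rng(1).standard_normal(len(t))

# Matched filter via cross-correlation with the known pulse; the first
# peak above half the maximum is taken as the direct-path TOA.
corr = np.abs(np.correlate(rx, tx, mode="full"))[len(t) - 1:]
peaks, _ = find_peaks(corr, height=0.5 * corr.max())
print(f"estimated TOA: {peaks[0] / fs * 1e6:.2f} us (true direct path: 0.40 us)")
```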
Abstract:
Measuring shallow seismic sources provides a way to reveal processes that cannot be directly observed, but the correct interpretation and value of these signals depend on the ability to distinguish source from propagation effects. Furthermore, seismic signals produced by a resonating source can look almost identical to those produced by impulsive sources but modified along the path. Distinguishing these two phenomena can be accomplished by examining the wavefield with small-aperture arrays or by recording seismicity near the source when possible. We examine source and path effects in two different environments: Bering Glacier, Alaska, and Villarrica Volcano, Chile. Using three 3-element seismic arrays near the terminus of the Bering Glacier, we identified and located both terminus calving and iceberg breakup events. We show that automated array analysis provided a robust way to locate icequake events using P waves. This analysis also showed that arrivals within the long-period codas were incoherent across the small-aperture arrays, demonstrating that these codas, previously attributed to crack resonance, were in fact the result of a complicated path rather than a source effect. At Villarrica Volcano, seismometers deployed from near the vent to ~10 km away revealed that a several-cycle long-period source signal recorded at the vent appeared elongated in the far field. We used data collected from the stations nearest the vent to invert for the repetitive seismic source, and found it corresponded to a shallow force within the lava lake oriented N75°E and dipping 7° from horizontal. We also used this repetitive signal to search the data for additional seismic and infrasonic properties, including seismic-acoustic delay times, volcano acoustic-seismic ratios and energies, event frequency, and real-time seismic amplitude measurements. These calculations revealed lava lake level and activity fluctuations consistent with lava lake level changes inferred from the persistent infrasonic tremor.
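Of the derived quantities mentioned above, two are simple enough to sketch: a seismic-acoustic delay time from cross-correlation of co-located traces, and a crude acoustic-to-seismic energy ratio. Trace construction, sample rate, and the absence of instrument-response correction are illustrative assumptions, not the study's processing.

```python
import numpy as np

def seismic_acoustic_delay(seis, acou, fs):
    """Lag (s) at which the infrasound trace best aligns with the seismic
    trace via cross-correlation; positive lag means the acoustic arrival
    trails the seismic arrival."""
    seis = (seis - seis.mean()) / seis.std()
    acou = (acou - acou.mean()) / acou.std()
    corr = np.correlate(acou, seis, mode="full")
    return (np.argmax(corr) - (len(seis) - 1)) / fs

def acoustic_seismic_ratio(seis, acou):
    """Crude volcano acoustic-seismic ratio: acoustic energy over seismic
    energy in the same time window (no instrument response applied)."""
    return np.sum(acou ** 2) / np.sum(seis ** 2)

# Synthetic example: acoustic arrival delayed ~3 s relative to the seismic one.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
seis = np.exp(-((t - 5.0) ** 2)) * np.sin(2 * np.pi * 2.0 * t)
acou = np.exp(-((t - 8.0) ** 2)) * np.sin(2 * np.pi * 2.0 * (t - 3.0))
print(seismic_acoustic_delay(seis, acou, fs), acoustic_seismic_ratio(seis, acou))
```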
Abstract:
The dramatic period of progressive change in Montana documented in the "In the Crucible of Change" series really exploded with the election of Governors Forrest Anderson and Tom Judge. Anderson's single term saw the dispatching of the sales tax as an issue for a long period, the reorganization of the executive branch of state government, and the revision of Montana's Constitution. As a former legislator, county attorney, Supreme Court justice, and Attorney General, Anderson brought unmatched experience to the governorship when elected. Tom Judge, although much younger (elected MT's youngest governor at age 38 immediately following Anderson), also brought serious experience to the governorship: six years as a MT State Representative, two years as a MT State Senator, four years as Lieutenant Governor, and significant business experience. The campaign and election of John F. Kennedy in 1960 spurred other young Americans to service, including Tom Judge. First elected in 1960, he rose rapidly through MT's political-governmental hierarchy until he took over the governorship in time to implement many of the changes started in Governor Anderson's term. But as a strong progressive leader in his own right, Governor Judge sponsored and implemented significant advancements of his own for Montana. Those accomplishments, however, are the subject of other films in this series. This film deals with Tom Judge's early years: his rise to the governorship from when he returned home after college at Notre Dame and newspaper experience in Kentucky to his actual election in November 1972. That story is discussed in this episode by three major players in the effort, all directly involved in Tom Judge's early years and path to the governorship: Sidney Armstrong, Larry Pettit, and Kent Kleinkopf. Their recollections of the early Tom Judge and the period of his advancement to the governorship provide an insider's perspective on the growth of this significant leader during the important period of progressive change documented in "In the Crucible of Change." Sidney Armstrong, President of Sidney Armstrong Consulting, serves on the board and as the Executive Director of the Greater Montana Foundation. Formerly Executive Director of the Montana Community Foundation (MCF), she has served on national committees and participated in national foundation initiatives. While at MCF, she worked extensively with MT Governors Racicot and Martz on the state charitable endowment tax credit and other endowed philanthropy issues. A member of MT Governor Thomas L. Judge's staff in the 1970s, she was also part of Governor Brian Schweitzer's 2004 Transition Team, continuing to serve as a volunteer advisor during his term. In the 1980s, Sidney also worked for the MT State AFL-CIO and the MT Democratic Party, as well as working two sessions with the MT Senate as Assistant Secretary of the Senate and aide to the President. A Helena native and great-granddaughter of pioneer Montanans, Sidney has served on numerous nonprofit boards and is currently a board member for the Montana History Foundation. Recently she served on the board of the Holter Museum of Art and was a Governor's appointee to the Humanities Montana board. She is a graduate of the International School of Geneva, Switzerland, and the University of Montana. Armstrong's Irish immigrant maternal great-grandparents, Thomas and Maria Cahill Cooney, came to Virginia City, MT in a covered wagon in 1865, looking for gold.
Eventually, they settled on the banks of the Missouri River outside Helena as ranchers. She also has roots in Butte, MT, where her journalist father's family, both newspaper people, lived. Her father, Richard K. O'Malley, is also the author of a well-known book about Butte, Mile High, Mile Deep, recently re-published by Russell Chatham. She is the mother of four and the grandmother of eight. Dr. Lawrence K. Pettit (Larry Pettit) (b. 5/2/1937) has had a dual career in politics and higher education. In addition to being Montana's first Commissioner of Higher Education (the subject of another film in this series), Pettit, of Lewistown, served as legislative assistant to U.S. Senators James E. Murray and Lee Metcalf, and as campaign manager, head of the transition team, and assistant to Montana Governor Thomas L. Judge; he taught political science at The Pennsylvania State University (main campus), was chair of political science at Montana State University, Deputy Commissioner for Academic Programs at the Texas Higher Education Coordinating Board, Chancellor of the University System of South Texas (since merged with Texas A&M University), President of Southern Illinois University, and President of Indiana University of Pennsylvania, from where he retired in 2003. He has served as chair of the Commission on Leadership for the American Council on Education, president of the National Association of (University) System Heads, and on many national and state boards and commissions in higher education. Pettit is the author of "If You Live by the Sword: Politics in the Making and Unmaking of a University President." More about Pettit is found at http://www.lawrencekpettit.com… Kent Kleinkopf of Missoula is co-founder of a firm with a national scope of business that specializes in litigation consultation, expert vocational testimony, and employee assistance programs. His partner (and wife of 45 years), Kathy, is an expert witness in the 27-year-old business. Kent received a BA in History/Education from the University of Idaho and an MA in Economics from the University of Utah. The Kleinkopfs moved to Helena, MT in 1971, where he was Assistant to the Commissioner of State Lands (later Governor) Ted Schwinden. In early 1972 Kent volunteered full time in Lt. Governor Tom Judge's campaign for Governor, driving the Lt. Governor extensively throughout Montana. After Judge was elected governor, Kent briefly joined the staff of Governor Forrest Anderson, then in 1973 transitioned to Judge's Governor's Office staff, where he became Montana's first "Citizens' Advocate." In that capacity he fielded requests for assistance from citizens with concerns and information regarding State Agencies. While on the Governor's staff, Kent continued as a travel aide with the governor both in Montana and nationally. In 1977 Kent was appointed Director of the MT Department of Business Regulation. That role included responsibility as Superintendent of Banking and Chairman of the State Banking Board, where Kent presided over the chartering of many banks, savings and loans, and credit unions. In 1981 the Kleinkopfs moved to Missoula and went into the business they run today. Kent was appointed by Governor Brian Schweitzer to the Board of the Montana Historical Society in 2006, was reappointed, and continues to serve. Kathy and Kent have a daughter and son-in-law in Missoula.
Abstract:
Despite widespread use of species-area relationships (SARs), dispute remains over the most representative SAR model. Using data of small-scale SARs of Estonian dry grassland communities, we address three questions: (1) Which model describes these SARs best when known artifacts are excluded? (2) How do deviating sampling procedures (marginal instead of central position of the smaller plots in relation to the largest plot; single values instead of average values; randomly located subplots instead of nested subplots) influence the properties of the SARs? (3) Are those effects likely to bias the selection of the best model? Our general dataset consisted of 16 series of nested plots (1 cm² to 100 m², any-part system), each of which comprised five series of subplots located in the four corners and the centre of the 100-m² plot. Data for the three pairs of compared sampling designs were generated from this dataset by subsampling. Five function types (power, quadratic power, logarithmic, Michaelis-Menten, Lomolino) were fitted with non-linear regression. In some of the communities, we found extremely high species densities (including bryophytes and lichens), namely up to eight species in 1 cm² and up to 140 species in 100 m², which appear to be the highest documented values on these scales. For SARs constructed from nested-plot average-value data, the regular power function generally was the best model, closely followed by the quadratic power function, while the logarithmic and Michaelis-Menten functions performed poorly throughout. The relative fit of the latter two models increased significantly relative to the respective best model when the single-value or random-sampling method was applied, although the power function normally remained far superior. These results confirm the hypothesis that both single-value and random-sampling approaches cause artifacts by increasing stochasticity in the data, which can lead to the selection of inappropriate models.
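As an illustration of the model-comparison step, the sketch below fits the regular power function and the quadratic power function to a species-area dataset and compares residual sums of squares. For simplicity it fits in log-log space with ordinary least squares rather than the non-linear regression used in the study, and the data are invented, not the Estonian grassland measurements.

```python
import numpy as np

# Invented nested-plot data: plot areas (m^2) and mean species richness.
area = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0, 10.0, 100.0])
richness = np.array([2.1, 4.0, 8.5, 17.0, 34.0, 70.0, 138.0])

logA, logS = np.log10(area), np.log10(richness)

# Regular power function S = c * A**z  <=>  log S = log c + z * log A.
z, logc = np.polyfit(logA, logS, 1)
print(f"power function: c = {10 ** logc:.1f}, z = {z:.2f}")

# Quadratic power function: log S = b0 + b1*log A + b2*(log A)**2.
b2, b1, b0 = np.polyfit(logA, logS, 2)

# Compare residual sums of squares in log space (lower is better).
rss_pow = np.sum((logS - (logc + z * logA)) ** 2)
rss_quad = np.sum((logS - (b0 + b1 * logA + b2 * logA ** 2)) ** 2)
print(f"RSS power = {rss_pow:.4f}, RSS quadratic power = {rss_quad:.4f}")
```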
Impact of Orthorectification and Spatial Sampling on Maximum NDVI Composite Data in Mountain Regions