13 results for AM1 semi-empirical method

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

Lutein is a principal constituent of the human macular pigment. This study comprises two projects. The first examines the conformational geometries of lutein and its potential adaptability in biological systems; the second studies the response of human subjects to lutein supplements. Using the semi-empirical parametric method 3 (PM3) and density functional theory with the B3LYP/6-31G* basis set, the relative energies of s-cis conformers of lutein were determined. All 512 s-cis conformers were calculated with PM3, and a smaller, representative group was also studied using density functional theory. PM3 results were correlated systematically with B3LYP values, enabling the results to be calibrated. The relative energies of the conformers range from 1 to 30 kcal/mol, and many are dynamically accessible at normal temperatures. Four commercial formulations containing lutein were studied. The serum and macular pigment (MP) responses of human subjects to these lutein supplements at doses of 9 or 20 mg/day were measured, relative to a placebo, over a six-month period. In each instance, lutein levels in serum increased and correlated with MP increases. The results demonstrate that responses depend significantly on formulation and that components other than lutein have an important influence on serum response.
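
The claim that conformers in this energy range are "dynamically accessible" follows from Boltzmann statistics. A minimal sketch of the population calculation, using placeholder relative energies rather than the study's actual PM3/B3LYP values:

```python
import numpy as np

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # ambient temperature, K

# Placeholder relative conformer energies in kcal/mol (illustrative only;
# the study's PM3/B3LYP values are not reproduced here).
rel_energies = np.array([0.0, 1.2, 2.5, 5.0, 30.0])

# Boltzmann weights and normalized populations at temperature T.
weights = np.exp(-rel_energies / (R * T))
populations = weights / weights.sum()

for e, p in zip(rel_energies, populations):
    print(f"dE = {e:5.1f} kcal/mol -> population {p:.2e}")
```

Conformers within a few kcal/mol of the minimum retain populations of a few percent or more at 298 K, while those tens of kcal/mol higher are effectively unpopulated, which is the sense in which many, but not all, of the 512 conformers are accessible.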

Relevance:

100.00%

Publisher:

Abstract:

Purpose. The goal of this study is to strengthen the favorable molecular interactions between starch and PPC by adding the grafting monomers MA and ROM as compatibilizers, which would improve the mechanical properties of starch/PPC composites.

Methodology. Calculations based on DFT and semi-empirical methods were performed on three systems: (a) starch/PPC, (b) starch/PPC-MA, and (c) starch-ROM/PPC. The theoretical computations involved determining the optimal geometries, binding energies, and vibrational frequencies of the blended polymers.

Findings. Calculations performed on five starch/PPC composites revealed hydrogen bond formation as the driving force behind stable composite formation, as confirmed by the negative relative energies of the composites, which indicate binding forces between the constituent co-polymers. The interaction between starch and PPC is also confirmed by the computed decrease in the stretching frequencies of the CO and OH groups participating in hydrogen bonding, which agree qualitatively with the experimental values.

A three-step mechanism for grafting MA onto PPC was proposed to improve the compatibility of PPC with starch. Nine types of 'blends' produced by covalent bond formation between starch and MA-grafted PPC were found to be energetically stable, with blends involving MA grafted at the 'B' and 'C' positions of PPC showing binding-energy increases of 6.8 and 6.2 kcal/mol, respectively, compared to the non-grafted starch/PPC composites. A similar increase in binding energies was also observed for three types of 'composites' formed by hydrogen bonding between starch and MA-grafted PPC.

Next, the grafting of ROM onto starch and subsequent blend formation with PPC was studied. All four types of blends formed by the reaction of ROM-grafted starch with PPC were found to be more energetically stable than the starch/PPC composite and the starch/PPC-MA composites and blends. A blend of PPC and ROM grafted at the 'α' position on amylose exhibited a maximal binding-energy increase of 17.1 kcal/mol compared with the starch/PPC-MA blend.

Conclusions. ROM was found to be a more effective compatibilizer than MA in strengthening the favorable interactions between starch and PPC. The 'α' position was found to be the most favorable attachment point of ROM to amylose for stable blend formation with PPC.
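
The "negative relative energies" criterion is the standard supermolecular definition of binding energy. A hedged sketch of the bookkeeping (the energy values below are placeholders, not results from the study):

```python
def binding_energy(e_complex: float, e_frag_a: float, e_frag_b: float) -> float:
    """Supermolecular binding energy in kcal/mol.

    E_bind = E(complex) - [E(fragment A) + E(fragment B)];
    a negative value indicates a bound, energetically stable complex.
    """
    return e_complex - (e_frag_a + e_frag_b)

# Placeholder total energies (kcal/mol, common reference); in practice these
# would come from optimized DFT or semi-empirical calculations.
e_starch, e_ppc, e_composite = -1000.0, -800.0, -1812.3

print(binding_energy(e_composite, e_starch, e_ppc))  # -12.3 -> bound composite
```

On this convention, a grafted system whose binding energy is 6.8 kcal/mol more negative than the non-grafted composite shows exactly the "binding-energy increase" the abstract reports.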

Relevance:

100.00%

Publisher:

Abstract:

Surface water flow patterns in wetlands play a role in shaping substrates, biogeochemical cycling, and ecosystem characteristics. This paper focuses on the factors controlling flow across a large, shallow-gradient subtropical wetland (Shark River Slough in Everglades National Park, USA), which displays vegetative patterning indicative of overland flow. Between July 2003 and December 2007, flow speeds at five sites were very low and exhibited seasonal fluctuations that were correlated with seasonal changes in water depth but also showed distinctive deviations. Stepwise linear regression showed that upstream gate discharges, local stage gradients, and stage together explained 50 to 90% of the variance in flow speed at four of the five sites, but only 10% at one site located close to a levee-canal combination. Two non-linear, semi-empirical expressions relating flow speeds to the local hydraulic gradient, water depth, and vegetative resistance accounted for 70% of the variance in the measured speeds. The data suggest that local-scale factors such as channel morphology, vegetation density, and groundwater exchanges must be considered along with landscape position and basin-scale geomorphology when examining the interactions between flow and community characteristics in low-gradient wetlands such as the Everglades.
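
The abstract does not give the form of the two semi-empirical expressions; a Manning-type power law is a common choice for vegetated overland flow, so the sketch below fits that assumed form to synthetic data rather than the paper's actual measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def manning_like(X, a, b, c):
    """Assumed Manning-type law: v = a * h**b * S**c, where h is water depth,
    S is the local hydraulic gradient, and the coefficient a absorbs
    vegetative resistance. This form is an assumption, not the paper's
    fitted expression."""
    h, S = X
    return a * h**b * S**c

# Synthetic observations: depth (m), hydraulic gradient (-), and speed
# (arbitrary units); real data would come from the five slough sites.
rng = np.random.default_rng(0)
h = rng.uniform(0.2, 1.0, 50)
S = rng.uniform(1e-5, 5e-5, 50)
v = 2.0 * h**0.67 * S**0.5 * rng.normal(1.0, 0.05, 50)

(a, b, c), _ = curve_fit(manning_like, (h, S), v, p0=[1.0, 0.67, 0.5])
print(f"v ~= {a:.2f} * h^{b:.2f} * S^{c:.2f}")
```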

Relevance:

40.00%

Publisher:

Abstract:

The paper examines the nature of qualitative empirical studies published in the AHRD proceedings from 1999 to 2003 and discusses findings on method, rationale for method, data collection, sampling strategies, and integrity measures.

Relevance:

30.00%

Publisher:

Abstract:

The financial community is well aware that continued underfunding of state and local government pension plans poses many public policy and fiduciary management concerns. However, a well-defined theoretical rationale has not been developed to explain why and how public sector pension plans become underfunded. This study uses three methods: a survey of national pension experts, an incomplete covariance panel method, and field interviews.

A survey of national public sector pension experts was conducted to provide a conceptual framework by which underfunding could be evaluated. Experts suggest that plan design, fiscal stress, and political culture factors affect underfunding. However, experts do not agree with previous research findings that unions actively pursue underfunding to secure current wage increases.

Within the conceptual framework and determinants identified by experts, several empirical regularities are documented for the first time. An analysis of 173 local government pension plans, observed from 1987 to 1992, was conducted. Findings indicate that underfunding occurs in plans that have lower retirement ages or increased costs due to benefit enhancements, when the sponsor faces current-year operating deficits, or when a local government relies heavily on inelastic revenue sources. Results also suggest that elected officials artificially inflate interest rate assumptions to reduce current pension costs, consequently shifting these costs to future generations. In concurrence with some experts, no data support the assumption that highly unionized employees secure more funding than less unionized employees.

The empirical results provide satisfactory but not overwhelming statistical power, and only minor predictive capacity. To further explore why underfunding occurs, field interviews were carried out with 62 local government officials. Practitioners indicated that perceived fiscal stress, the willingness of policymakers to advance funding, the bargaining strategies used by union officials, apathy among employees and retirees, pension board composition, and the level of influence of internal pension experts have an impact on funding outcomes.

A pension funding process model was posited by triangulating the expert survey, empirical findings, and field survey results. This funding process model should help shape and refine our theoretical knowledge of state and local government pension underfunding in the future.

Relevance:

30.00%

Publisher:

Abstract:

Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for different functional classes of the Florida State Highway System. Crash data from 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs were developed for both rural and urban roadway categories. The modeling data were based on one-mile segments containing homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear, with crashes increasing with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the likelihood ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC). The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial distribution model generally provides a better fit for the data than the Poisson distribution model, and that the ZINB model gives the best fit when the count data exhibit excess zeros and over-dispersion, as they do for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. With improved traffic and crash data quality, however, the crash prediction power of the SPF models may be further improved.
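
SPFs of this kind are typically power functions of traffic exposure fitted as count models. The sketch below fits a Negative Binomial SPF of the assumed form mu = exp(b0) * AADT^b1 to synthetic one-mile-segment data, not the Florida records:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic one-mile segments: AADT (vehicles/day) and 3-year crash counts
# drawn from a Negative Binomial distribution around the assumed SPF mean.
rng = np.random.default_rng(1)
aadt = rng.uniform(2_000, 60_000, 500)
mu = np.exp(-6.0 + 0.8 * np.log(aadt))          # SPF: mu = exp(b0) * AADT^b1
crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

# NB2 regression of counts on log(AADT); fitted params ~ (b0, b1, dispersion).
X = sm.add_constant(np.log(aadt))
fit = sm.NegativeBinomial(crashes, X).fit(disp=0)
print(fit.params)
```

In the EB step, such an SPF prediction is then weighted against each site's observed crash count, with the fitted dispersion parameter controlling the weight.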

Relevance:

30.00%

Publisher:

Abstract:

This dissertation introduces a new system for handwritten text recognition based on an improved neural network design. Most existing neural networks treat the mean square error as the standard error function. The system proposed in this dissertation instead uses the mean quartic error function, whose third and fourth derivatives are non-zero; this property enabled many improvements to the training methods. The training results were carefully assessed before and after each update. Three factors, listed here in descending order of importance, are essential in evaluating the performance of a training system: (1) the error rate on the testing set, (2) the processing time needed to recognize a segmented character, and (3) the total training time and, subsequently, the total testing time. It was observed that bounded training methods accelerate the training process, while semi-third-order training methods, next-minimal training methods, and preprocessing operations reduce the error rate on the testing set. Empirical observations suggest that two combinations of training methods are needed, one for each character case. Since character segmentation is required for word and sentence recognition, this dissertation also provides an effective rule-based segmentation method, which differs from conventional adaptive segmentation methods. Dictionary-based correction is used to correct mistakes arising in the recognition and segmentation phases. The integration of the segmentation methods with the handwritten character recognition algorithm yielded an accuracy of 92% for lower-case characters and 97% for upper-case characters. The testing database consists of 20,000 handwritten characters, 10,000 for each case, and recognizing the 10,000 handwritten characters in the testing phase required 8.5 seconds of processing time.
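
A minimal sketch of the loss itself; the higher-order training methods built on it are the dissertation's own, so only the error function and its first derivative are shown:

```python
import numpy as np

def mean_quartic_error(y_pred: np.ndarray, y_true: np.ndarray) -> float:
    """Mean quartic error: mean((y - t)**4). Unlike the mean square error,
    its third and fourth derivatives with respect to the residual are
    non-zero, which is the property the training methods exploit."""
    r = y_pred - y_true
    return float(np.mean(r**4))

def mqe_gradient(y_pred: np.ndarray, y_true: np.ndarray) -> np.ndarray:
    """Gradient with respect to y_pred: 4*(y - t)**3 / N."""
    r = y_pred - y_true
    return 4.0 * r**3 / r.size

y_true = np.array([0.0, 1.0, 1.0])
y_pred = np.array([0.1, 0.7, 1.2])
print(mean_quartic_error(y_pred, y_true))  # small residuals are damped
print(mqe_gradient(y_pred, y_true))        # large residuals dominate updates
```

Compared with the squared error, the quartic power weights large residuals much more heavily, so hard examples dominate the weight updates.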

Relevance:

30.00%

Publisher:

Abstract:

Composite indices and rankings are increasingly popular for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation, and attribute weighting involved in their computation. The integrated model is based on simulating composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach was automated through an IT artifact that was designed, developed, and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for comparing and assessing the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of the multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured in per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and the multi-method rankings, than higher-income country groups.
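
A toy version of the scenario-simulation idea: the same indicator matrix is scored under different normalization and aggregation choices, and the resulting rankings are compared. The scenario grid and data below are illustrative, not the dissertation's 115-scenario design:

```python
import numpy as np

def composite_scores(X, weights, norm="minmax", agg="arithmetic"):
    """Score rows of X (entities x indicators) under one methodological
    scenario: a normalization choice crossed with an aggregation choice."""
    if norm == "minmax":
        Z = (X - X.min(0)) / (X.max(0) - X.min(0))
    else:  # z-score, shifted positive so the geometric mean is defined
        Z = (X - X.mean(0)) / X.std(0)
        Z = Z - Z.min(0) + 1.0
    if agg == "arithmetic":
        return Z @ weights
    return np.exp(np.log(Z + 1e-12) @ weights)   # weighted geometric mean

rng = np.random.default_rng(2)
X = rng.random((183, 10))        # e.g., 183 countries x 10 sub-indices
w = np.full(10, 0.1)             # equal attribute weights

for norm in ("minmax", "zscore"):
    for agg in ("arithmetic", "geometric"):
        ranking = np.argsort(-composite_scores(X, w, norm, agg))
        print(norm, agg, "top-ranked entity:", ranking[0])
```

Disagreement among the rankings across such scenarios is exactly the sensitivity the dissertation's artifact measures.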

Relevance:

30.00%

Publisher:

Abstract:

Advances in multiscale material modeling of structural concrete have created an upsurge of interest in the accurate evaluation of the mechanical properties and volume fractions of its nano-constituents. The task is accomplished by analyzing the response of a material to indentation, obtained from a nanoindentation experiment, using a procedure called the Oliver and Pharr (OP) method. Despite its widespread use, the accuracy of this method is often questioned when it is applied to data from heterogeneous materials or from materials that pile up or sink in during indentation, which necessitates the development of an alternative method.

In this study, a model is developed within the framework of contact mechanics to compute the nanomechanical properties of a material from its indentation response. Unlike the OP method, indentation energies are employed, in the form of dimensionless constants, to evaluate the model parameters. Analysis of load-displacement data for a wide range of materials revealed that the energy constants may be used to determine the indenter tip bluntness, the hardness, and the initial unloading stiffness of the material. The proposed model has two main advantages: (1) it does not require computation of the contact area, a source of error in the existing method; and (2) it explicitly incorporates the effects of peak indentation load, dwelling period, and indenter tip bluntness on the measured mechanical properties.

Indentation tests were also carried out on cement paste samples to validate the energy-based model by determining the elastic modulus and hardness of the different phases of the paste. The model was found to compute the mechanical properties in close agreement with those obtained by the OP method; a discrepancy, though insignificant, was observed more in the case of C-S-H than in the anhydrous phase. The proposed method is computationally efficient, and is therefore well suited to cases where the grid indentation technique must be performed. In addition, several empirical relations were developed that prove crucial in understanding the nanomechanical behavior of cementitious materials.
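
For reference, the OP reduction that the energy-based model is benchmarked against is standard. A sketch for a Berkovich tip; the ideal-tip area function below is an assumption, since real tips require a calibrated area function:

```python
import math

def oliver_pharr(p_max: float, h_max: float, stiffness: float,
                 eps: float = 0.75, beta: float = 1.034):
    """Oliver-Pharr reduction for a Berkovich indenter.

    Units: P in mN, h in um, unloading stiffness S = dP/dh in mN/um,
    so hardness and reduced modulus come out directly in GPa.
    """
    h_c = h_max - eps * p_max / stiffness        # contact depth
    a_c = 24.5 * h_c**2                          # ideal Berkovich area function
    hardness = p_max / a_c                       # H = P_max / A_c
    e_reduced = math.sqrt(math.pi) * stiffness / (2.0 * beta * math.sqrt(a_c))
    return hardness, e_reduced

# Illustrative C-S-H-scale indent; values are placeholders, not the study's data.
H, Er = oliver_pharr(p_max=2.0, h_max=0.30, stiffness=30.0)
print(f"H = {H:.2f} GPa, E_r = {Er:.1f} GPa")
```

The energy-based model replaces the A_c computation, the main error source noted above, with ratios of indentation energies; its specific dimensionless constants are the dissertation's own.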

Relevance:

30.00%

Publisher:

Abstract:

This study investigated the feasibility of using qualitative methods to provide empirical documentation of long-term qualitative change in the life course trajectories of "at risk" youth in a school-based positive youth development program (the Changing Lives Program, CLP). The work draws on life course theory for a developmental framework and on recent advances in the use of qualitative methods in general and a grounded theory approach in particular. Grounded theory provided a methodological framework for conceptualizing the use of qualitative methods to assess qualitative life change. The study investigated the feasibility of using the Possible Selves Questionnaire-Qualitative Extension (PSQ-QE) to evaluate the impact of the program on qualitative change in participants' life trajectories relative to a non-intervention control group. The Integrated Qualitative/Quantitative Data Analytic Strategies (IQ-DAS) that we have been developing as part of our program of research provided the data analytic framework for the study.

Change was evaluated in 85 at-risk high school students in CLP high school counseling groups over three assessment periods (pre, post, and follow-up), and in a non-intervention control group of 23 students over two assessment periods (pre and post). Intervention gains and maintenance, and the extent to which these patterns of change were moderated by gender and ethnicity, were evaluated using a mixed-design Repeated Measures Multivariate Analysis of Variance (RMANOVA) in which Time (pre, post) was the within (repeated) factor and Condition, Gender, and Ethnicity the between-group factors. The trends in the direction of qualitative change were positive from pre to post and were maintained at the year-end follow-up. More important, the three-way Time x Gender x Ethnicity interaction was significant, Roy's Θ = .205, F(2, 37) = 3.80, p < .032, indicating that the overall pattern of positive change was significantly moderated by gender and ethnicity. The findings thus provide preliminary evidence of a positive impact of the youth development program on long-term change in life course trajectory, and are suggestive with respect to amenability to treatment, i.e., the identification of subgroups of a target population who are likely to be the most amenable or responsive to a treatment.
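
As a sketch of the kind of mixed within/between analysis described, the snippet below runs a two-way mixed ANOVA on synthetic data with pingouin. It is simplified to a single between factor (Condition); the study's full design adds Gender and Ethnicity and reports a multivariate test statistic:

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Synthetic long-format data: Time (within) x Condition (between).
rng = np.random.default_rng(3)
n = 40
condition = np.where(np.arange(n) < n // 2, "CLP", "control")
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),
    "time": np.tile(["pre", "post"], n),
    "condition": np.repeat(condition, 2),
})
# Give the hypothetical intervention group a gain at post-test.
df["score"] = rng.normal(0.0, 1.0, 2 * n) + np.where(
    (df["time"] == "post") & (df["condition"] == "CLP"), 0.8, 0.0)

# Mixed-design ANOVA: the Time x Condition interaction is the effect of interest.
print(pg.mixed_anova(data=df, dv="score", within="time",
                     subject="subject", between="condition"))
```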

Relevance:

30.00%

Publisher:

Abstract:

There is an increasing demand for DNA analysis because of the sensitivity of the method and its ability to uniquely identify and distinguish individuals with a high degree of certainty. However, this demand has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples in which the evidence contains more than one contributor, and the additional processing needed to separate the different cell types and simplify the final data interpretation further adds to the already cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures.

Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of the method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile.

An easy-to-use interface, minimal manual intervention, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application to sexual assault samples.

Relevance:

30.00%

Publisher:

Abstract:

Modern IT infrastructures are built from large-scale computing systems and administered by IT service providers. Manually maintaining such large computing systems is costly and inefficient, so service providers seek automatic or semi-automatic methodologies for detecting and resolving system issues in order to improve their service quality and efficiency. This dissertation investigates several data-driven approaches for assisting service providers in achieving this goal. The problems studied by these approaches fall into three aspects of the service workflow: 1) preprocessing raw textual system logs into structured events; 2) refining monitoring configurations to eliminate false positives and false negatives; and 3) improving the efficiency of system diagnosis on detected alerts. Solving these problems usually requires a large amount of domain knowledge about the particular computing systems. The approaches investigated in this dissertation are built on event mining algorithms, which can automatically derive part of that knowledge from historical system logs, events, and tickets.

In particular, two textual clustering algorithms are developed for converting raw textual logs into system events. For refining the monitoring configuration, a rule-based alert prediction algorithm is proposed for eliminating false alerts (false positives) without losing any real alert, and a textual classification method is applied to identify missing alerts (false negatives) from manual incident tickets. For system diagnosis, the dissertation presents an efficient algorithm for discovering temporal dependencies, with corresponding time lags, between system events; this can help administrators determine redundancies among deployed monitoring situations and dependencies among system components. To improve the efficiency of incident ticket resolution, several KNN-based algorithms that recommend relevant historical tickets, with their resolutions, for incoming tickets are investigated. Finally, the dissertation offers a novel algorithm for searching for similar textual event segments over large system logs, which helps administrators locate similar system behaviors in the logs. Extensive empirical evaluation on system logs, events, and tickets from real IT infrastructures demonstrates the effectiveness and efficiency of the proposed approaches.
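
A minimal sketch of the KNN-style ticket recommendation idea, using TF-IDF features and cosine distance; the dissertation's own algorithms add refinements beyond this baseline, and the tickets below are invented examples:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

# Toy resolved tickets; a real deployment would index a large ticket history.
tickets = [
    "disk usage above threshold on /var filesystem",
    "cpu utilization sustained above 95 percent on host db01",
    "service httpd not responding on port 80",
]
resolutions = ["purged old log files", "killed runaway process", "restarted httpd"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(tickets)

# Retrieve the most similar historical tickets for a new incident.
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X)
query = vectorizer.transform(["httpd down, connection refused on port 80"])
distances, indices = knn.kneighbors(query)
for d, i in zip(distances[0], indices[0]):
    print(f"similarity {1 - d:.2f}: {tickets[i]} -> {resolutions[i]}")
```

The recommended resolutions of the nearest historical tickets then serve as candidate fixes for the incoming ticket.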
