932 results for Higher order interior points method (HOIPM)


Relevance: 100.00%

Publisher:

Abstract:

The physics of self-organization and complexity is manifested on a variety of biological scales, from large ecosystems to the molecular level. Protein molecules exhibit characteristics of complex systems in their structure, dynamics, and function. Proteins have the extraordinary ability to fold to a specific functional three-dimensional shape, starting from a random coil, in a biologically relevant time. How they accomplish this is one of the secrets of life. In this work, theoretical research into understanding this remarkable behavior is discussed. Thermodynamic and statistical mechanical tools are used to investigate protein folding dynamics and stability. Theoretical analyses of computer simulations of the dynamics of a four-helix bundle show that excluded-volume entropic effects are very important in protein dynamics and crucial for protein stability. The dramatic effects of changing the size of side chains imply that strategic placement of amino acid residues of a particular size may be an important consideration in protein engineering. Another investigation deals with modeling protein structural transitions as a phase transition. Using finite-size scaling theory, the nature of the unfolding transition of a four-helix bundle protein was investigated, and critical exponents for the transition were calculated for various hydrophobic strengths in the core. It is found that the order of the transition changes from first to higher order as the strength of the hydrophobic interaction in the core region is significantly increased. Finally, a detailed kinetic and thermodynamic analysis was carried out in a model two-helix bundle, and the connection between the structural free-energy landscape and folding kinetics was quantified.
I show how simple protein engineering, by changing the hydropathy of a small number of amino acids, can enhance protein folding by significantly reshaping the free-energy landscape so that kinetic traps are removed. The results have general applicability in protein engineering as well as in understanding the underlying physical mechanisms of protein folding.
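Finite-size scaling of this kind typically infers a critical exponent from how a peak fluctuation quantity grows with system size. As a minimal, generic illustration (not the author's actual model or data; the sizes and exponent below are synthetic), the exponent can be read off as the slope of a log-log fit:

```python
import numpy as np

def scaling_exponent(sizes, peak_values):
    """Estimate the exponent p in peak ~ N**p from a log-log fit."""
    slope, _intercept = np.polyfit(np.log(sizes), np.log(peak_values), 1)
    return slope

# Synthetic data: peaks of a fluctuation quantity (e.g. specific heat),
# generated to follow N**1.5 exactly, standing in for simulation output.
sizes = np.array([16, 32, 64, 128])
peaks = sizes ** 1.5
print(scaling_exponent(sizes, peaks))  # ~1.5
```

In practice the peaks would come from simulations at several system sizes, and deviations from a clean power law help distinguish first-order from continuous transitions.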

Relevance: 100.00%

Publisher:

Abstract:

The focus of this study was to explain the extent to which theoretically effective teaching strategies taught in a course on generic instructional strategies are implemented by teachers in their actual teaching practice. A multivariate causal-comparative (ex post facto) design was used to answer the research question. A teacher observation protocol, the General Instructional Strategies Analysis (GISA), was constructed and used to assess the utilization of instructional strategies in the classroom. The data also included open-ended field notes taken during observations. A multivariate analysis of variance (MANOVA) was used to compare the teaching strategies (set, effective explanation, hands-on activity, cooperative learning activity, higher-order questioning, closure) of the group who had taken a general instructional strategies course (N=36) and the group who had not (N=36). Results showed a statistically significant difference between the two groups: the group who had taken the course implemented these strategies more effectively in almost all categories of effective teaching. Follow-up univariate tests of the dependent variables showed significant differences between the two groups in five of the six areas (hands-on activity being the exception). A second MANOVA compared the two groups on the effective use of attending behaviors (teacher movement/eye contact/body language/physical space, brief verbal acknowledgements/voice inflection/modulation/pitch, use of visuals, prompting/probing, praise/feedback/rewards, wait-time I and II). Results again showed a multivariate difference between the two groups. Follow-up univariate tests on the related dependent variables showed that five of the six were significantly different between the two groups, with the group who had taken the course implementing the strategies more effectively.
An analysis of the field notes provided further evidence of the pervasiveness of these differences between the teaching practices of the two groups. It was concluded that taking a course in general instructional strategies increases teachers' utilization of effective strategies in the classroom.
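With two groups, a MANOVA on several dependent measures reduces to Hotelling's T-squared test. A self-contained sketch (the data are simulated; the six columns stand in for ratings of six strategies, not the study's actual GISA scores):

```python
import numpy as np
from scipy.stats import f

def hotelling_t2(X1, X2):
    """Two-sample Hotelling's T^2 (the two-group special case of MANOVA).
    X1, X2: (n_i, p) arrays of p dependent measures per subject."""
    n1, p = X1.shape
    n2 = X2.shape[0]
    d = X1.mean(axis=0) - X2.mean(axis=0)
    # Pooled within-group covariance matrix
    S = ((n1 - 1) * np.cov(X1, rowvar=False)
         + (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    # Exact transformation of T^2 to an F statistic
    F_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_val = f.sf(F_stat, p, n1 + n2 - p - 1)
    return t2, F_stat, p_val

rng = np.random.default_rng(0)
course = rng.normal(1.0, 1.0, size=(36, 6))     # hypothetical: took the course
no_course = rng.normal(0.0, 1.0, size=(36, 6))  # hypothetical: did not
t2, F_stat, p_val = hotelling_t2(course, no_course)
print(t2, F_stat, p_val)
```

A small p-value indicates the two groups differ on the set of measures jointly, which is then typically followed by the univariate tests described above.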

Relevance: 100.00%

Publisher:

Abstract:

This study investigated group processes as potential mediators or moderators of positive-development and negative-reduction intervention response by evaluating the utility of a group measure adapted from a widely known measure of group impact in the group therapy research literature. Four group processes were of primary interest: (1) Group Impact, (2) Facilitator Impact, (3) Skills Impact, and (4) Exploration Impact, as assessed by the Session Evaluation Form (SEF). Outcome measures included the Personally Expressive Activities Questionnaire (PEAQ), the Erikson Psycho-Social Index (EPSI), and the Zill Behavior Items, Behavior Problem Index (ZBI (BPI)). The sample consisted of 121 multi-ethnic participants drawn from four alternative high schools in the Miami-Dade County Public School system. Using a latent growth curve modeling approach with structural equation modeling (SEM) statistics, preliminary analyses were conducted to evaluate the psychometric properties of the SEF and its role in the mediation or moderation of intervention outcome. Preliminary results revealed evidence of a single higher-order factor representing a "General" global reaction to the program, hypothesized to be a "Positive Group Climate" construct, rather than the four distinct group processes initially hypothesized to affect outcomes. In the mediation and moderation analyses, this single "General" global latent factor did not significantly predict treatment response on any of the outcome variables. Nevertheless, the evidence of an underlying "Positive Group Climate" factor suggests important future directions for research on positive youth development programs as well as in group therapy research.

Relevance: 100.00%

Publisher:

Abstract:

Ensemble stream modeling and data cleaning are sensor information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble of the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as bush or natural forest fires, we take the burnt area (BA*), the sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using nondescriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multi-target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as the F-test is performed during each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were combined with a simpler tree classifier.
The ensemble framework for data cleaning, and the enhancements to quantify quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction), led to the formation of quality streams for sensor-enabled applications. This further motivates the novelty of stream quality labeling and its importance in handling the vast amounts of real-time mobile streams generated today.
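For reference, the F-measure is the weighted harmonic mean of precision and recall (beta weights recall relative to precision). A minimal sketch, with hypothetical confusion counts rather than this study's actual detector results:

```python
def f_measure(tp, fp, fn, beta=1.0):
    """F-measure: weighted harmonic mean of precision and recall,
    commonly used to gauge the false-alarm/miss trade-off of a detector."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical counts for a fire-event detector:
# 40 true detections, 10 false alarms, 20 missed fires.
print(f_measure(40, 10, 20))  # 2 * 0.8 * (2/3) / (0.8 + 2/3) = 8/11
```

Raising beta above 1 favors recall (fewer missed fires) at the expense of more false alarms, which is often the right trade-off for event detection.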

Relevance: 100.00%

Publisher:

Abstract:

High-stakes testing and accountability have infiltrated the education system in the United States; the top priority for all teachers must be student progress on standardized tests. This has resulted in the predominance of reading for test-taking (efferent reading) in English, language arts, and reading classrooms. Authentic uses of print, such as aesthetic reading, which encourage students to engage individually with a text, have been pushed aside. Over a 3-week period, regular-level English 3/American literature students in a Title I magnet high school participated in this quasi-experimental study (N = 62). It measured the effects of an intervention of reading American literature texts aesthetically and writing aesthetically-evoked reader responses on students' self-efficacy beliefs regarding their comprehension of American literature. One trained teacher and the researcher participated in the study; student participants were pre- and post-tested using the Confidence in Reading American Literature Survey, which examined their self-efficacy beliefs regarding their comprehension of American literature. Several statistical analyses were performed. The results of the linear regression analyses partially supported a positive relationship between aesthetically-evoked reader responses and students' self-efficacy beliefs regarding their comprehension of American literature. Additionally, the results of the 2 (sex) x 2 (treatment) ANCOVAs conducted to test group differences in self-efficacy beliefs between treatment and control groups indicated a main effect for treatment (but not sex, nor a significant sex x treatment interaction), suggesting the treatment was partially effective in increasing students' self-efficacy beliefs.
Seven of the twelve ANCOVAs indicated a statistically significant increase in the treatment group's adjusted group-mean self-efficacy belief scores as a result of exposure to the intervention. In six of these seven analyses, the increases occurred in tasks that required three or more higher-order levels of thinking/learning. The results are discussed in terms of theoretical, empirical, and practical significance. Future research is recommended to extend the intervention beyond the narrow confines of a Title I magnet school to settings where it could be tested longitudinally, e.g., with honors and gifted students and in elementary and middle schools.

Relevance: 100.00%

Publisher:

Abstract:

The examination of workplace aggression as a global construct has gained considerable attention over the past few years as organizations work to better understand and address its occurrence and consequences. The purpose of this dissertation is to build on previous efforts to validate the appropriateness and usefulness of a global conceptualization of the workplace aggression construct. The dissertation is divided into two parts: Part 1 used a confirmatory factor analysis approach to assess the existence of workplace aggression as a global construct; Part 2 used a series of correlational analyses to examine the relationship between a selection of commonly experienced individual strain-based outcomes and the global construct conceptualization assessed in Part 1. Participants were a diverse sample of 219 working individuals from Amazon's Mechanical Turk participant pool. Results of Part 1 did not support a one-factor global conceptualization of the workplace aggression construct. However, support was shown for a higher-order five-factor model, suggesting that it may be possible to conceptualize workplace aggression as an overarching construct made up of separate workplace aggression constructs. Results of Part 2 supported the relationships between an existing global construct workplace aggression conceptualization and a series of strain-based outcomes. Additional post-hoc correlational analyses showed that individual factors such as emotional intelligence and personality are related to the experience of workplace aggression.
Further, moderated regression analysis demonstrated that individuals experiencing high levels of workplace aggression reported higher job satisfaction when they felt strongly that the aggressive act was highly visible and, similarly, when they felt that there was a clear intent to cause harm. Overall, the findings support the need to simplify the current state of workplace aggression measurement. Future research should continue to examine workplace aggression in an effort to shed additional light on the structure and usefulness of this complex construct.
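Moderated regression of this kind tests whether an interaction term (here, aggression times perceived visibility) predicts the outcome over and above the main effects. A minimal sketch on simulated data (variable names and effect sizes are hypothetical, not the dissertation's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 219  # same sample size as the study, but simulated values
aggression = rng.normal(size=n)
visibility = rng.normal(size=n)
# Hypothetical data-generating process: satisfaction depends on the
# interaction term, i.e. visibility moderates the aggression effect.
satisfaction = (0.1 * aggression + 0.2 * visibility
                + 0.5 * aggression * visibility + rng.normal(size=n))

# Ordinary least squares with an explicit interaction column
X = np.column_stack([np.ones(n), aggression, visibility,
                     aggression * visibility])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
print(beta)  # last coefficient estimates the moderation effect (~0.5 here)
```

In applied work the predictors are usually mean-centered before forming the product term so that the main-effect coefficients stay interpretable.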

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this study was to determine the knowledge and use of critical thinking teaching strategies by full-time and part-time faculty in Associate Degree Nursing (ADN) programs. Sander's CTI (1992) instrument was adapted for this study and pilot-tested prior to general administration to ADN faculty in Southeast Florida. The modified instrument, termed the Burroughs Teaching Strategy Inventory (BTSI), returned reliability estimates (Cronbach's alphas of .71, .74, and .82 for the three constructs) comparable to the original instrument. The BTSI was administered to 113 full-time and part-time nursing faculty in three community college nursing programs. The response rate was 92% for full-time faculty (n = 58) and 61% for part-time faculty (n = 55). The majority of participants supported a combined definition of critical thinking in nursing: a composite of thinking skills that included reflective thinking, assessing alternative viewpoints, and the use of problem-solving. Full-time and part-time faculty used different teaching strategies. Full-time faculty most often used multiple-choice exams and lecture, while part-time faculty most frequently used discussion in their classes. One possible explanation for these differences is that full-time faculty taught predominantly theory classes, where certain strategies are more appropriate, while part-time faculty taught predominantly clinical classes. Both faculty types selected written nursing care plans as the second most effective critical thinking strategy. Faculty identified several strategies as effective in teaching critical thinking, including discussion, case studies, higher-order questioning, and concept analysis. These, however, were not always the strategies used in either the classroom or the clinical setting.
Based on this study, the author recommends that if the profession continues to stress critical thinking as a vital component of practice, nursing faculty should receive education in appropriate critical thinking teaching strategies. Both in-service seminars and workshops could be used to further faculty's knowledge and use of critical thinking strategies. Qualitative research should be done to determine why nursing faculty use the teaching strategies they select.


Relevance: 100.00%

Publisher:

Abstract:

Traditional optics has provided ways to compensate for some common visual limitations (up to second-order visual impairments) through spectacles or contact lenses. Recent developments in wavefront science make it possible to obtain an accurate model of the point spread function (PSF) of the human eye. Through what is known as the wavefront aberration function of the human eye, exact knowledge of the eye's optical aberration is possible, allowing a mathematical model of the PSF to be obtained. This model can be used to pre-compensate (inverse-filter) the images displayed on computer screens in order to counter the distortion in the user's eye. This project takes advantage of the fact that the wavefront aberration function, commonly expressed as a Zernike polynomial, can be generated from the ophthalmic prescription used to fit spectacles to a person. This allows the pre-compensation, or onscreen deblurring, to be done for visual impairments up to second order (commonly known as myopia, hyperopia, and astigmatism). We present the technique proposed toward that goal, along with results obtained by introducing a lens of known PSF into the visual path of subjects without visual impairment. Beyond substituting for the effect of spectacles or contact lenses in correcting the low-order visual limitations of the viewer, the significance of this approach is its potential to address higher-order abnormalities in the eye that are currently not correctable by simple means.
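The low-order part of that mapping is standard: a spectacle prescription (sphere S, cylinder C, axis) converts to the power vector (M, J0, J45), which in turn maps to second-order Zernike coefficients. A sketch under one common sign convention (conventions and normalizations differ between references, so treat the signs and scaling below as illustrative, not definitive):

```python
import numpy as np

def power_vector(sphere, cyl, axis_deg):
    """Convert a prescription (S, C, axis) to power-vector components.
    M is the spherical equivalent; J0/J45 are the astigmatism components."""
    theta = np.radians(axis_deg)
    M = sphere + cyl / 2
    J0 = -(cyl / 2) * np.cos(2 * theta)
    J45 = -(cyl / 2) * np.sin(2 * theta)
    return M, J0, J45

def low_order_zernikes(M, J0, J45, pupil_radius_mm):
    """Map power vectors to second-order Zernike coefficients
    (one common convention; signs vary across references)."""
    r2 = pupil_radius_mm ** 2
    c20 = -M * r2 / (4 * np.sqrt(3))     # defocus
    c22 = -J0 * r2 / (2 * np.sqrt(6))    # astigmatism 0/90 degrees
    c2m2 = -J45 * r2 / (2 * np.sqrt(6))  # oblique astigmatism
    return c20, c22, c2m2

# Example prescription: -2.00 sphere, -1.00 cylinder at axis 90 degrees
M, J0, J45 = power_vector(-2.0, -1.0, 90)
print(low_order_zernikes(M, J0, J45, pupil_radius_mm=2.5))
```

These coefficients parameterize the wavefront from which a PSF, and hence an inverse filter for on-screen pre-compensation, can be computed.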

Relevance: 100.00%

Publisher:

Abstract:

Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, they have many shortcomings. First, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Second, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Third, the problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design called CODA, which addresses the above shortcomings, was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge-base implementation were used and compared, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him or her to follow a specific design path. The Guidance system, which is less restrictive, provides context-specific, informative, and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control).
An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved one task without using the system (pre-treatment task) and another task using one of the three systems: Control, Guidance, or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. It was also found that the subjects perceived the Restrictive system as easier to use than the Guidance system.

Relevance: 100.00%

Publisher:

Abstract:

Information extraction is a frequent and relevant problem in digital signal processing. In the past few years, different methods have been utilized for the parameterization of signals and the derivation of efficient descriptors. When signals possess statistical cyclostationary properties, the Cyclic Autocorrelation Function (CAF) and the Spectral Cyclic Density (SCD) can be used to extract second-order cyclostationary information. However, second-order cyclostationary information is poor for non-Gaussian signals, as the cyclostationary analysis in this case should comprise higher-order statistical information. This paper proposes a new mathematical tool for higher-order cyclostationary analysis based on the correntropy function. Specifically, cyclostationary analysis is revisited from an information-theoretic perspective, and the Cyclic Correntropy Function (CCF) and Cyclic Correntropy Spectral Density (CCSD) are defined. Furthermore, it is analytically proven that the CCF contains information regarding second- and higher-order cyclostationary moments, making it a generalization of the CAF. The performance of these new functions in the extraction of higher-order cyclostationary characteristics is analyzed in a wireless communication system in which non-Gaussian noise is present.
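A sample autocorrentropy estimator makes the idea concrete: the product in the autocorrelation is replaced by a kernel evaluation, whose Taylor expansion carries even higher-order moments of the signal. The sketch below (kernel width and test signal are arbitrary choices) estimates V(lag); the cyclic quantities (CCF/CCSD) would further Fourier-transform such estimates over the cycle frequency, which is omitted here:

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def autocorrentropy(x, lag, sigma=1.0):
    """Sample estimate of V(lag) = E[kappa(x[n], x[n - lag])] with a
    Gaussian kernel; unlike the autocorrelation, its series expansion
    includes even-order higher moments of the signal."""
    if lag == 0:
        diffs = np.zeros(len(x))
    else:
        diffs = x[lag:] - x[:-lag]
    return gaussian_kernel(diffs, sigma).mean()

# The zero-lag value is the kernel peak, 1/(sigma * sqrt(2*pi)),
# regardless of the signal; other lags reveal its cyclic structure.
x = np.sin(2 * np.pi * 0.05 * np.arange(200))
print(autocorrentropy(x, 0), autocorrentropy(x, 5))
```

The kernel width sigma controls how much weight the even higher-order moments receive; as sigma grows, the estimator approaches a second-order statistic.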

Relevance: 100.00%

Publisher:

Abstract:

Monoaromatic compounds are toxic substances present in petroleum derivatives and used broadly in the chemical and petrochemical industries. These compounds are continuously released into the environment, contaminating soil and water sources and potentially rendering these water resources unusable, given their highly carcinogenic and mutagenic potential; even at low concentrations, BTEX may cause serious health issues. It is therefore extremely important to develop and search for new methodologies that assist and enable the treatment of BTEX-contaminated matrices. Bioremediation consists of the use of microbial groups capable of degrading hydrocarbons, promoting mineralization, that is, the permanent destruction of residues, eliminating the risk of future contamination. This work investigated the biodegradation kinetics of water-soluble monoaromatic compounds (benzene, toluene, and ethylbenzene) by evaluating their consumption by the bacterium Pseudomonas aeruginosa at concentrations ranging from 40 to 200 mg/L. To do so, the performance of the Monod kinetic model for microbial growth was evaluated, and the material balance equations for batch operation were discretized and numerically solved by the fourth-order Runge-Kutta method. The kinetic parameters, obtained using the method of least squares as the statistical criterion, were coherent with those reported in the literature. They also showed that the microorganism has the greatest affinity for ethylbenzene. It was thus possible to observe that the Monod model can predict the experimental data for the individual biodegradation of the BTEX substrates and can be applied to the optimization of biodegradation processes for toxic compounds in different types of bioreactors and under different operational conditions.
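The numerical scheme described (Monod growth coupled to batch mass balances, integrated with fourth-order Runge-Kutta) can be sketched as follows; the parameter values are illustrative placeholders, not the fitted values from this work:

```python
import numpy as np

# Hypothetical Monod parameters (illustrative only)
MU_MAX = 0.5   # 1/h, maximum specific growth rate
KS = 20.0      # mg/L, half-saturation constant
Y = 0.6        # yield (mg biomass produced / mg substrate consumed)

def rates(state):
    """Batch mass balances: state = [S, X] (substrate, biomass)."""
    S, X = state
    mu = MU_MAX * S / (KS + S)           # Monod specific growth rate
    return np.array([-mu * X / Y,        # dS/dt: substrate consumption
                     mu * X])            # dX/dt: biomass growth

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rates(state)
    k2 = rates(state + dt / 2 * k1)
    k3 = rates(state + dt / 2 * k2)
    k4 = rates(state + dt * k3)
    return state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([100.0, 5.0])  # S0 = 100 mg/L substrate, X0 = 5 mg/L biomass
for _ in range(200):            # 20 h of simulated time at dt = 0.1 h
    state = rk4_step(state, 0.1)
print(state)  # substrate largely consumed, biomass increased
```

Because consumption and growth are coupled through the yield Y, the quantity X + Y*S is conserved in this model, which provides a convenient sanity check on the integrator; fitting would then adjust MU_MAX, KS, and Y to the measured concentration curves by least squares.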


Relevance: 100.00%

Publisher:

Abstract:

PURPOSE: To investigate the operation of the Shin-Nippon/Grand Seiko autorefractor and whether higher-order aberrations affect its peripheral refraction measurements. METHODS: Information on instrument design, together with parameters and equations used to obtain refraction, was obtained from a patent. A model eye simulating the operating principles was tested with an optical design program. Effects of induced defocus and astigmatism on the retinal image were used to calibrate the model eye to match the patent equations. Coma and trefoil were added to assess their effects on the image. Peripheral refraction of a physical model eye was measured along four visual field meridians with the Shin-Nippon/Grand Seiko autorefractor SRW-5000 and a Hartmann-Shack aberrometer, and simulated autorefractor peripheral refraction was derived using the Zernike coefficients from the aberrometer. RESULTS: In simulation, the autorefractor's square image was changed in size by defocus, into rectangles or parallelograms by astigmatism, and into irregular shapes by coma and trefoil. In the presence of 1.0 D oblique astigmatism, errors in refraction were proportional to the higher-order aberrations, with up to 0.8 D sphere and 1.5 D cylinder for ±0.6 μm of coma or trefoil coefficients with a 5-mm-diameter pupil. For the physical model eye, refraction with the aberrometer was similar in all visual field meridians, but refraction with the autorefractor changed more quickly along one oblique meridian and less quickly along the other oblique meridian than along the horizontal and vertical meridians. Simulations predicted that higher-order aberrations would affect refraction in oblique meridians, and this was supported by the experimental measurements with the physical model eye. CONCLUSIONS: The autorefractor's peripheral refraction measurements are valid for horizontal and vertical field meridians, but not for oblique field meridians. 
Similar instruments must be validated before being adopted outside their design scope.

Relevance: 100.00%

Publisher:

Abstract:

Compensation of the detrimental impacts of nonlinearity on long-haul wavelength-division multiplexed system performance is discussed, and the differences between transmitter, receiver, and in-line compensation are analyzed. We demonstrate that ideal compensation of nonlinear noise could result in an increase in the signal-to-noise ratio (measured in dB) of 50%, and that reaches may be more than doubled for higher-order modulation formats. The influence of parametric noise amplification is discussed in detail, showing how increased numbers of optical phase conjugators may further increase the received signal-to-noise ratio. Finally, the impact of practical real-world system imperfections, such as polarization mode dispersion, is outlined.