943 results for Fourth order method
Abstract:
2010 Mathematics Subject Classification: Primary 35J70; Secondary 35J15, 35D05.
Abstract:
In this paper, we report a new method for cleaving polymer optical fibre. The most common way to cut a polymer optical fibre is to chop it with a razor blade; however, in that approach both the fibre and the blade must be preheated to make the material ductile and thus prevent crazing. Here, we make use of the temperature-time equivalence in polymers to replace heating with an increase in cleaving time, and we use a sawing motion to reduce fibre end-face damage. In this way, the polymer fibre can be cleaved at room temperature in seconds, with the resulting end face being of similar quality to that produced by more complex and expensive heated systems.
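For readers unfamiliar with temperature-time equivalence, the trade-off invoked above is commonly quantified by the Williams-Landel-Ferry (WLF) shift factor. The sketch below is a generic illustration of that principle using the textbook "universal" WLF constants and an assumed reference temperature near the Tg of PMMA; it is not the authors' calibration for their fibre material.

```python
def wlf_shift_factor(T, T_ref, C1=17.44, C2=51.6):
    """WLF time-temperature superposition: a_T = tau(T) / tau(T_ref), with
    log10(a_T) = -C1 * (T - T_ref) / (C2 + (T - T_ref)).
    C1 and C2 are the generic 'universal' constants (T_ref taken near Tg)."""
    return 10.0 ** (-C1 * (T - T_ref) / (C2 + (T - T_ref)))

# Illustration: 10 K below the reference temperature, relaxation times grow
# by roughly four orders of magnitude, so the cleave must be driven
# correspondingly slower -- trading heat for time, as the paper does.
T_ref = 378.0   # assumed reference, near the Tg of PMMA (illustrative only)
print(f"a_T = {wlf_shift_factor(T_ref - 10.0, T_ref):.2e}")
```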
Abstract:
We propose and experimentally demonstrate a new method to extend the range of Brillouin optical time domain analysis (BOTDA) systems. It exploits the virtual transparency created by second-order Raman pumping in optical fibers. The idea is theoretically analyzed and experimentally demonstrated in a 50 km fiber. By working close to transparency, we also show that the measurement length of the BOTDA can be increased up to 100 km with 2 meter resolution. We envisage extensions of this technique to measurement lengths well beyond this value, as long as the issue of relative intensity noise (RIN) of the primary Raman pump can be avoided. © 2010 Optical Society of America.
Abstract:
Extra-care housing has been an important and growing element of housing and care for older people in the United Kingdom since the 1990s. Previous studies have examined specific features and programmes within extra-care locations, but few have studied how residents negotiate social life and identity. Those that have, have noted that while extra care brings many health-related and social benefits, extra-care communities can also be difficult affective terrain. Given that many residents are now ‘ageing in place’ in extra care, it is timely to revisit these questions of identity and affect. Here we draw on the qualitative element of a three-year, mixed-method study of 14 extra-care villages and schemes run by the ExtraCare Charitable Trust. We follow Alemàn in regarding residents' ambivalent accounts of life in ExtraCare as important windows on the way in which liminal residents negotiate the dialectics of dependence and independence. However, we suggest that the dialectic of interest here is that of the third and fourth age, as described by Gilleard and Higgs. We set that dialectic within a post-structuralist/Lacanian framework in order to examine the different modes of enjoyment that liminal residents procure in ExtraCare's third age public spaces and ideals, and suggest that their complaints can be read in three ways: as statements about altered material conditions; as inter-subjective bolstering of group identity; and as fantasmatic support for liminal identities. Finally, we examine the implications that this latter psycho-social reading of residents' complaints has for enhancing and supporting residents' wellbeing.
Abstract:
This research investigates the feasibility of using fourth generation evaluation during the process of instruction. A semester-length course entitled "Multicultural Communications" (PUR 5406/4934) was designed and used in this study, in response to the communications profession's need to produce well-trained, culturally sensitive practitioners for the work force and the market place. A revised pause model, consisting of three one-on-one in-depth interviews conducted outside of class, three reflection periods during class, and a self-reflective essay prepared one week before the end of the course, was analyzed. Narrative and graphic summaries of participant responses produced significant results. The revised pause model was found to be an effective evaluation method for use in multicultural education under certain conditions, as perceived by the participants in the study. Participant self-perceived behavior change and knowledge acquisition were identified through use of the revised pause model. Study results suggest that the revised pause model of evaluation offers instructors teaching multicultural education in schools of journalism and mass communication yet another way of enhancing their ability to become both the researcher and the research subject. In addition, the introduction of a qualitative model was found to be a more useful way of generating participant involvement and introspection. Finally, the instructional design of the course used in the study provides communication educators with a practical way of preparing their students to be effective communicators in a multicultural world.
Abstract:
This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel to users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his/her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method for providing said compensation. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, as well as by using a single-lens high resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems and the data collected from these experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method resulted in a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected for the human subject tests. Further analysis suggests that even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g. pupil diameter) and continuous adjustment in the pre-compensation process to yield maximum viewing enhancement.
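The core idea of pre-compensation can be illustrated by modelling the aberrated eye as convolution with a known point-spread function (PSF) and pre-filtering the displayed image with a regularized inverse. The sketch below is a minimal illustration of that idea only: the Gaussian PSF, the regularization constant k, and the clipping to display range are assumptions, and the dissertation's actual method (wavefront-derived aberrations, spatial information integration, physiological factors) is considerably richer.

```python
import numpy as np

def precompensate(image, psf, k=1e-2):
    """Wiener-style pre-compensation: pre-filter `image` so that, after
    blurring by `psf` (a stand-in for the eye's aberration), the perceived
    image approximates the original. k regularizes the inverse filter."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + k)            # regularized inverse
    pre = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(pre, 0.0, 1.0)                     # display-range limit

# Toy example: a Gaussian PSF as a crude stand-in for defocus.
x = np.linspace(-1.0, 1.0, 64)
X, Y = np.meshgrid(x, x)
psf = np.exp(-(X**2 + Y**2) / 0.02)
psf /= psf.sum()
image = np.zeros((64, 64))
image[16:48, 30:34] = 1.0                             # a bright bar target
pre = precompensate(image, psf)
# What the aberrated eye would "see" when shown the pre-compensated image:
H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
perceived = np.real(np.fft.ifft2(np.fft.fft2(pre) * H))
```

The clipping step hints at one of the display-device shortcomings the dissertation addresses: the pre-compensated image can demand intensities outside the panel's dynamic range.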
Abstract:
Recent technological developments have made it possible to design various microdevices in which fluid flow and heat transfer are involved. For the proper design of such systems, the governing physics needs to be investigated. Because of the difficulty of studying complex geometries at micro scales using experimental techniques, computational tools are developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods based on the Navier-Stokes equations fail to predict some aspects of microflows, such as the nonlinear pressure distribution, increased mass flow rate, slip flow, and temperature jump at the solid boundaries. This necessitates the development of new computational methods, grounded in kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM is based on the Boltzmann equation, which is valid over the whole range of rarefaction regimes observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers higher than 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow were found to be in good agreement with the analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10) for pressure distribution and velocity field. The isothermal LBM was then extended to simulate flows including heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate solutions obtained from analytical equations and the finite element method. Finally, the capability of the thermal LBM was improved by adding the effect of rarefaction, and the method was used to analyze the behavior of gas flow in microchannels. The major finding of this research is that the newly developed particle-based method described here can be used as an alternative numerical tool to study non-continuum effects observed in micro-electro-mechanical systems (MEMS).
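As a point of reference for the method named above, the sketch below is a minimal continuum, isothermal D2Q9 BGK lattice Boltzmann solver for body-force-driven (Poiseuille-like) channel flow. The grid size, relaxation time, and forcing are arbitrary demo values; the slip-flow boundary treatment and thermal extensions that form the dissertation's contribution are not included.

```python
import numpy as np

# D2Q9 lattice: discrete velocities, weights, and opposite directions.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

def feq(rho, ux, uy):
    """Second-order equilibrium distribution."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    return rho * w[:, None, None] * (1.0 + cu + 0.5 * cu**2
                                     - 1.5 * (ux**2 + uy**2))

ny, nx, tau, g = 21, 50, 0.8, 1e-6   # grid, BGK relaxation time, body force
f = feq(np.ones((ny, nx)), np.zeros((ny, nx)), np.zeros((ny, nx)))

for _ in range(10000):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision with a simple velocity-shift forcing, fluid rows only.
    fe = feq(rho, ux + tau * g / rho, uy)
    f[:, 1:-1, :] -= (f[:, 1:-1, :] - fe[:, 1:-1, :]) / tau
    # Streaming (periodic wrap; the wall rows below enforce no-slip).
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=1), c[i, 1], axis=0)
    # Full-way bounce-back on the solid wall rows y = 0 and y = ny - 1.
    f[:, 0, :] = f[opp][:, 0, :]
    f[:, -1, :] = f[opp][:, -1, :]

nu = (tau - 0.5) / 3.0               # lattice kinematic viscosity
print("max u_x:", ux[1:-1].max(), "~ analytic:", g * (ny - 2)**2 / (8 * nu))
```

At steady state the velocity profile approaches the parabolic Poiseuille solution, which is the standard sanity check before adding rarefaction effects.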
Abstract:
This dissertation introduces a new system for handwritten text recognition based on an improved neural network design. Most existing neural networks treat the mean square error function as the standard error function. The system proposed in this dissertation utilizes the mean quartic error function, whose third and fourth derivatives are non-zero. Consequently, many improvements in the training methods were achieved. The training results are carefully assessed before and after the update. To evaluate the performance of a training system, three essential factors must be considered, listed here from high to low priority: (1) the error rate on the testing set, (2) the processing time needed to recognize a segmented character, and (3) the total training time and, subsequently, the total testing time. It is observed that bounded training methods accelerate the training process, while semi-third order training methods, next-minimal training methods, and preprocessing operations reduce the error rate on the testing set. Empirical observations suggest that two combinations of training methods are needed for recognizing characters of different cases. Since character segmentation is required for word and sentence recognition, this dissertation also provides an effective rule-based segmentation method, which differs from the conventional adaptive segmentation methods. Dictionary-based correction is utilized to correct mistakes resulting from the recognition and segmentation phases. The integration of the segmentation methods with the handwritten character recognition algorithm yielded an accuracy of 92% for lower case characters and 97% for upper case characters. In the testing phase, the database consists of 20,000 handwritten characters, with 10,000 for each case. Recognizing the 10,000 handwritten characters in the testing phase required 8.5 seconds of processing time.
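The difference between the standard and the proposed error function can be made concrete in a few lines: for residuals e, the mean square error gives loss mean(e²) with a gradient proportional to e, while the mean quartic error gives mean(e⁴) with a gradient proportional to e³, so large residuals dominate the weight updates far more strongly. The sketch below (with illustrative residual values, not the dissertation's data) shows exactly that contrast.

```python
import numpy as np

def mse(err):
    """Mean square error and its gradient with respect to the residuals."""
    return np.mean(err**2), 2.0 * err / err.size

def mqe(err):
    """Mean quartic error: its 3rd and 4th derivatives are non-zero, and
    the gradient grows with err**3, emphasizing large residuals."""
    return np.mean(err**4), 4.0 * err**3 / err.size

err = np.array([0.1, 0.5, 2.0])          # residuals y_pred - y_target
for name, fn in [("MSE", mse), ("MQE", mqe)]:
    loss, grad = fn(err)
    print(f"{name}: loss={loss:.4f}, grad={np.round(grad, 4)}")
# Under MQE the residual of 2.0 contributes 4*8/3 ~ 10.7 to the gradient,
# versus 2*2/3 ~ 1.3 under MSE -- the property the training methods exploit.
```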
Abstract:
This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model used in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model this problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of this study focuses on developing an effective solution approach to the large-scale version of this problem. The proposed solution approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules are identified for each of the three subproblems. Using computer simulation as the tool, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most-profit rule performs best. The shifting bottleneck and the earliest operation finish time are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high; the proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio. The proposed heuristic is applied to an industrial case to further evaluate its performance. The result shows that it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective approach to solving it at industry scale.
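To make the mathematical-programming phase concrete, the sketch below states a toy order-acceptance plus capacitated lot-sizing MILP using the open-source PuLP modeler. The order data, setup-cost structure, and constraint set are illustrative assumptions, not the dissertation's formulation (which was solved with CPLEX).

```python
# Toy MILP: which orders to accept, and how to lot-size them by due date.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum

orders = {  # order: (profit, size in machine-hours, due period)
    "A": (500, 30, 2), "B": (800, 45, 3), "C": (300, 20, 1),
}
T, cap, setup_cost = range(1, 4), 40, 50   # periods, capacity per period

prob = LpProblem("order_selection_lot_sizing", LpMaximize)
y = {j: LpVariable(f"accept_{j}", cat="Binary") for j in orders}
x = {(j, t): LpVariable(f"setup_{j}_{t}", cat="Binary")
     for j in orders for t in T}
q = {(j, t): LpVariable(f"lot_{j}_{t}", lowBound=0)
     for j in orders for t in T}

# Objective: profit of accepted orders minus setup costs.
prob += lpSum(orders[j][0] * y[j] for j in orders) \
      - lpSum(setup_cost * x[j, t] for j in orders for t in T)
for j, (profit, size, due) in orders.items():
    # An accepted order must be fully produced by its due date.
    prob += lpSum(q[j, t] for t in T if t <= due) == size * y[j]
    for t in T:
        prob += q[j, t] <= cap * x[j, t]   # produce only after a setup
        if t > due:
            prob += q[j, t] == 0           # nothing after the due date
for t in T:
    prob += lpSum(q[j, t] for j in orders) <= cap   # machine capacity

prob.solve()
print({j: int(y[j].value()) for j in orders})
```

Even this toy version shows why industrial instances explode: binaries multiply with orders and periods, which is what motivates the heuristic decomposition of the second phase.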
Abstract:
There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices, and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation, and attribute weighting involved in their computation. The integrated model developed is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach developed in this dissertation was automated through an IT artifact that was designed, developed, and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured in per capita income terms, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and our multi-method rankings, than higher-income country groups.
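The multi-method simulation idea can be sketched compactly: cross several normalization methods, aggregation rules, and weighting vectors; recompute the ranking under each scenario; and measure how far each unit's rank moves. The code below is a small illustration on random data with eight scenarios; the artifact itself was built in Visual Basic for Excel 2007 and evaluated 115 scenarios on real Doing Business indicators.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((8, 3))                    # 8 "countries" x 3 indicators

def minmax(X):
    return (X - X.min(0)) / (X.max(0) - X.min(0))

def zscore(X):
    return (X - X.mean(0)) / X.std(0)

def arithmetic(Z, w):
    return Z @ w

def geometric(Z, w):                      # needs strictly positive inputs
    Zp = Z - Z.min(0) + 1e-6
    return np.exp(np.log(Zp) @ w)

weights = [np.ones(3) / 3, np.array([0.5, 0.3, 0.2])]
ranks = []
for norm in (minmax, zscore):
    for agg in (arithmetic, geometric):
        for w in weights:
            score = agg(norm(X), w)
            ranks.append(score.argsort()[::-1].argsort())  # 0 = best

ranks = np.array(ranks)                   # scenarios x units
spread = ranks.max(0) - ranks.min(0)      # rank sensitivity per unit
print("rank spread across scenarios:", spread)
```

Units with a large spread are exactly the cases the dissertation flags as sensitive to methodological choices.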
Abstract:
Background: Sucralose has gained popularity as a low-calorie artificial sweetener worldwide. Due to its high stability and persistence, sucralose shows widespread occurrence in environmental waters, at concentrations that can reach up to several μg/L. Previous studies have used time-consuming sample preparation methods (offline solid phase extraction/derivatization) or methods with rather high detection limits (direct injection) for sucralose analysis. This study describes a faster, more sensitive analytical method for the determination of sucralose in environmental samples. Results: An online SPE-LC–MS/MS method was developed, capable of quantifying sucralose in 12 minutes using only 10 mL of sample, with method detection limits (MDLs) of 4.5 ng/L, 8.5 ng/L, and 45 ng/L for deionized water, drinking water, and reclaimed water (diluted 1:10 with deionized water), respectively. Sucralose was detected in 82% of the reclaimed water samples at concentrations reaching up to 18 μg/L. The monthly average over a period of one year was 9.1 ± 2.9 μg/L. Based on the concentrations detected in U.S. wastewaters, the calculated per-capita mass load of sucralose discharged through WWTP effluents is 5.0 mg/day/person. As expected, the concentrations observed in drinking water were much lower but still relevant, reaching as high as 465 ng/L. To evaluate the stability of sucralose, photodegradation experiments were performed in natural waters. Significant photodegradation of sucralose was observed only in freshwater at 254 nm. Minimal degradation (<20%) was observed for all matrices under more natural conditions (350 nm or solar simulator). The only photolysis product of sucralose identified by high resolution mass spectrometry was a de-chlorinated molecule at m/z 362.0535, with molecular formula C12H20Cl2O8. Conclusions: The online SPE-LC-APCI-MS/MS method developed in this study was applied to more than 100 environmental samples. Sucralose was frequently detected (>80%), indicating that the conventional treatment process employed in sewage treatment plants is not efficient at removing it. The detection of sucralose in drinking waters suggests potential contamination of surface and ground water sources by anthropogenic wastewater streams. Its high resistance to photodegradation, minimal sorption, and high solubility indicate that sucralose could be a good tracer of anthropogenic wastewater intrusion into the environment.
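As a back-of-the-envelope check of the reported per-capita load, multiplying the study's average reclaimed-water concentration by an assumed per-capita effluent flow reproduces the 5.0 mg/day/person figure. The flow value below is a round assumption chosen for illustration, not necessarily the one used in the study.

```python
# Per-capita mass load ~ effluent concentration x per-capita effluent flow.
avg_conc_ug_per_L = 9.1        # monthly average in reclaimed water (study)
flow_L_per_person_day = 550.0  # assumed per-capita wastewater flow
load_mg = avg_conc_ug_per_L * flow_L_per_person_day / 1000.0
print(f"~{load_mg:.1f} mg/day/person")   # ~5.0, matching the reported load
```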
Abstract:
This study explored the topic of motivation for intermediate students, combining an objective criterion measure (i.e., standardized test scores) with students' self-reports of self-concept and value of reading. The purpose of this study was to examine how third grade reading achievement correlated with the motivation of fourth grade boys and girls, and, in turn, how motivation related to fourth grade reading achievement. The participants were fourth grade students (n=207) attending two public elementary schools in Miami-Dade County who were primarily of Hispanic origin or descent. Data were collected using the Reading Survey portion of the Motivation to Read Profile (1996), which measures self-concept and value of reading in order to measure motivation, and the Third and Fourth Grade Reading Florida Comprehensive Assessment Tests 2.0 (FCAT 2.0) to assess achievement. First, a one-way analysis of variance (ANOVA) was conducted to determine whether motivation differed significantly between fourth grade boys and girls. Second, a path analysis was used to determine whether motivation mediated or moderated the association between third and fourth grade FCAT 2.0 scores. Results of the ANOVA indicated that motivation, as measured by the Motivation to Read Profile, did not differ significantly by sex. Results from the path analysis indicated that the model was significant and that third grade FCAT 2.0 scores accounted for a significant amount of the variance in fourth grade FCAT 2.0 scores once motivation was entered. The results demonstrated that motivation partially mediates, but does not moderate, the relationship between third and fourth grade FCAT 2.0 scores. In conclusion, past achievement plays a role in fourth grade students' current achievement when motivation is also considered. To improve fourth grade students' current performance, it is therefore important to take into account both their motivation and their past achievement. An effort must be made to address students' motivational needs, whether through school-wide programs or at the classroom level, in addition to or in conjunction with cognition. Future research on the effect of self-concept on reading achievement is recommended.
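The mediation test used here can be sketched with ordinary regressions via the product-of-coefficients logic: regress the mediator (motivation) on the predictor (third-grade scores), regress the outcome (fourth-grade scores) on both, and compare the indirect path a*b with the direct path. The code below runs this on synthetic data with assumed effect sizes, purely to show the mechanics; it is not the study's path model or data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 207
x = rng.normal(size=n)                      # third-grade achievement
m = 0.4 * x + rng.normal(size=n)            # motivation, partly driven by x
y = 0.6 * x + 0.3 * m + rng.normal(size=n)  # fourth-grade achievement

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]         # path x -> m
fit = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
c_prime, b = fit.params[1], fit.params[2]                 # direct, m -> y
print(f"indirect effect a*b = {a * b:.3f}, direct effect = {c_prime:.3f}")
# Both paths non-zero = partial mediation, mirroring the study's finding
# that motivation partially mediates, but does not fully explain, the link.
```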
Abstract:
The goal of this project was to develop a rapid separation and detection method for analyzing organic compounds in smokeless powders and then test its applicability on gunshot residue (GSR) samples. In this project, a total of 20 common smokeless powder additives and their decomposition products were separated by ultra performance liquid chromatography (UPLC) and confirmed by tandem mass spectrometry (MS/MS) using multiple reaction monitoring mode (MRM). Some of the targeted compounds included diphenylamines, centralites, nitrotoluenes, nitroglycerin, and various phthalates. The compounds were ionized in the MS source using simultaneous positive and negative electrospray ionization (ESI) with negative atmospheric pressure chemical ionization (APCI) in order to detect all compounds in a single analysis. The developed UPLC/MS/MS method was applied to commercially available smokeless powders and gunshot residue samples recovered from the hands of shooters, spent cartridges, and smokeless powder retrieved from unfired cartridges. Distinct compositions were identified for smokeless powders from different manufacturers and from separate manufacturing lots. The procedure also produced specific chemical profiles when tested on gunshot residues from different manufacturers. Overall, this thesis represents the development of a rapid and reproducible procedure capable of simultaneously detecting the widest possible range of components present in organic gunshot residue.
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and its ability to uniquely identify and distinguish individuals with a high degree of certainty. But this demand has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. Additional processing to separate different cell types, needed to simplify the final data interpretation, further adds to the existing cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also done to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual interference, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.