Abstract:
Using inert gas condensation techniques, the properties of sputtered neodymium-iron-boron clusters were investigated. A DC magnetron sputtering source created Nd-Fe-B vapor, which was then condensed into clusters and deposited onto silicon substrates. A composite target of Nd-Fe-B discs on an iron plate and a composite target of Nd-(Fe-Co)-B were used to create the clusters. The clusters were coated with a carbon layer by RF sputtering to prevent oxidation. Samples examined in the TEM showed a size distribution with an average particle diameter of 8.11 nm. The as-deposited clusters were amorphous, as indicated by diffuse diffraction patterns obtained through SAD. EDS showed a direct compositional correlation in the ratio of rare-earth to transition metals between the target and the deposited samples. The as-deposited clusters were superparamagnetic at high temperatures and ferromagnetic at low temperatures; these properties are characteristic of rare-earth transition-metal amorphous clusters. Samples annealed in an inert gas atmosphere at 600 °C for increasing amounts of time showed an initial increase in coercivity, but no further increase with additional annealing time. SAD of annealed cluster samples showed the presence of Nd2Fe17 and a bcc-Nd phase; the bcc-Nd results from oxidation at the high temperatures reached during annealing and from surface interface energy. The annealed samples showed weak coercivity and a saturation magnetization equivalent to that of Nd2Fe17, with a slight increase in coercivity at low temperatures. These results indicate a loss of boron during the sputtering process.
Abstract:
The elimination of all external incisions is an important step in reducing the invasiveness of surgical procedures. Natural Orifice Translumenal Endoscopic Surgery (NOTES) is an incision-less surgery that provides clear benefits such as reducing patient trauma and shortening recovery time. However, technological difficulties impede the widespread adoption of the NOTES method. A novel robotic tool has been developed that makes NOTES procedures feasible by using multiple interchangeable tool tips. The robotic tool can enter the body cavity through an orifice or a single incision using a flexible articulated positioning mechanism, and once inserted it is not constrained by incisions, allowing for visualization and manipulation throughout the cavity. The interchangeable tool tips of the robotic device initially consist of three end effectors: a grasper, scissors, and an atraumatic Babcock clamp. The tool changer can select and switch between the three tools depending on the surgical task, using a miniature mechanism driven by micro-motors. The robotic tool is remotely controlled through a joystick and computer interface. In this thesis, the following aspects of this robotic tool are detailed. The first-generation robot is designed as a conceptual model for implementing a novel mechanism for switching, advancing, and controlling the tool tips using two micro-motors. This mechanism is believed to reduce cumbersome instrument exchanges, and with them overall procedure time and the risk of inadvertent tissue trauma during exchanges with a natural orifice approach. Placing actuators directly at the surgical site also enables the robot to generate sufficient force to operate effectively. Mounting the multifunctional robot on the distal end of an articulating tube frees the robot kinematics from restriction and helps address some of the difficulties otherwise faced during surgery using NOTES or related approaches. The second-generation multifunctional robot is then introduced, in which the overall size is reduced and two arms provide two additional degrees of freedom, making insertion through the esophagus feasible and increasing dexterity. Improvements are needed in future iterations of the multifunctional robot; however, the work presented is a proof of concept for NOTES robots capable of abdominal surgical interventions.
Abstract:
Blast traumatic brain injury (BTBI) has become an important topic of study because of the increase in such incidents, especially due to the recent proliferation of improvised explosive devices (IEDs). This thesis discusses a project in which laboratory testing of BTBI was made possible by performing blast loading on experimental models simulating the human head. Three versions of experimental models were prepared: one with a simple geometry and two with geometry similar to a human head. In developing the head models, three important parts of the head were considered for material modeling and analysis: the skin, the skull, and the brain. The materials simulating skin, skull, and brain underwent many testing procedures, including dynamic mechanical analysis (DMA). To find a suitable brain simulant, several materials were tested under low and high frequencies. Step-response analysis, rheometry, and DMA tests were performed on materials such as water-based gels, oil-based mixtures, and silicone gels cured at different temperatures. The gelatins and silicone gels showed promising results for use as brain surrogate materials. Temperature degradation tests showed that gelatins degrade quickly at room temperature; silicone gels were much more stable than the water-based gels. The silicone gels were further processed with a thinner-type additive gel to bring their dynamic modulus values closer to those of human brain matter. The values obtained from DMA were compared to values for the human brain found in the literature. A silicone rubber brain mold was then prepared to give the brain model accurate geometry, and all the components were assembled into the complete head model. A steel mount was prepared to attach the head for testing at the end of the shock tube. Instrumentation was implemented in the head model to obtain results useful for understanding the possible mechanisms of BTBI. The final head model was named the Realistic Explosive Dummy Head, or "RED Head." The RED Head offers potential for realistic experimental testing under blast loading conditions by virtue of its material properties and geometrical accuracy.
Abstract:
Evaluations of measurement invariance provide essential construct validity evidence. However, the quality of such evidence is partly dependent upon the validity of the resulting statistical conclusions. The presence of Type I or Type II errors can render measurement invariance conclusions meaningless. The purpose of this study was to determine the effects of categorization and censoring on the behavior of the chi-square/likelihood ratio test statistic and two alternative fit indices (CFI and RMSEA) in the context of evaluating measurement invariance. Monte Carlo simulation was used to examine Type I error and power rates for (a) the overall test statistic/fit indices, and (b) the change in test statistic/fit indices. Data were generated according to a multiple-group, single-factor CFA model across 40 conditions that varied by sample size, strength of item factor loadings, and categorization thresholds. Seven different combinations of model estimators (ML, Yuan-Bentler scaled ML, and WLSMV) and specified measurement scales (continuous, censored, and categorical) were used to analyze each of the simulation conditions. As hypothesized, non-normality increased Type I error rates for the continuous scale of measurement and did not affect error rates for the categorical scale of measurement. Maximum likelihood estimation combined with a categorical scale of measurement resulted in more correct statistical conclusions than the other analysis combinations. For the continuous and censored scales of measurement, the Yuan-Bentler scaled ML resulted in more correct conclusions than normal-theory ML. The censored measurement scale did not offer any advantages over the continuous measurement scale. Comparing across fit statistics and indices, the chi-square-based test statistics were preferred over the alternative fit indices, and ΔRMSEA was preferred over ΔCFI. Results from this study should be used to inform the modeling decisions of applied researchers. However, no single analysis combination can be recommended for all situations. Therefore, it is essential that researchers consider the context and purpose of their analyses.
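The categorization and censoring manipulations at the core of this design can be sketched in a few lines. The following is a minimal illustration (not the study's code), assuming a single-factor model with a common loading, unit item variances, and arbitrary asymmetric thresholds:

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_items(n, n_items=6, loading=0.7, thresholds=(-0.5, 0.5, 1.5)):
    """Simulate single-factor continuous item responses, then categorize/censor."""
    factor = rng.standard_normal(n)                    # latent factor scores
    err_sd = np.sqrt(1.0 - loading ** 2)               # keeps item variance at 1
    cont = loading * factor[:, None] + err_sd * rng.standard_normal((n, n_items))
    ordinal = np.digitize(cont, thresholds)            # categorization at thresholds
    censored = np.maximum(cont, thresholds[0])         # floor (left) censoring
    return cont, ordinal, censored

cont, ordi, cens = generate_items(n=500)
# Asymmetric thresholds yield the skewed, non-normal observed items
print("category proportions:", np.bincount(ordi.ravel()) / ordi.size)
```

Fitting the resulting datasets under ML, robust ML, or WLSMV would then require an SEM package (e.g., lavaan in R); the sketch covers only the data-generation step described above.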
Abstract:
Trade liberalization policies in Guatemala have impacted agricultural production. This thesis examines how trade liberalization came about, what its impacts have been at the national level, and how one community has adapted to the implementation of these policies. The implementation of trade policy was influenced by several international and national institutions. Among the international institutions are the World Bank, the World Trade Organization and the United States Agency for International Development. At the national level, the institutions that have partaken in shaping trade policy are the military and the owners of capital and labor. The implementation of trade policies at the national level has affected national corn prices and population-level diets, and to some extent has reduced poverty levels. At the local level, trade liberalization policies have affected land holdings, intensified agriculture (including agrochemical use, machinery, and the number of crop plantings per year), and altered corn consumption rates. The benefits can be maximized and the detrimental effects minimized by implementing policies that promote food security, improve access to health and education, and prevent the environmental and human-health consequences of agricultural intensification, while continuing the production of non-traditional agricultural products.
Abstract:
Stone Age research in Northern Europe frequently makes gross generalizations about the Mesolithic and Neolithic, although we still lack much basic knowledge of how the people lived. The transition from the Mesolithic to the Neolithic in Europe has been described as a radical shift from an economy dominated by marine resources to one solely dependent on farming. Both the occurrence and the geographical extent of such a drastic shift can be questioned, however. It is therefore important to start out at a more detailed level of evidence in order to present the overall picture, and to account for the variability even in such regional or chronological overviews. Fifteen Stone Age sites were included in this study, ranging chronologically from the Early Mesolithic to the Middle or Late Neolithic, c. 8300–2500 BC, and stretching geographically from the westernmost coast of Sweden to the easternmost part of Latvia within the confines of latitudes 55–59° N. The most prominent sites in terms of the number of human and faunal samples analysed are Zvejnieki, Västerbjers and Skateholm I–II. Human and faunal skeletal remains were subjected to stable carbon and nitrogen isotope analysis to study diet and ecology at the sites. Stable isotope analyses of human remains provide quantitative information on the relative importance of various food sources, an important addition to the qualitative data supplied by certain artefacts and structures or by faunal or botanical remains. A large number of new radiocarbon dates were also obtained. In conclusion, a rich diversity in Stone Age dietary practice in the Baltic Region was demonstrated. Evidence ranging from the Early Mesolithic to the Late Neolithic shows that neither chronology nor location alone can account for this variety; there are inevitably cultural factors as well. Food habits are culturally governed, and therefore we cannot automatically assume that people at similar sites will have the same diet. Stable isotope studies are very important here, since they tell us what people actually consumed, not only what was available, or what one single meal contained. We should be wary of inferring diet from ritually deposited remains, since things that were mentally important were not always important in daily life. Thus, although a ritual and symbolic norm may emphasize certain food categories, these may in fact contribute very little to the diet. Progress in the analysis of intra-individual variation has produced new data on life-history changes, revealing mobility patterns, breastfeeding behaviour and certain dietary transitions. The inclusion of faunal data has proved invaluable for understanding the stable isotope ecology of a site, thereby improving the precision of the interpretations of human stable isotope data. The special case of dogs, though, demonstrates that these animals are not useful for inferring human diet: owing to the number of roles they occupy in human society, dogs can deviate significantly from humans in their diet, and in several cases have been shown to do so. When evaluating radiocarbon data derived from human and animal remains from the Pitted-Ware site of Västerbjers on Gotland, the importance of establishing the stable isotope ecology of the site before drawing conclusions about reservoir effects was further demonstrated. The main aim of this thesis has been to demonstrate the variation and diversity in human practices, challenging the view of a "monolithic" Stone Age.
By looking at individuals and not only at populations, this study accounts for the whole range of human behaviour, also revealing discrepancies between norm and practice that are frequently visible both in the archaeological record and in present-day human behaviour.
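The quantitative step behind such dietary estimates is commonly a linear mixing model between isotopic endpoints. The sketch below is illustrative only, assuming round-number collagen δ¹³C endpoints of −21‰ for a fully terrestrial diet and −12‰ for a fully marine one; real endpoint values vary by region and must be established from the local fauna, as the thesis emphasizes.

```python
def marine_fraction(d13c, terrestrial_end=-21.0, marine_end=-12.0):
    """Two-endpoint linear mixing model: estimate the marine fraction of
    dietary protein from bone-collagen delta-13C (in per mille)."""
    frac = (d13c - terrestrial_end) / (marine_end - terrestrial_end)
    return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

# Example: a collagen value of -16.5 per mille falls midway between endpoints
print(f"estimated marine fraction: {marine_fraction(-16.5):.2f}")  # 0.50
```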
Abstract:
The overall aim of the present thesis was to develop and characterise an age assessment method based on incremental lines in dental cementum, using contemporary bovine teeth and teeth from archaeological faunal assemblages. The investigations also included two other age assessment methods: tooth wear pattern and macroscopic dental measurements. The first permanent mandibular molar and lower jaws from 70 contemporary cattle of known age and 170 archaeological molar sets from ten different Swedish archaeological sites were used. The following conclusions were drawn:
• The number of incremental lines in the dental cementum varied between different parts of the tooth root as well as within one and the same individual. The results from contemporary cattle of known age showed a strong relationship between the number of incremental lines in the cementum of the distal part of the mesial root and the known ages of the animals (R²=65.5%).
• With the "best" model, 65.5% (R²) of the variation in age could be explained by the number of incremental lines. The remaining age variation (approximately 35%) could not be explained by these lines, so other factors must be responsible. However, with the exception of calves, the present material did not reveal any such significant relationship.
• The results from cattle of known age indicate that assessing age on the basis of cemental incremental lines is more reliable than other methods such as tooth wear or tooth measurements. However, by combining the count of incremental lines with one tooth-dimension variable (tooth height), a slightly stronger relationship could be obtained (R²=74.5%).
The results from age assessment of the medieval and post-Reformation cattle emphasize the importance of supplementing any age estimation of archaeological assemblages based on dental indicators with characteristics for the particular assessment model. Furthermore, conclusions based on age assessment with such models cannot be drawn on any more detailed time scale than about 2 years, leaving at best 25% (R²) of the factors influencing the dental indicator(s) utilized in the model unexplained. The accuracy of age assessment required by the particular historical context in which the archaeological remains are found should thus decide what level of accuracy should be chosen.
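The reported R² values correspond to ordinary least-squares fits of age on the dental indicators. Here is a minimal sketch on synthetic stand-in data (the coefficients are arbitrary, not the thesis material), comparing a lines-only model with a lines-plus-tooth-height model:

```python
import numpy as np

rng = np.random.default_rng(1)

def r_squared(X, y):
    """Ordinary least squares with intercept; returns R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Synthetic stand-ins: age driven mostly by line count, partly by tooth height
n = 70
lines = rng.integers(1, 12, n).astype(float)   # incremental line counts
height = rng.normal(30.0, 4.0, n)              # tooth height, mm
age = 10.0 + 1.0 * lines - 0.2 * height + rng.normal(0.0, 2.0, n)

print(f"lines only:     R^2 = {r_squared(lines[:, None], age):.3f}")
print(f"lines + height: R^2 = {r_squared(np.column_stack([lines, height]), age):.3f}")
```

With these arbitrary coefficients, the lines-only fit explains most of the age variance and adding tooth height raises R² modestly, mirroring the pattern (65.5% versus 74.5%) reported above.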
Abstract:
Post-Soviet countries are in the process of transformation from a totalitarian order to a democratic one, a transformation which is impossible without a profound shift in people's way of thinking. The group set themselves the task of determining the essence of this shift. Using a multidisciplinary approach, they looked at concrete ways of overcoming the totalitarian mentality and forming that necessary for an open democratic society. They studied the contemporary conceptions of tolerance and critical thinking and looked for new foundations of criticism, especially in hermeneutics. They then sought to substantiate the complementary relation between tolerance and criticism in the democratic way of thinking and to prepare a syllabus for teaching on the subject in Ukrainian higher education. In a philosophical exploration of tolerance they began with religious tolerance as its first and most important form. Political and social interests often lay at the foundations of religious intolerance, and this implicitly comprised the transition to religious tolerance when conditions changed. Early polytheism was more or less indifferent to dogmatic deviations, but monotheism is intolerant of heresies. The damage wrought by the religious wars of the Reformation transformed tolerance into a value: the wars did not create religious tolerance but forced its recognition as a positive phenomenon. With the weakening of religious institutions in the modern era, the purely political nature of many conflicts became evident, and this stimulated the extrapolation of tolerance into secular life. Each historical era has certain acts and operations which may be interpreted as tolerant, and these can be classified according to whether or not they are based on the conscious following of the principle of tolerance. This criterion requires the separation of the phenomenon of tolerance from its concept and from tolerance as a value. Only the conjunction of a concept of tolerance with a recognition of its value can transform it into a principle dictating a norm of conscious behaviour. The analysis of the contemporary conception of tolerance focused on the diversity of the concept and concluded that the notions used cannot be combined in the framework of a single more or less simple classification, as the distinctions between them are driven by the complexity of the reality considered and the variety of its manifestations. Notions considered in relation to tolerance included pluralism, respect and the particular-universal. The rationale of tolerance was also investigated, and the group felt that any substantiation of the principle of tolerance must take into account human beings' desire for knowledge. Before respecting or being tolerant of another person different from myself, I should first know where the difference lies, so knowledge is a necessary condition of tolerance. The traditional division of truth into scientific (objective and unique) and religious, moral or political (subjective and so multiple) intensifies the problem of the relationship between truth and tolerance. Science was long seen as a field of "natural" intolerance, whereas the validity of tolerance was accepted in other intellectual fields. As tolerance emerges when there is difference and opposition, it is essentially linked with rivalry, and there is a growing recognition today that unlimited rivalry is neither able to direct the process of development nor to act as a creative force.
Social and economic reality has led to rivalry being regulated by the state, and a natural requirement of this is to associate tolerance with a special "purified" form of rivalry: an acceptance of the activity of different subjects and a specification of the norms of their competition. Tolerance and rivalry should therefore be subordinate to a degree of discipline, and the group point out that discipline, including self-discipline, is a regulator of the balance between them. Two problematic aspects of tolerance were identified: why something traditionally supposed to have no positive content has become a human activity today, and whether tolerance has full-scale cultural significance. The resolution of these questions requires a revision of the phenomenon and conception of tolerance to clarify its immanent positive content. This involved an investigation of the contemporary concept of tolerance and of the epistemological foundations of a negative solution of tolerance in Greek thought. An original solution to the problem of the extrapolation of tolerance to scientific knowledge was proposed, based on the Duhem-Quine thesis and the conception of background knowledge. In this way tolerance as a principle of mutual relations between different scientific positions gains an essential epistemological rationale, and so an important argument for its own universal status. The group then went on to consider the ontological foundations for a positive solution of this problem, beginning with the work of Poincaré and Reichenbach. The next aspect considered was the conceptual foundations of critical thinking, looking at the ideas of Karl Popper and St. Augustine and at the problem of the demarcation line between reasonable criticism and apologetic reasoning. Dogmatic and critical thinking in a political context were also considered, before an investigation of critical thinking's foundations. As logic is essential to critical thinking, the state of this discipline in Ukrainian and Russian higher education was assessed, together with the limits of formal-logical grounds for criticism, the role of informal logic as a basis for critical thinking today, dialectical logic as a foundation for critical thinking, and the universality of the contemporary demand for criticism. The search for new foundations of critical thinking covered deconstructivism and critical hermeneutics, including the problem of the author. The relationship between tolerance and criticism was traced from the ancient world, both Eastern and Greek, through the transitional community of the Renaissance to the industrial community (Locke and Mill), and on to the evolution of this relationship today, when these are viewed not as moral virtues but as ordinary norms. Tolerance and criticism were discussed as complementary manifestations of human freedom. If the completeness of freedom were accepted, it would be impossible to avoid recognition of the natural and legal nature of these manifestations, and the group argue that critical tolerance is able to avoid dismissing such negative phenomena as the degradation of taste and manners, pornography, etc. On the basis of their work, the group drew up the syllabus of a course in "Logic with Elements of Critical Thinking" and of a special course on the "Problem of Tolerance".
Abstract:
Thermo-responsive materials have been of interest for many years and have been studied mostly as thermally stimulated drug delivery vehicles. Recently, acrylates and methacrylates with pendant ethylene glycol methyl ethers have been studied as thermo-responsive materials. This work explores the thermo-responsive properties of hybrid nanoparticles of one of these methacrylates (DEGMA), and of a block copolymer with one of the acrylates (OEGA), with gold nanoparticle cores of different sizes. We were interested in the effects of gold core size, the number and type of end groups that anchored the chains to the gold cores, and the location of bonding sites on the thermo-response of the polymer. To control the number and location of anchoring groups, we used a type of controlled radical polymerization called Reversible Addition-Fragmentation chain Transfer (RAFT) polymerization. Polymer on smaller gold cores did not show the thermo-responsive behavior of the polymer, but the gold cores did seem to self-assemble. Polymer anchored to larger gold cores did show thermo-responsivity. The anchoring end group did not alter the thermo-responsivity, but thiol-modified polymers stabilized gold cores less well than chains anchored by dithioester groups, allowing the gold cores to grow larger. Use of multiple bonding groups stabilized the gold core. Using block copolymers, we tested the effects of the number of thiol groups and the distance between them. We observed that the use of multiple anchoring groups on the block copolymer with a sufficiently large gold core did not prevent the thermo-responsive behavior of the polymer from being detected, which allows a new type of thermo-responsive hybrid nanoparticle to be used and studied for new applications.
Abstract:
Bioplastics are polymers (such as polyesters) produced from bacterial fermentations that are biodegradable and nonhazardous. They are produced by a wide variety of bacteria and are made only under stress conditions, such as when nutrient levels, more specifically nitrogen and oxygen, are low. These stress conditions cause certain bacteria to build up excess carbon deposits as energy reserves in the form of polyhydroxyalkanoates (PHAs). PHAs can be extracted and formed into actual plastic with the same strength as conventional, synthetic-based plastics, without the need to rely on foreign petroleum. The overall goal of this project was to select a bacterium that could grow on the sugars found in lignocellulosic biomass and induce it to produce PHAs and peptidoglycan. Once this was accomplished, the goal was to extract the PHAs and peptidoglycan in order to make a stronger, more rigid plastic by combining them into a co-polymer. The individual goals of this project were to: (1) select and screen bacteria capable of producing PHAs by utilizing the carbon/energy sources found in lignocellulosic biomass; (2) maximize the utilization of the sugars present in woody biomass in order to produce optimal levels of PHAs; and (3) use room-temperature ionic liquids (RTILs) to separate the cell membrane and peptidoglycan, allowing for better extraction of PHAs and more intact peptidoglycan. B. megaterium, a Gram-positive PHA-producing bacterium, was selected for study in this project. It was grown on a variety of different substrates in order to maximize both its growth and its production of PHAs. The optimal conditions were found to be 30 °C, pH 6.0, and a sugar concentration of 30 g/L of either glucose or xylose. After optimal growth was obtained, both RTILs and enzymatic treatments were used to break the cell wall in order to extract the PHAs and peptidoglycan. PHAs and peptidoglycan were successfully extracted from the cells and will be used in the future to create a new, stronger co-polymer. The peptidoglycan recovery yield was 16% of the cells' dry weight.
Abstract:
Large parts of the world are subjected to one or more natural hazards, such as earthquakes, tsunamis, landslides, tropical storms (hurricanes, cyclones and typhoons), coastal inundation and flooding. Virtually the entire world is at risk of man-made hazards. In recent decades, rapid population growth and economic development in hazard-prone areas have greatly increased the potential of multiple hazards to cause damage and destruction of buildings, bridges, power plants, and other infrastructure, thus posing a grave danger to communities and disrupting economic and societal activities. Although an individual hazard is significant in many parts of the United States (U.S.), in certain areas more than one hazard may pose a threat to the constructed environment. In such areas, structural design and construction practices should address multiple hazards in an integrated manner to achieve structural performance that is consistent with owner expectations and general societal objectives. The growing interest in and importance of multiple-hazard engineering has been recognized recently. This has spurred the evolution of multiple-hazard risk-assessment frameworks and the development of design approaches, which have paved the way for future research toward sustainable construction of new and improved structures and retrofitting of existing structures. This report provides a review of the literature and the current state of practice for assessment, design and mitigation of the impact of multiple hazards on structural infrastructure. It also presents an overview of future research needs related to the multiple-hazard performance of constructed facilities.
Abstract:
For the past sixty years, waveguide slot radiator arrays have played a critical role in microwave radar and communication systems. They feature a well-characterized antenna element capable of direct integration into a low-loss feed structure, with highly developed and inexpensive manufacturing processes. Waveguide slot radiators comprise some of the highest-performance antenna arrays ever constructed, in terms of side-lobe level, efficiency, and similar metrics. A wealth of information is available in the open literature regarding design procedures for linearly polarized waveguide slots. By contrast, despite their presence in some of the earliest published reports, little has been presented to date on array designs for circularly polarized (CP) waveguide slots. Moreover, what has been presented features a classic traveling-wave, efficiency-reducing beam tilt. This work proposes a unique CP waveguide slot architecture which mitigates these problems, together with a thorough design procedure employing widely available, modern computational tools. The proposed array topology features simultaneous dual-CP operation with grating-lobe-free broadside radiation, high aperture efficiency, and good return loss. A traditional X-slot CP element is employed, with the inclusion of a slow-wave-structure passive phase shifter to ensure broadside radiation without the need for performance-limiting dielectric loading. It is anticipated that this technology will be advantageous for upcoming polarimetric radar and Ka-band SatCom systems. The presented design methodology represents a philosophical shift away from traditional waveguide slot radiator design practices. Rather than providing design curves and/or analytical expressions for equivalent circuit models, simple first-order design rules, generated via parametric studies, are presented with the understanding that device optimization and design will be carried out computationally. A unit-cell, S-parameter-based approach provides a sufficient reduction of complexity to permit efficient, accurate device design with attention to realistic, application-specific mechanical tolerances. A transparent, start-to-finish example of the design procedure for a linear sub-array at X-band is presented. Both unit-cell and array performance are calculated via finite element method simulations. Results are confirmed by good agreement with finite-difference time-domain calculations. Array performance exhibiting grating-lobe-free, broadside-scanned, dual-CP radiation with better than 20 dB return loss and over 75% aperture efficiency is presented.
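The broadside-versus-beam-tilt distinction drawn above can be illustrated with a textbook array-factor calculation. This is a sketch under assumed values (element count, spacing, frequency, and phase slope are placeholders, not the thesis design):

```python
import numpy as np

c = 3e8
f = 10e9                       # X-band example frequency (assumed)
lam = c / f
k = 2 * np.pi / lam
N, d = 16, 0.6 * lam           # element count and spacing (assumed)

theta = np.linspace(-np.pi / 2, np.pi / 2, 2001)

def array_factor(beta):
    """|AF| of an N-element uniform linear array with progressive phase beta."""
    n = np.arange(N)[:, None]
    return np.abs(np.sum(np.exp(1j * n * (k * d * np.sin(theta) + beta)), axis=0))

# Zero progressive phase -> broadside peak; nonzero phase slope (as in a
# classic traveling-wave feed) -> the main beam tilts off broadside.
for beta, label in [(0.0, "broadside"), (-0.3 * np.pi, "traveling wave")]:
    af = array_factor(beta)
    peak = np.degrees(theta[np.argmax(af)])
    print(f"{label:>14}: main beam at {peak:+.1f} deg")
```

With d = 0.6λ the grating-lobe condition |sin θ| = λ/d has no real solution, so the pattern stays grating-lobe free; keeping the progressive phase near zero, which is the job of the slow-wave phase shifter in the proposed architecture, keeps the main beam at broadside.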
Abstract:
In this dissertation, the problem of creating effective control algorithms for large-scale Adaptive Optics (AO) systems for the new generation of giant optical telescopes is addressed. The effectiveness of AO control algorithms is evaluated in several respects, such as computational complexity, compensation error rejection, and robustness, i.e. reasonable insensitivity to system imperfections. The results of this research are summarized as follows:
1. Robustness study of the Sparse Minimum Variance Pseudo Open Loop Controller (POLC) for multi-conjugate adaptive optics (MCAO). An AO system model that accounts for various system errors has been developed and applied to check the stability and performance of the POLC algorithm, one of the most promising approaches for future AO systems control. Numerous simulations have shown that, despite the initial assumption that exact system knowledge is necessary for the POLC algorithm to work, it is highly robust against various system errors.
2. Predictive Kalman Filter (KF) and Minimum Variance (MV) control algorithms for MCAO. The limiting performance of the non-dynamic Minimum Variance and dynamic KF-based phase estimation algorithms for MCAO has been evaluated through Monte Carlo simulations. The validity of a simple near-Markov autoregressive phase dynamics model has been tested, and its ability to predict the turbulence phase has been demonstrated for both single-conjugate and multi-conjugate AO. It has also been shown that, in the case of MCAO, the more complicated KF approach yields no performance improvement over the much simpler MV algorithm.
3. Sparse predictive Minimum Variance control algorithm for MCAO. A temporal prediction stage has been added to the non-dynamic MV control algorithm in such a way that no additional computational burden is introduced. Simulations have confirmed that the use of phase prediction makes it possible to significantly reduce the system sampling rate, and thus the overall computational complexity, while both keeping the system stable and effectively compensating for the measurement and control latencies.
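The contrast between the static MV estimator and the dynamic KF in item 2 can be reduced to a scalar toy problem. The sketch below assumes a near-Markov AR(1) phase model; the coefficient and noise levels are illustrative, not fitted turbulence values:

```python
import numpy as np

rng = np.random.default_rng(2)

a, q, r = 0.99, 0.02, 0.1      # AR(1) coefficient, process and measurement noise
T = 5000

# Simulate turbulent phase (AR(1)) and noisy wavefront-sensor measurements
phase = np.zeros(T)
for t in range(1, T):
    phase[t] = a * phase[t - 1] + np.sqrt(q) * rng.standard_normal()
meas = phase + np.sqrt(r) * rng.standard_normal(T)

# Static minimum-variance estimate from the current measurement alone
sig2 = q / (1 - a**2)                     # stationary phase variance
mv_est = (sig2 / (sig2 + r)) * meas

# Scalar Kalman filter exploiting the AR(1) temporal dynamics
kf_est = np.zeros(T)
x, P = 0.0, sig2
for t in range(T):
    x, P = a * x, a**2 * P + q                 # predict
    K = P / (P + r)                            # Kalman gain
    x, P = x + K * (meas[t] - x), (1 - K) * P  # update
    kf_est[t] = x

for name, est in [("static MV", mv_est), ("Kalman", kf_est)]:
    print(f"{name:>10}: residual variance = {np.mean((est - phase)**2):.4f}")
```

In this scalar toy problem the KF beats the static estimate because it exploits temporal correlation; the dissertation's finding is that for MCAO this gain largely vanishes relative to the simpler MV algorithm, which the sketch makes no attempt to reproduce.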
Abstract:
All students in the United States of America are required to take science. But what if there is not one science, but in fact a number of sciences? Could every culture, perhaps every different grouping of people, create its own science? This report describes a preliminary survey, the goal of which is to improve the teaching of science at the American Indian Opportunities and Industrialization Center in Minneapolis, Minnesota by beginning to understand the differences between Western and American Indian sciences.