978 results for Must -- Analysis
Abstract:
In the 1980s, government agencies sought to utilize research on drug use prevention to design media campaigns. Enlisting the assistance of the national media, several campaigns were designed and initiated to bring anti-drug use messages to adolescents in the form of public service advertising. This research explores the sources of information selected by adolescents in grades 7 through 12, and how the selection of media and other sources of information relates to drug use behavior, attitudes, and perceptions related to risk/harm and disapproval of friends' drug-using activities. Data collected from 1989 to 1992 in the Miami Coalition School Survey provided a random selection of secondary school students. The responses of these students were analyzed using multivariate statistical techniques. Although many of the students selected media as the source of most of their information on the effects of drugs on the people who use them, the selection of media was found to be positively related to alcohol use and negatively related to marijuana use. The selection of friends, brothers, or sisters was a statistically significant source for adolescents who smoke cigarettes or use alcohol or marijuana. The results indicate that the anti-drug use messages received by students may be canceled out by media messages perceived to advocate substance use, and that a more persuasive source of information for adolescents may be friends and siblings. As federal reports suggest that the economic costs of drug abuse will reach an estimated $150 billion by 1997 if current trends continue, prevention policy that addresses the glamorization of substance use remains a national priority. Additionally, programs that advocate prevention within the peer cluster must be supported, as peers are an influential source for both inspiring and possibly preventing drug use behavior.
Abstract:
This study explored the topic of motivation for intermediate students, combining an objective criterion measure (i.e., standardized test scores) with students' self-reports of self-concept and value of reading. The purpose of this study was to examine how third grade reading achievement correlated with the motivation of fourth grade boys and girls and, in turn, how motivation related to fourth grade reading achievement. The participants were fourth grade students (n=207) attending two public elementary schools in Miami-Dade County who were of primarily Hispanic origin or descent. Data were collected using the Reading Survey portion of the Motivation to Read Profile (1996), which measures self-concept and value of reading in order to measure motivation, and the Third and Fourth Grade Reading Florida Comprehensive Assessment Tests 2.0 (FCAT 2.0) to assess achievement. First, a one-way Analysis of Variance (ANOVA) was conducted to determine whether motivation differed significantly between fourth grade boys and girls. Second, a path analysis was used to determine whether motivation mediated or moderated the association between FCAT 2.0 third and fourth grade scores. Results of the ANOVA indicated that motivation, as measured by the Motivation to Read Profile, did not differ significantly by sex. Results from the path analysis indicated that the model was significant and that third grade FCAT 2.0 scores accounted for a significant amount of the variance in fourth grade FCAT 2.0 scores once motivation was entered. Results of the study demonstrated that motivation partially mediates, but does not moderate, the relationship between FCAT 2.0 third and fourth grade scores. In conclusion, past student achievement for fourth grade students plays a role in current student achievement when motivation is also considered.
It is therefore important, in order to improve fourth grade students' current performance, to take into account both a student's motivation and past achievement. An effort must be made to address students' motivational needs, whether through school-wide programs or at the classroom level, in addition to or in conjunction with cognition. Future research on the effect of self-concept on reading achievement is recommended.
Abstract:
Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally-adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system. This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce. The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs. Hence, the maximum leakage can be bounded. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns.
It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
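Because the systems are deterministic, the maximum leakage is log2 of the number of feasible outputs, so the two-bit patterns reduce the problem to counting. A minimal sketch of that counting step, assuming only equal/unequal constraints between pairs of output bits (the function name and the union-find representation are illustrative, not the dissertation's implementation):

```python
# Hedged sketch: bound the number of feasible outputs from pairwise
# equal/unequal bit constraints, using a parity-tracking union-find.

def count_output_bound(n_bits, constraints):
    """constraints: list of (i, j, equal) meaning bit i == bit j if equal
    is True, else bit i != bit j. Returns an upper bound on the number of
    distinct outputs, or 0 if the constraints are contradictory."""
    parent = list(range(n_bits))
    parity = [0] * n_bits  # parity[x]: bit x's value XOR its parent's value

    def find(x):
        # Path-compressing find that also accumulates parity to the root.
        if parent[x] == x:
            return x, 0
        root, p = find(parent[x])
        parity[x] ^= p
        parent[x] = root
        return root, parity[x]

    components = n_bits
    for i, j, equal in constraints:
        want = 0 if equal else 1       # required XOR of the two bits
        ri, pi = find(i)
        rj, pj = find(j)
        if ri == rj:
            if pi ^ pj != want:
                return 0               # contradictory pattern set
        else:
            parent[ri] = rj
            parity[ri] = pi ^ pj ^ want
            components -= 1
    return 2 ** components             # one free choice per component
```

If the patterns partition the n output bits into k independent components, at most 2^k outputs are feasible, which bounds the maximum leakage by k bits.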
Abstract:
Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants, and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method which utilized ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants. Post-burn and post-blast residues of these propellants were analyzed. It was determined that the ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants, as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions, and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues.
Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD), and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
Abstract:
Unequal improvements in processor and I/O speeds have caused many applications, such as databases and operating systems, to become increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors was proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
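The two drawbacks of traditional mirroring named above, the write penalty and the failure-mode load imbalance, can be illustrated with a toy model; the class and method names here are invented for illustration and are not from the thesis:

```python
# Illustrative sketch of traditional disk mirroring semantics (not the
# thesis's simulator): writes must update both replicas, reads are
# load-balanced round-robin, and a single disk failure is transparent.

class MirroredDisk:
    def __init__(self):
        self.disks = [dict(), dict()]   # two physical replicas: block -> data
        self.alive = [True, True]
        self.next_read = 0              # round-robin read balancing

    def write(self, block, data):
        # The write penalty: every surviving replica must be updated.
        for replica, ok in zip(self.disks, self.alive):
            if ok:
                replica[block] = data

    def read(self, block):
        # Try the disks round-robin; a failed disk is skipped, so after a
        # failure the surviving disk services all requests (poor balance).
        for _ in range(2):
            i = self.next_read
            self.next_read = (self.next_read + 1) % 2
            if self.alive[i] and block in self.disks[i]:
                return self.disks[i][block]
        raise IOError("data unavailable")

    def fail(self, i):
        self.alive[i] = False           # enter failure-mode operation
```

Distorted mirrors and interleaved declustering, the schemes the thesis compares, refine exactly these two code paths: the former cheapens `write`, the latter spreads the failure-mode `read` load.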
Abstract:
One of the major problems in the analysis of beams with Moment of Inertia varying along their length is to find the Fixed End Moments, Stiffness, and Carry-Over Factors. In order to determine Fixed End Moments, it is necessary to consider the non-prismatic member as integrated by a large number of small sections with constant Moment of Inertia, and to find the M/EI values for each individual section. This process takes a great deal of time from designers and structural engineers. The object of this thesis is to design a computer program to simplify this repetitive process, obtaining the Final Moments and Shears in continuous non-prismatic beams rapidly and effectively. For this purpose the Column Analogy and the Moment Distribution Methods of Professor Hardy Cross have been utilized as the principles behind the methodical computer solutions. The program has been specifically designed to analyze continuous beams of a maximum of four spans of any length, integrated by symmetrical members with rectangular cross sections and with rectilinear variation of the Moment of Inertia. Any load or combination of uniform and concentrated loads can be considered. Finally, sample problems are solved with the new computer program and with traditional methods, to determine the accuracy and applicability of the program.
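The discretization described above, treating the member as many short constant-I sections and working with M/EI values, can be sketched numerically via the column analogy. This is a hedged illustration (assumed function names, a single fixed-fixed span rather than the program's four-span continuous beams, simplified sign conventions), not the thesis's program; it recovers the classic wL^2/12 fixed-end moment for the prismatic case:

```python
# Column-analogy sketch: the member becomes an "analogous column" whose
# width at x is 1/EI(x); the simple-beam moment diagram m(x) is the load.
# Fixed-end moments are the column stresses P/A + M*c/I at the two ends.

def fixed_end_moments(L, EI, m, n=1000):
    """L: span length; EI(x): flexural rigidity function; m(x): simple-beam
    (statically determinate) moment diagram for the applied load.
    Returns the fixed-end moments at the left and right ends."""
    dx = L / n
    xs = [(i + 0.5) * dx for i in range(n)]        # midpoints of sections
    w = [dx / EI(x) for x in xs]                   # section widths dx/EI
    A = sum(w)                                     # analogous column area
    xbar = sum(x * wi for x, wi in zip(xs, w)) / A # centroid
    I_col = sum((x - xbar) ** 2 * wi for x, wi in zip(xs, w))
    P = sum(m(x) * wi for x, wi in zip(xs, w))     # total M/EI "load"
    M_col = sum(m(x) * wi * (x - xbar) for x, wi in zip(xs, w))
    M_left = P / A + M_col * (0 - xbar) / I_col
    M_right = P / A + M_col * (L - xbar) / I_col
    return M_left, M_right
```

For L = 1, EI = 1, and a uniform load w = 12 (simple-beam moment m(x) = 6x(1-x)), both fixed-end moments come out near wL^2/12 = 1, and a varying EI(x) can be substituted directly.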
Abstract:
For primates, and other arboreal mammals, adopting suspensory locomotion represents one of the strategies an animal can use to prevent toppling off a thin support during arboreal movement and foraging. While numerous studies have reported the incidence of suspensory locomotion in a broad phylogenetic sample of mammals, little research has explored what mechanical transitions must occur in order for an animal to successfully adopt suspensory locomotion. Additionally, many primate species are capable of adopting a highly specialized form of suspensory locomotion referred to as arm-swinging, but few scenarios have been posited to explain how arm-swinging initially evolved. This study takes a comparative experimental approach to explore the mechanics of below branch quadrupedal locomotion in primates and other mammals, to determine whether above and below branch quadrupedal locomotion represent neuromuscular mirrors of each other, and whether the patterns of below branch quadrupedal locomotion are similar across taxa. Also, this study explores whether the nature of the flexible coupling between the forelimb and hindlimb observed in primates is a uniquely primate feature, and investigates the possibility that this mechanism could be responsible for the evolution of arm-swinging.
To address these research goals, kinetic, kinematic, and spatiotemporal gait variables were collected from five species of primate (Cebus capucinus, Daubentonia madagascariensis, Lemur catta, Propithecus coquereli, and Varecia variegata) walking quadrupedally above and below branches. Data from these primate species were compared to data collected from three species of non-primate mammals (Choloepus didactylus, Pteropus vampyrus, and Desmodus rotundus) and to three species of arm-swinging primate (Hylobates moloch, Ateles fusciceps, and Pygathrix nemaeus) to determine how varying forms of suspensory locomotion relate to each other and across taxa.
From the data collected in this study it is evident that the specialized gait characteristics present during above branch quadrupedal locomotion in primates are not observed when walking below branches. Instead, gait mechanics closely replicate the characteristic walking patterns of non-primate mammals, with the exception that primates demonstrate an altered limb loading pattern during below branch quadrupedal locomotion, in which the forelimb becomes the primary propulsive and weight-bearing limb; a pattern similar to what is observed during arm-swinging. It is likely that below branch quadrupedal locomotion represents a “mechanical release” from the challenges of moving on top of thin arboreal supports. Additionally, it is possible that arm-swinging could have evolved from an anatomically-generalized arboreal primate that began to forage and locomote below branches. During these suspensory bouts, weight would have been shifted away from the hindlimbs towards the forelimbs, and as the frequency of these bouts increased, the reliance on the forelimb as the sole form of weight support would have also increased. This form of functional decoupling may have released the hindlimbs from their weight-bearing role during suspensory locomotion, and eventually arm-swinging would have replaced below branch quadrupedal locomotion as the primary mode of suspensory locomotion observed in some primate species. This study provides the first experimental evidence supporting the hypothetical link between below branch quadrupedal locomotion and arm-swinging in primates.
Abstract:
This thesis looks at how non-experts develop an opinion on climate change, and how those opinions could be changed by public discourse. I use Hubert Dreyfus’ account of skill acquisition to distinguish between experts and non-experts. I then use a combination of Walter Fisher’s narrative paradigm and the hermeneutics of Paul Ricœur to explore how non-experts form opinions, and how public narratives can provide a point of critique. For public narratives to be robust, they must be financially realistic. I therefore consider the burgeoning field of environmental, social, and corporate governance (ESG) analysis as a way of informing realistic public narratives. I identify a potential problem with this approach: the Western assumptions of ESG analysis might make for public narratives that are not convincing to a non-Western audience. I then demonstrate how elements of the Chinese tradition, the Confucian, Neo-Confucian, and Daoist schools, as presented by David Hall and Roger Ames, can provide alternative assumptions to ESG analysis so that the public narratives will be more culturally adaptable. This research contributes to the discipline by bringing disparate traditions together in a unique way, into a practical project with a view towards applications. I conclude by considering avenues for further research.
Abstract:
Dynamically typed programming languages such as JavaScript and Python defer type checking until run time. To optimize the performance of these languages, virtual machine implementations for dynamic languages must try to eliminate redundant dynamic type tests. This is usually done using a type inference analysis. However, such analyses are often costly and involve trade-offs between compilation time and the precision of the results obtained. This has led to the design of increasingly complex VM architectures. We propose lazy basic block versioning, a simple just-in-time compilation technique that effectively eliminates redundant dynamic type tests on critical execution paths. This new approach lazily generates specialized versions of basic blocks while propagating contextualized type information. Our technique does not require costly program analyses, is not constrained by the precision limitations of traditional type inference analyses, and avoids the complexity of speculative optimization techniques. Three extensions are made to basic block versioning to give it interprocedural optimization capabilities. A first extension gives it the ability to attach type information to object properties and global variables. Entry point specialization then allows it to pass type information from calling functions to called functions. Finally, call continuation specialization allows the types of the called functions' return values to be passed back to callers at no dynamic cost.
We demonstrate empirically that these extensions allow basic block versioning to eliminate more dynamic type tests than any static type inference analysis.
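The core idea, lazily generating and caching block versions keyed on an incoming type context so that type tests disappear on paths where the type is already known, can be caricatured in a few lines of Python; the names and the "block generator" interface are illustrative, not the thesis's VM:

```python
# Hedged sketch of lazy basic block versioning: a version of each block,
# specialized on the caller's type context, is generated on first entry
# and cached for reuse.

versions = {}  # (block_id, frozen type context) -> specialized code

def get_version(block_id, ctx, generate):
    """ctx: dict mapping variable names to known types (may be empty)."""
    key = (block_id, frozenset(ctx.items()))
    if key not in versions:            # lazily generate on first entry
        versions[key] = generate(ctx)
    return versions[key]

def gen_add_block(ctx):
    """Toy block generator for 'x + 1': the dynamic type test is emitted
    only when the incoming context does not already pin down x's type."""
    if ctx.get("x") == "int":
        return lambda x: x + 1         # redundant type test eliminated
    def guarded(x):
        if not isinstance(x, int):     # dynamic type test remains
            raise TypeError("x must be int")
        return x + 1
    return guarded
```

On the specialized path the test vanishes entirely; an unknown context falls back to the guarded version, mirroring how versioning specializes only the paths actually executed.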
Abstract:
Two water samples and two sediment samples taken in 1965 by the R. V. "Meteor" in the area of the hot salt brine of the Atlantis II-Deep were chemically investigated, and in addition the sediment samples were subjected to X-ray and optical analysis. The investigation of the sulfur isotope ratios showed the same values for all water samples. This information, combined with the Ca-sulfate solubility data, leads us to conclude that, for the most part, the sulfate content of the salt brine resulted from mixing along the boundary with the normal seawater. In this boundary area gypsum or anhydrite is formed, which sinks down to the deeper layers of the salt brine, where it is redissolved when the water becomes undersaturated. In the laboratory, formation of a CaSO4 precipitate resulted both from reheating the water sample from the uppermost zone of the salt brine to the in-situ temperature and from mixing the water sample with normal Red Sea water. The iron and manganese delivered by the hot spring are separated within the area of the salt brine by their different redox potentials. Iron is sedimented in large amounts within the salt brine, while, as evidenced by its small amounts in all sediment samples, the more easily reducible manganese is apparently carried out of the area before sedimentation can take place. The very good layering of the salt brine may be the result of the rough bottom topography, with its several progressively higher levels allowing step-like enlargements of the surface areas of each successive layer. Each enlargement results in larger boundary areas along which more effective heat transfer and mixing with the next layer are possible. In the sediment samples up to 37.18% Fe is found, mostly bound as very poorly crystallized iron hydroxide. Pyrite is present in only very small amounts. We assume that the copper is bound mostly as sulfide, while the zinc is most likely present in another form.
The sulfur isotope investigations indicate that the sulfur in the sediment, bound as pyrite and sulfides, is not a result of bacterial sulfate reduction in the iron-rich mud of the Atlantis II-Deep, but must have been brought up with the hot brine.
Abstract:
The hydrologic system beneath the Antarctic Ice Sheet is thought to influence both the dynamics and distribution of fast flowing ice streams, which discharge most of the ice lost by the ice sheet. Despite considerable interest in understanding this subglacial network and its effect on ice flow, in situ observations from the ice sheet bed are exceedingly rare. Here we describe the first sediment cores recovered from an active subglacial lake. The lake, known as Subglacial Lake Whillans, is part of a broader, dynamic hydrologic network beneath the Whillans Ice Stream in West Antarctica. Even though "floods" pass through the lake, the lake floor shows no evidence of erosion or deposition by flowing water. By inference, these floods must have insufficient energy to erode or transport significant volumes of sediment coarser than silt. Consequently, water flow beneath the region is probably incapable of incising continuous channels into the bed and instead follows preexisting subglacial topography and surface slope. Sediment on the lake floor consists of till deposited during intermittent grounding of the ice stream following flood events. The fabrics within the till are weaker than those thought to develop in thick deforming beds, suggesting subglacial sediment fluxes across the ice plain are currently low and unlikely to have a large stabilizing effect on the ice stream's grounding zone.
Abstract:
For the SNO+ neutrinoless double beta decay search, various backgrounds, ranging from impurities present naturally to those produced cosmogenically, must be understood and reduced. Cosmogenic backgrounds are particularly difficult to reduce as they are continually regenerated while exposed to high energy cosmic rays. To reduce these cosmogenics as much as possible, the tellurium used for the neutrinoless double beta decay search will be purified underground. An analysis of the purification factors achievable for insoluble cosmogenic impurities found a reduction factor of >20.4 at 50% C.L. During the purification process the tellurium will come into contact with ultra-pure water and nitric acid. These liquids both carry some cosmogenic impurities with them that could potentially be transferred to the tellurium. A conservative limit is set at <18 events in the SNO+ region of interest (ROI) per year as a result of contaminants from these liquids. In addition to cosmogenics brought underground, muons can produce radioactive isotopes while the tellurium is stored underground. A study on the rate at which muons produce these backgrounds finds an additional 1 event per year. In order to load the tellurium into the detector, it will be combined with 1,2-butanediol to form an organometallic complex. The complex was found to have minimal effect on the SNO+ acrylic vessel for 154 years.
Abstract:
Economic policy-making has long been more integrated than social policy-making, in part because the statistics and much of the analysis that supports economic policy are based on a common conceptual framework – the system of national accounts. People interested in economic analysis and economic policy share a common language of communication, one that includes both concepts and numbers. This paper examines early attempts to develop a system of social statistics that would mirror the system of national accounts, particularly the work on the development of social accounts that took place mainly in the 1960s and 1970s. It explores the reasons why these early initiatives failed, but argues that the preconditions now exist to develop a new conceptual framework to support integrated social statistics – and hence a more coherent, effective social policy. Optimism is warranted for two reasons. First, we can make use of the radical transformation that has taken place in information technology, both in processing data and in providing wide access to the knowledge that can flow from the data. Second, the conditions exist to begin to shift away from the straitjacket of government-centric social statistics, with its implicit assumption that governments must be the primary actors in finding solutions to social problems. By supporting the decision-making of all the players (particularly individual citizens) who affect social trends and outcomes, we can start to move beyond the sterile, ideological discussions that have dominated much social discourse in the past and begin to build social systems and structures that evolve, almost automatically, based on empirical evidence of ‘what works best for whom’. The paper describes a Canadian approach to developing a framework, or common language, to support the evolution of an integrated, citizen-centric system of social statistics and social analysis. This language supports the traditional social policy that we have today; nothing is lost.
However, it also supports a quite different social policy world, one where individual citizens and families (not governments) are seen as the central players – a more empirically-driven world that we have referred to as the ‘enabling society’.
Abstract:
This project is about Fast and Female, a community-based girls’ sport organization that focuses on empowering girls through sport. In this thesis I produce a discourse analysis from interviews with six expert sportswomen and a textual analysis of the organization’s online content – including its social media pages. I ground my analysis in poststructural theory as explained by Chris Weedon (1997) and in literature that helps contextualize and better define empowerment (Collins, 2000; Cruikshank, 1999; Hains, 2012; Sharma, 2008; Simon, 1994) and neoliberalism (Silk & Andrews, 2012). My analysis in this project suggests that Fast and Female develops a community through online and in-person interaction. This community is focused on girls’ sport and empowerment, but, as the organization is situated in a neoliberal context, organizers must take extra care in order for the organization to develop a girls’ sport culture that is truly representative of the desires and needs of the participants rather than implicit neoliberal values. It is important to note that Fast and Female does not identify as a feminist organization. Through this thesis I argue that Fast and Female teaches girls that sport is empowering – but, while the organization draws on “empowerment,” a term often used by feminists, it promotes a notion of empowerment that teaches female athletes how to exist within current mainstream and sporting cultures, rather than encouraging them to be empowered female citizens who learn to question and challenge social inequity. I conclude my thesis with suggestions for Fast and Female to encourage empowerment in spite of the current neoliberal situation. I also offer a goal-setting workbook that I developed to encourage girls to set goals while thinking about their communities rather than just themselves.
Abstract:
During the last twenty years (1995-2015), the world of commerce has expanded beyond the traditional brick-and-mortar high street to a global shop front accessible to billions of users via the World Wide Web (WWW). Consumers are now using the web to immerse themselves in virtual shop fronts, using Social Media (SM) to communicate and share product ideas with friends and family. Retail organisations recognise the need to develop and adapt their strategies to respond to the increasing use of SM. New goals must be set in order to identify how companies will integrate social media into current practices. This research aims to suggest an advisable and comprehensive SM strategy for companies operating in the global retail sector, based on an exploratory analysis of three multi-national retail organisations' existing SM strategies. This will be assessed in conjunction with a broader investigation into social media in the retail industry. From this, a strategy will be devised to improve internal and external communication, as well as knowledge management, through the use of social media. Findings suggest that the use of SM within the retail industry has dramatically improved collaboration and communication processes, as organisations are now able to converse better with stakeholders; the tools are also relatively simple to integrate and implement, as they complement one another.