Abstract:
The high gains in performance predicted for optical immersion are difficult to achieve in practice due to total internal reflection at the lens/detector interface. By reducing the air gap at this interface, optical tunneling becomes possible and the predicted gains can be realized in practical devices. Using this technique we have demonstrated large performance gains by optically immersing mid-infrared heterostructure InAlSb LEDs and photodiodes using hyperspherical germanium lenses. The development of an effective method of optical immersion that gives excellent optical coupling has produced a photodiode with a peak room-temperature detectivity (D*) of 5.3 × 10^9 cm Hz^1/2 W^-1 at λ_peak = 5.4 μm and a 40° field of view. A hyperspherically immersed LED showed an f-fold improvement in external efficiency and a 3-fold improvement in directionality compared with a conventional planar LED for f/2 optical systems. The incorporation of these uncooled devices in a White cell produced an NO2 gas-sensing system with 2 parts-per-million sensitivity, with an LED drive current of <5 mA. These results represent a significant advance in the use of solid-state devices for portable gas-sensing systems.
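For reference, the detectivity figure quoted above follows the standard definition of specific detectivity (a textbook relation, not stated in the abstract itself), which normalizes the noise-equivalent power (NEP) by detector area and measurement bandwidth:

```latex
D^{*} = \frac{\sqrt{A_d \,\Delta f}}{\mathrm{NEP}}
\qquad \left[\mathrm{cm\,Hz^{1/2}\,W^{-1}}\right]
```

Here A_d is the detector area and Δf the noise bandwidth; the cm Hz^1/2 W^-1 units of the D* value above follow directly from this normalization.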
Abstract:
Complex Event Processing (CEP) has emerged over the last ten years. CEP systems excel at processing large amounts of data and responding in a timely fashion. While CEP applications are growing rapidly, performance management in this area has not gained much attention. Meeting the promised level of service is critical for both system designers and users. In this paper, we present a benchmark for complex event processing systems: CEPBen. The CEPBen benchmark is designed to evaluate CEP functional behaviours, i.e., filtering, transformation, and event pattern detection, and provides a novel methodology for evaluating the performance of CEP systems. A performance study running CEPBen on the Esper CEP engine is described and discussed. The results obtained from the performance tests demonstrate the influence of CEP functional behaviours on system performance. © 2014 Springer International Publishing Switzerland.
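The CEPBen internals are not given in the abstract; as a minimal, engine-agnostic sketch of the three functional behaviours it benchmarks (hypothetical event type and field names, not Esper's EPL), filtering, transformation, and pattern detection over a stream might look like:

```python
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float

def filter_events(events, symbol):
    # Filtering: pass through only events satisfying a predicate.
    return (e for e in events if e.symbol == symbol)

def transform(events):
    # Transformation: derive new attributes from each event.
    return ({"symbol": e.symbol, "cents": round(e.price * 100)} for e in events)

def detect_rising(events, n=3):
    # Pattern detection: fire whenever n consecutive prices strictly rise.
    window = []
    for e in events:
        window = (window + [e])[-n:]
        if len(window) == n and all(a["cents"] < b["cents"]
                                    for a, b in zip(window, window[1:])):
            yield window[-1]

stream = [Tick("IBM", p) for p in (10.0, 10.2, 10.1, 10.3, 10.5, 10.6)]
for hit in detect_rising(transform(filter_events(stream, "IBM"))):
    print(hit)  # fires on the 10.5 and 10.6 ticks
```

A real CEP engine evaluates such operators continuously over unbounded streams; the benchmark's concern is how each behaviour class loads the engine.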
Abstract:
In the teletraffic engineering of all telecommunication networks, parameters characterizing the terminal traffic are used. One of the most important of them is the probability of finding the called party (B-terminal) busy. This parameter has been studied in some of the earliest and the most recent papers in Teletraffic Theory. We propose a solution to this problem in the case of (virtual) channel systems, such as PSTN and GSM. We propose a detailed conceptual traffic model and, based on it, an analytical macro-state model of the system in stationary state, with: Bernoulli–Poisson–Pascal input flow; repeated calls; a limited number of homogeneous terminals; and losses due to abandoned and interrupted dialling, blocked and interrupted switching, unavailable intent terminals, blocked and abandoned ringing, and abandoned conversation. The approach proposed in this paper may help in the determination of many network traffic characteristics at the session level and in the performance evaluation of next generation mobile networks.
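The macro-state model itself is too involved for the abstract; as a minimal point of reference for channel-system blocking (a deliberate simplification that ignores the BPP flow, the repeated calls, and the loss stages listed above), the classic Erlang-B recursion gives the probability that all channels are busy in a pure loss system:

```python
def erlang_b(channels: int, offered_load: float) -> float:
    """Blocking probability of an M/M/c/c loss system (Erlang-B recursion).

    offered_load is in Erlangs; this baseline ignores retrials, unlike the
    repeated-calls model proposed in the paper.
    """
    b = 1.0  # with zero channels every call is blocked
    for c in range(1, channels + 1):
        b = (offered_load * b) / (c + offered_load * b)
    return b

# Example: 30 channels offered 25 Erlangs of terminal traffic.
print(round(erlang_b(30, 25.0), 4))
```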
Abstract:
AMS subject classification: 68Q22, 90C90
Abstract:
This study was an evaluation of a Field Project Model Curriculum and its impact on achievement, attitude toward science, attitude toward the environment, self-concept, and academic self-concept with at-risk eleventh and twelfth grade students. One hundred eight students were pretested and posttested on the Piers-Harris Children's Self-Concept Scale, PHCSC (1985); the Self-Concept as a Learner Scale, SCAL (1978); the Marine Science Test, MST (1987); the Science Attitude Inventory, SAI (1970); and the Environmental Attitude Scale, EAS (1972). Using a stratified random design, three groups of students were randomly assigned, according to sex and stanine level, to three treatment groups. Group one received the field project method, group two received the field study method, and group three received the field trip method. All three groups followed the marine biology course content as specified by Florida Student Performance Objectives and Frameworks. The intervention occurred over ten months, with each group participating in outside-of-classroom activities on a trimonthly basis. Analysis of covariance procedures were used to determine treatment effects. F-ratios, p-levels, and t-tests at p < .0062 (.05/8) indicated that a significant difference existed among the three treatment groups. Findings indicated that groups one and two were significantly different from group three, with group one displaying significantly higher results than group two. There were no significant differences between males and females in performance on the five dependent variables. The tenets underlying environmental education are congruent with the recommendations for the reform of science education. These include a value-analysis approach, inquiry methods, and critical-thinking strategies applied to environmental issues.
Abstract:
This study explored Taiwanese technological higher education administrators' perceptions about the motivation and capability of their institutions to form intercollegiate alliances, their preferred areas of collaboration, and their preferred partner attributes. Possible differences in the perceptions of administrators from public and private institutions were also explored. The study targeted six chief administrators in each of 88 technological and vocational higher education institutions in Taiwan. A mix of quantitative and qualitative research designs was used to collect and analyze data. Quantitative data were collected from 328 administrators through a questionnaire and analyzed using univariate and multivariate statistical techniques. In addition, to obtain a deeper understanding of the process of alliance formation, qualitative data were collected through interviews with 13 administrators and content analyzed using emergent themes analysis. Findings revealed that Taiwanese technological education administrators were not strongly confident in the competitive positions of their institutions. They perceived themselves as non-competitive in faculty research performance, in obtaining financial support, and in having easily accessible locations. Administrators believed that forming an alliance would help them obtain more external resources, achieve academic enhancement, provide better services, have a stronger voice, and obtain promotion to a higher institutional level. Cost cutting was not believed to be an attainable goal. Strong interest was expressed in an alliance for the sharing of technology, information networks, and library resources; cross-registration; admissions and recruitment practices; school-industry endeavors; and international academic exchanges. Sharing of administrators and staff, joint bidding and purchasing, and cooperative fundraising were considered of less interest. Administrators favored partners who have excellent academic programs, who have complementary skills, who are willing to share resources, and who are enthusiastic leaders. They also wanted partners to match their institutions in performance and prestige and to be geographically close to them. Multivariate analysis of variance did not reveal significant differences between the perceptions of administrators from public and private institutions. It was concluded that despite governmental encouragement and the institutions' eagerness to form an alliance, the administrators had little confidence that a sustainable alliance could be arranged.
Abstract:
The purpose of this study was to determine whether an experimental context-based delivery format for mathematics would be more effective than a traditional model for increasing the performance in mathematics of at-risk students in a public high school of choice, as evidenced by significant gains in achievement on the standards-based Mathematics subtest of the FCAT and final academic grades in Algebra I. The guiding rationale for this approach is captured in the Secretary's Commission on Achieving Necessary Skills (SCANS) report of 1992 that resulted in school-to-work initiatives (United States Department of Labor). Also, the charge for educational reform has been codified at the state level in the Educational Accountability Act of 1971 (Florida Statutes, 1995) and at the national level as embodied in the No Child Left Behind Act of 2001. A particular focus of educational reform is low-performing, at-risk students. This dissertation explored the effects of a context-based curricular reform designed to enhance Algebra I content, utilizing a research design consisting of two delivery models: a traditional content-based course and a thematically structured, content-based course. In this case, the thematic element was business education, as many advocates in career education assert that this format engages students who are often otherwise disinterested in mathematics in a relevant, SCANS-skills setting. The subjects in each supplementary course were ninth grade students who were both low performers in eighth grade mathematics and who had not passed the eighth grade administration of the standards-based FCAT Mathematics subtest. The sample size was limited to two groups of 25 students and two teachers. The site for this study was a public charter school. Student-generated performance data were analyzed using descriptive statistics. Results indicated that, contrary to the beliefs held by many, contextual presentation of content did not produce significant gains in either academic performance or test performance for those in the experimental treatment group. Further, results indicated that there was no meaningful difference in performance between the two groups.
Abstract:
Numerical optimization is a technique in which a computer is used to explore design-parameter combinations to find extremes in performance factors. In multi-objective optimization, several performance factors can be optimized simultaneously. The solution to a multi-objective optimization problem is not a single design but a family of optimized designs referred to as the Pareto frontier. The Pareto frontier is a trade-off curve in the objective-function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created for the purpose of solving multi-objective optimization problems by utilizing a set of constituent optimization algorithms. MOHO tracks the progress of the Pareto frontier approximation and automatically switches among its constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization on a 0.5-inch ballistic penetrator traveling at Mach 2.5. Two objectives were optimized simultaneously: minimizing aerodynamic drag and maximizing penetrator volume. This problem was solved twice. The first time, Modified Newton Impact Theory (MNIT) was used to determine the pressure drag on the penetrator. In the second solution, a Parabolized Navier-Stokes (PNS) solver that includes viscosity was used to evaluate the drag on the penetrator. The studies show the difference in the optimized penetrator shapes when viscosity is absent and when it is present in the optimization. In modern optimization problems, objective-function evaluations may require many hours on a computer cluster. One solution is to create a response surface that models the behavior of the objective function. Once enough data about the behavior of the objective function have been collected, a response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to build response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 non-linear functions involving from 2 to 100 variables, demonstrating the robustness and accuracy of HYBSORSM.
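As an illustration of the Pareto-frontier concept (a generic sketch, not MOHO's implementation), the non-dominated members of a design population can be extracted as follows, with both objectives cast as minimizations (e.g., drag and negated volume):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_frontier(designs):
    # designs: objective tuples, e.g. (drag, -volume) so both are minimized.
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o != d)]

designs = [(1.0, -3.0), (0.8, -2.5), (1.2, -3.1), (0.9, -3.0)]
print(pareto_frontier(designs))  # (1.0, -3.0) is dominated by (0.9, -3.0)
```

Each surviving tuple trades drag against volume; no member of the frontier can improve one objective without worsening the other.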
Abstract:
Prior research has established that the idiosyncratic volatility of securities prices exhibits a positive trend. This trend and other factors have made the merits of investment diversification and portfolio construction more compelling. A new optimization technique, a greedy algorithm, is proposed to optimize the weights of assets in a portfolio. The main benefits of using this algorithm are to: (a) increase the efficiency of the portfolio optimization process, (b) implement large-scale optimizations, and (c) improve the resulting optimal weights. In addition, the technique utilizes a novel approach in the construction of a time-varying covariance matrix. This involves the application of a modified integrated dynamic conditional correlation GARCH (IDCC-GARCH) model to account for the dynamics of the conditional covariance matrices that are employed. The stochastic aspects of the expected returns of the securities are integrated into the technique through Monte Carlo simulations. Instead of representing the expected returns as deterministic values, they are assigned simulated values based on their historical measures. The time series of the securities are fitted to probability distributions that match their characteristics using the Anderson-Darling goodness-of-fit criterion. Simulated and actual data sets are used to further generalize the results. Employing the S&P 500 securities as the base, 2000 simulated data sets are created using Monte Carlo simulation. In addition, the Russell 1000 securities are used to generate 50 sample data sets. The results indicate an increase in risk-return performance. Using Value-at-Risk (VaR) as the criterion and the Crystal Ball portfolio optimizer, a commercial product currently available on the market, as the benchmark, the new greedy technique clearly outperforms the alternatives on samples of the S&P 500 and Russell 1000 securities. The resulting improvements in performance are consistent across five securities-selection methods (maximum, minimum, random, absolute minimum, and absolute maximum) and three covariance structures (unconditional, orthogonal GARCH, and integrated dynamic conditional GARCH).
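The abstract does not spell out the greedy procedure; as a hypothetical sketch of the general idea (illustrative objective and parameter names, not the dissertation's algorithm), a greedy optimizer can build the weight vector by repeatedly assigning a small increment to whichever asset most improves a mean-variance objective:

```python
import numpy as np

def greedy_weights(mean_returns, cov, step=0.01, risk_aversion=1.0):
    """Greedily allocate weight increments to maximize mean - risk_aversion * variance."""
    n = len(mean_returns)
    w = np.zeros(n)

    def objective(v):
        return v @ mean_returns - risk_aversion * (v @ cov @ v)

    for _ in range(int(round(1 / step))):  # stop once weights sum to 1
        trials = []
        for i in range(n):
            t = w.copy()
            t[i] += step  # try one increment on asset i
            trials.append(objective(t))
        w[int(np.argmax(trials))] += step  # keep the single best increment
    return w

mu = np.array([0.08, 0.05, 0.12])
cov = np.diag([0.04, 0.01, 0.09])
print(greedy_weights(mu, cov).round(2))  # weights sum to 1.0
```

Each pass costs one objective evaluation per asset, which is what makes such a scheme attractive for large-scale problems.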
Abstract:
In the United States, public school enrollment is typically organized by neighborhood boundaries. This dissertation examines whether the federally funded HOPE VI program influenced performance in neighborhood public schools. In effect since 1992, HOPE VI has sought to revitalize distressed public housing using the New Urbanism model of mixed-income communities. There are 165 such HOPE VI projects nationwide. Despite nearly two decades of the program's implementation, the literature on its connection to public school performance is thin. My dissertation aims to narrow this research gap. There are three principal research questions: (1) Following HOPE VI, was there a change in socioeconomic status (SES) in the neighborhood public school? The hypothesis is that low SES (measured as the proportion of students qualifying for the Free and Reduced Lunch Program) would decline. (2) Following HOPE VI, did the performance of neighborhood public schools change? The hypothesis is that school performance, measured by the proportion of 5th grade students proficient on statewide math and reading tests, would increase. (3) What factors relate to the performance of public schools in HOPE VI communities? The focus is on non-school, neighborhood factors that influence public school performance. To answer the first two questions, I used t-tests and regression models to test the hypotheses. The analysis shows that there is no statistically significant change in SES following HOPE VI. However, there are statistically significant increases in performance for reading and math proficiency. The results are interesting in indicating that HOPE VI neighborhood improvement may have some relationship with improving school performance. To answer the third question, I conducted a case study analysis of two HOPE VI neighborhood public schools, one which improved significantly (in Philadelphia) and one which declined the most (in Washington, DC). The analysis revealed three neighborhood factors for improved school performance: (i) a strong local community organization; (ii) the local community's commitment (including that of middle-income families) to send children to the public school; and (iii) ties between housing and education officials in implementing the federal housing program. In essence, the study reveals how housing policy is de facto education policy.
Abstract:
This dissertation presents an investigation of the evolutionary process of extended oboe techniques, through literary analysis and practical research. The objective of this work is to provide assistance to oboists interested in learning these techniques. Additionally, this work encourages the student, through the process of experimentation, to explore the questions that may arise around the aesthetics of sound, the concept of gesture as an additional visual and aural element in music, and the processes of collaboration and “real-time” creation. Discussed within the work are the relationship between the instrument (the oboe) and extended techniques, and two possible definitions of extended techniques, provided by Luk Vaes (2009) and Gardner Read (1993). Also explored are how and why some composers have utilized extended techniques in their compositions, including brief discussions of extended techniques in real-time composition (improvisation), extended techniques and technological resources, theatrical gesture as an extended technique, and suggestions for how musicians might approach theatrical gestures in performance. Four works were examined: “I Know This Room So Well” – Lisa Bielawa (2007-9); “Four Pieces for Oboe and Piano” – Ernst Krenek (1966); “In Freundschaft” – Karlheinz Stockhausen (1978); and “Atem” – Mauricio Kagel (1969-70); an exploration of the difficulties and solutions associated with each extended technique found within these pieces was carried out. The following foundational works on extended oboe techniques were used as a basis for the research: books – Heinz Holliger’s Pro Musica Nova (1972); Gardner Read’s Compendium of Modern Instrumental Techniques (1993); Peter Veale & Claus-Steffen Mahnkopf’s The Techniques of Oboe Playing (1994); and Libby Van Cleve’s Oboe Unbound: Contemporary Techniques (2004); and articles – Nora Post’s “Monophonic sound resources for the oboe: Part I – Timbre” (1984), “Part II – Pitch and other techniques” (1984), and “Multiphonics for the oboe” (1982).
Abstract:
Perception of simultaneity and temporal order is studied with simultaneity judgment (SJ) and temporal-order judgment (TOJ) tasks. In the former, observers report whether the presentation of two stimuli was subjectively simultaneous; in the latter, they report which stimulus was subjectively presented first. SJ and TOJ tasks typically give discrepant results, which has prompted the view that performance is mediated by different processes in each task. We examined these discrepancies using a model that yields psychometric functions whose parameters characterize the timing, decisional, and response processes involved in SJ and TOJ tasks. We analyzed 12 data sets from published studies in which both tasks had been used in within-subjects designs, all of which had reported differences in performance across tasks. Fitting the model jointly to data from both tasks, we tested the hypothesis that common timing processes sustain simultaneity and temporal-order judgments, with differences in performance arising from task-dependent decisional and response processes. The results supported this hypothesis, also showing that the model psychometric functions account for aspects of SJ and TOJ data that classical analyses overlook. Implications for research on the perception of simultaneity and temporal order are discussed.
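A hedged sketch of the shared-timing hypothesis (a generic Gaussian timing model with hypothetical parameter names, not the authors' exact formulation): one set of timing parameters generates both tasks' psychometric functions, and only the decisional parameters differ:

```python
from scipy.stats import norm

def p_simultaneous(soa, sigma, delta):
    # SJ task: "simultaneous" is reported when the perceived arrival-time
    # difference D ~ N(soa, sigma) falls inside the window [-delta, +delta].
    d = norm(loc=soa, scale=sigma)
    return d.cdf(delta) - d.cdf(-delta)

def p_first_reported_first(soa, sigma, tau):
    # TOJ task: same timing distribution, but a single (possibly biased)
    # criterion tau splits the "first" vs. "second" responses.
    return norm(loc=soa, scale=sigma).sf(tau)

# Shared timing parameter sigma; task-specific decisional parameters delta, tau.
for soa in (-60, 0, 60):  # stimulus onset asynchrony in ms
    print(soa, round(p_simultaneous(soa, 40, 50), 2),
          round(p_first_reported_first(soa, 40, 0), 2))
```

Fitting such a model jointly to SJ and TOJ data constrains sigma by both tasks at once, so any task differences must be absorbed by delta and tau, which is the hypothesis the study tests.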
Abstract:
Atomic ions trapped in micro-fabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable qualities of such a device, including high-fidelity state preparation and readout, universal logic gates, and long coherence times, and they can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped-ion qubits as a photonic interface presents the possibility of order-of-magnitude improvements in performance in several key areas of their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer-scale precision. The second part of this thesis demonstrates the suitability of small micro-cavities formed from laser-ablated fused silica substrates, with radii of curvature in the 300-500 micron range, for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photonic entanglement rate, up to 10 kHz; the qubit measurement time, down to 1 microsecond; and the measurement error rate, down to the 10^-5 range. The final part of this thesis details a performance simulator for exploring the physical resource requirements and performance demands of scaling such a quantum computer to sizes capable of performing quantum algorithms beyond the limits of classical computation.
Abstract:
This dissertation presents the first theoretical model for understanding narration and point of view in opera, examining repertoire from Richard Wagner to Benjamin Britten. Prior music scholarship on musical narratives and narrativity has drawn primarily on continental literary theory and philosophy of the 1960s to the middle of the 1980s. This study, by contrast, engages with current debates in the analytic branch of aesthetic philosophy. One reason why the concept of point of view has not been more extensively explored in opera studies is the widespread belief that operas are not narratives. This study questions key premises on which this assumption rests. In so doing, it presents a new definition of narrative. Arguably, a narrative is an utterance intended to communicate a story, where "story" is understood to involve the representation of a particular agent or agents exercising their agency. This study explores the role of narrators in opera, introducing the first taxonomy of explicit fictional operatic narrators. Through a close analysis of Britten and Myfanwy Piper's Owen Wingrave, it offers an explanation of music's power to orient spectators to the points of view of opera characters by providing audiences with access to characters' perceptual experiences and cognitive, affective, and psychological states. My analysis also helps account for how our subjective access to fictional characters may engender sympathy for them. The second half of the dissertation focuses on opera in performance. Current thinking in music scholarship predominantly holds that fidelity is an outmoded concern. I argue that performing a work-for-performance is a matter of intentionally modelling one's performance on the work-for-performance's features and achieving a moderate degree of fidelity or matching between the two. Finally, this study investigates how the creative decisions of the performers and director impact the point of view from which an opera is told.
Abstract:
A substantial amount of the information on the Internet is present in the form of text. The value of this semi-structured and unstructured data has been widely acknowledged, with consequent scientific and commercial exploitation. The ever-increasing data production, however, pushes data analytic platforms to their limits. This thesis proposes techniques for more efficient textual big-data analysis suitable for the Hadoop analytic platform. This research explores the direct processing of compressed textual data. The focus is on developing novel compression methods with a number of desirable properties to support text-based big-data analysis in distributed environments. The novel contributions of this work include the following. Firstly, a Content-aware Partial Compression (CaPC) scheme is developed. CaPC makes a distinction between informational and functional content, whereby only the informational content is compressed. Thus, the compressed data remains transparent to existing software libraries, which often rely on functional content to work. Secondly, a context-free, bit-oriented compression scheme (Approximated Huffman Compression) based on the Huffman algorithm is developed. This uses a hybrid data structure that allows pattern searching in compressed data in linear time. Thirdly, several modern compression schemes have been extended so that the compressed data can be safely split with respect to logical data records in distributed file systems. Furthermore, an innovative two-layer compression architecture is used, in which each compression layer is appropriate for the corresponding stage of data processing. Peripheral libraries are developed that seamlessly link the proposed compression schemes to existing analytic platforms and computational frameworks, and also make the use of the compressed data transparent to developers. The compression schemes have been evaluated on a number of standard MapReduce analysis tasks using a collection of real-world datasets. In comparison with existing solutions, they have shown substantial improvements in performance and significant reductions in system resource requirements.
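As a rough illustration of searching compressed data directly (a simplified sketch, not the thesis's AHC hybrid structure), one can Huffman-encode both the corpus and the pattern and match at the bit level; a naive bit-string search must still verify codeword alignment, which is the problem the hybrid data structure solves in linear time:

```python
import heapq
from collections import Counter

def huffman_code(text):
    # Build a Huffman code from symbol frequencies.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)  # tie-breaker so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "the quick brown fox jumps over the lazy dog"
code = huffman_code(text)
bits = "".join(code[c] for c in text)
pattern = "".join(code[c] for c in "fox")
# Candidate match at a bit offset; a real scheme must confirm the offset
# falls on a codeword boundary to rule out false positives.
print(bits.find(pattern))
```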